
is facilitated. Different schedules can be used to reduce the mutation rate. The
schedule above results in an exponential decrease. An alternative may be to use a
linear schedule, which results in a slower decrease in p_m, allowing more exploration.
However, a slower decrease may be too disruptive to good solutions that have already been found.
A good strategy is to base the probability of being mutated on the fitness of the
individual: the more fit the individual is, the lower the probability that its genes will
be mutated; the more unfit the individual, the higher the probability of mutation.
Annealing schedules similar to those used for the learning rate of NNs (refer to equation
(4.40)), and to adjust control parameters for PSO and ACO, can be applied to p_m (also
refer to Section A.5.2).
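As a sketch of these ideas, the exponential and linear schedules, as well as a fitness-based mutation rate, can be written as simple functions. The parameter names (initial rate pm0, decay constant tau, and so on) are illustrative, not taken from the text:

```python
import math

def exponential_schedule(pm0, tau, t):
    """Exponentially decaying mutation rate: p_m(t) = p_m(0) * exp(-t / tau)."""
    return pm0 * math.exp(-t / tau)

def linear_schedule(pm0, pm_min, max_t, t):
    """Linearly decaying mutation rate from pm0 down to pm_min over max_t generations."""
    return pm0 - (pm0 - pm_min) * min(t, max_t) / max_t

def fitness_based_pm(pm_max, fitness, best_fitness):
    """Fitter individuals (fitness closer to the current best) receive a lower
    mutation rate. Assumes maximization with non-negative fitness values."""
    if best_fitness == 0:
        return pm_max
    return pm_max * (1.0 - fitness / best_fitness)

# At generation 50 the exponential schedule has decayed much further
# than the linear one, leaving less room for exploration.
print(exponential_schedule(0.1, 20.0, 50))   # ~0.0082
print(linear_schedule(0.1, 0.01, 100, 50))   # 0.055
```

The floor pm_min in the linear schedule keeps some mutation active until the end of the run, which is one way to avoid premature convergence.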
For floating-point representations, performance is also influenced by the mutational
step sizes. An ideal strategy is to start with large mutational step sizes to allow larger,
stochastic jumps within the search space. The step sizes are then reduced over time,
so that very small changes result near the end of the search process. Step sizes can also
be proportional to the fitness of an individual, with unfit individuals having larger step
sizes than fit individuals. As an alternative to deterministic schedules to adapt step
sizes, self-adaptation strategies as for EP and ES can be used (refer to Sections 11.3.3
and 12.3.3).
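A minimal sketch of such a Gaussian mutation for floating-point chromosomes, with a step size that shrinks over the run and that is larger for unfit individuals, might look as follows. The function and parameter names are hypothetical:

```python
import random

def mutate(genes, t, max_t, sigma0, rel_fitness):
    """Gaussian mutation of a floating-point chromosome.

    The step size (standard deviation) shrinks linearly with generation t,
    and unfit individuals (rel_fitness near 0) mutate with larger steps
    than fit ones (rel_fitness near 1). All parameter names are illustrative.
    """
    time_factor = 1.0 - t / max_t        # large early in the run, small late
    fitness_factor = 1.0 - rel_fitness   # unfit individuals take larger steps
    sigma = sigma0 * max(time_factor, 1e-3) * max(fitness_factor, 1e-3)
    return [g + random.gauss(0.0, sigma) for g in genes]

random.seed(1)
child = mutate([0.5, -1.2, 3.0], t=10, max_t=100, sigma0=1.0, rel_fitness=0.8)
```

The small floors (1e-3) simply prevent the step size from collapsing to exactly zero; the self-adaptation strategies of EP and ES referred to above would instead evolve sigma alongside the genes.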
The crossover rate, p_c, also bears significant influence on performance. With its optimal
value being problem dependent, the same adaptive strategies as for p_m can be
used to dynamically adjust p_c.
In addition to p_m (and mutational step sizes in the case of floating-point representations)
and p_c, the choice of the best evolutionary operators to use is also problem
dependent. While a best combination of crossover, mutation, and selection operators
together with best values for the control parameters can be obtained via empirical stud-
ies, a number of adaptive methods have been proposed, as reviewed in [41]. These methods
adaptively switch between different operators based on search progress. Ultimately,
finding the best set of operators and control parameter values is a multi-objective
optimization problem by itself.
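One simple form of such adaptive switching, shown here only as an illustration and not as one of the specific methods reviewed in [41], is probability matching: each operator is applied with a probability proportional to the fraction of its recent applications that produced an improved offspring. All class and operator names below are hypothetical:

```python
import random

class OperatorSelector:
    """Selects among evolutionary operators in proportion to each operator's
    observed success rate (fraction of applications yielding an improvement).
    A simple probability-matching scheme; illustrative only."""

    def __init__(self, operators, p_min=0.05):
        self.operators = list(operators)
        self.p_min = p_min                           # floor keeps every operator alive
        self.successes = {op: 1.0 for op in self.operators}
        self.trials = {op: 2.0 for op in self.operators}

    def probabilities(self):
        rates = {op: self.successes[op] / self.trials[op] for op in self.operators}
        total = sum(rates.values())
        n = len(self.operators)
        # Mix the success-rate distribution with a uniform floor of p_min.
        return {op: self.p_min + (1 - n * self.p_min) * rates[op] / total
                for op in self.operators}

    def select(self):
        probs = self.probabilities()
        weights = [probs[op] for op in self.operators]
        return random.choices(self.operators, weights=weights)[0]

    def update(self, op, improved):
        """Record whether applying `op` produced a fitter offspring."""
        self.trials[op] += 1.0
        if improved:
            self.successes[op] += 1.0

sel = OperatorSelector(["one_point", "two_point", "uniform"])
sel.update("uniform", improved=True)   # reward the operator that helped
```

The floor p_min prevents any operator's probability from reaching zero, so an operator that becomes useful only late in the search can still be rediscovered.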
9.5 Genetic Algorithm Variants
Based on the general GA, different implementations of a GA can be obtained by
using different combinations of selection, crossover, and mutation operators. Although
different operator combinations result in different behaviors, the same algorithmic flow
as given in Algorithm 8.1 is followed. This section discusses a few GA implementations
that deviate from the flow given in Algorithm 8.1. Section 9.5.1 discusses generation
gap methods. The messy GA is described in Section 9.5.2. A short discussion on
interactive evolution is given in Section 9.5.3. Island (or parallel) GAs are discussed
in Section 9.5.4.