Evolutionary Algorithms
Andrew Cannon, Yuki Osada, Angeline Honggowarsito
Contents
What are Evolutionary Algorithms (EAs)?
Why are EAs Important?
Categories of EAs
Mutation
Self-Adaptation
Recombination
Selection
Application
Slide 3
Evolutionary Algorithms
Search methods that mimic the process of natural evolution, following the principle of survival of the fittest. In each generation: select the fittest parents, recombine those parents to produce new offspring, and perform mutations on the new offspring.
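The generation loop above can be sketched in a few lines of Python. This is an illustrative sketch only, not from the slides: the function name `evolve`, its parameters, and the OneMax toy problem are all assumptions.

```python
import random

def evolve(fitness, init_pop, n_generations=50, mutation_rate=0.1, seed=0):
    """Generic EA loop: select the fittest parents, recombine them,
    then mutate the resulting offspring (illustrative sketch)."""
    rng = random.Random(seed)
    pop = list(init_pop)
    n = len(pop)
    for _ in range(n_generations):
        # Select the fitter half of the population as parents.
        parents = sorted(pop, key=fitness, reverse=True)[: n // 2]
        offspring = []
        while len(offspring) < n:
            # Recombine two parents with one-point crossover ...
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(a))
            child = a[:cut] + b[cut:]
            # ... and mutate each bit of the child with small probability.
            offspring.append([bit ^ 1 if rng.random() < mutation_rate else bit
                              for bit in child])
        pop = offspring
    return max(pop, key=fitness)

# Toy problem (OneMax): maximise the number of 1-bits in a length-20 string.
rng = random.Random(1)
init = [[rng.randrange(2) for _ in range(20)] for _ in range(30)]
best = evolve(sum, init)
```

With truncation selection to the fitter half each generation, the population quickly concentrates on strings with many 1-bits.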
Slide 4
Why are EAs Important?
Flexibility: the concept adapts to many problems. Problem examples: the Travelling Salesman problem, the Knapsack problem, trading prediction in the stock market, etc. Existing algorithms for these problems are either too specialised or too generalised.
Slide 5
Categories of EAs Genetic Algorithms Evolutionary Strategies
Evolutionary Programming Genetic Programming More similarities than
differences
Slide 6
Genetic Algorithms
In the 1950s, biologists used computers to simulate biological systems. Genetic algorithms were first introduced in the 1960s by John Holland of the University of Michigan to model adaptive processes. Designed to solve discrete/integer optimisation problems.
Slide 7
Genetic Algorithms
Operate on binary strings: a bitstring is the representation of an individual. The recombination operator is primary, with mutation as a background operator.
Slide 8
Evolutionary Strategies
First developed by Rechenberg in 1973 to solve parameter optimisation problems. Individuals are represented as a pair of real-valued vectors. Both recombination and self-adaptive mutation are applied.
Slide 9
Evolutionary Strategies
Similar to Genetic Algorithms in their recombination and mutation processes. Differences from Genetic Algorithms: Evolutionary Strategies are better at finding local maxima, while Genetic Algorithms are more suited to finding global maxima; thus Evolutionary Strategies are typically faster than Genetic Algorithms. Evolutionary Strategies represent individuals as real-valued vectors, while GAs use bitstrings.
Slide 10
Evolutionary Programming
Developed by Lawrence Fogel in 1962, aimed at evolving artificial intelligence: the ability to predict changes in the environment. Uses a Finite State Machine for prediction.
Slide 11
Evolutionary Programming
Predict the output for input 011101: starting in initial state C, the machine produces output 110111. No recombination is used. Representation is based on real-valued vectors. [Slide shows a three-state finite state machine with states A, B, C and input/output transition labels such as 1/1, 0/0, 0/1, 1/0.]
Slide 12
Genetic Programming
Developed by Koza to allow programs themselves to evolve during the evolutionary process. Individuals are represented by trees or graphs.
Slide 13
Genetic Programming
Different from GA, ES, and EP, where the representation is linear (bitstrings or real-valued vectors): a tree is non-linear. Its size depends on depth and width, while the other representations have a fixed size. Only requires crossover OR mutation.
Slide 14
Mutation
Binary mutation: flip bits, since a binary value has only two states, 0 and 1. For example, flipping the first three bits of (0,1,0,0,1) produces (1,0,1,0,1).
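A minimal Python sketch of bit-flip mutation; `bitflip_mutation` and `flip_positions` are hypothetical helper names, and the second function reproduces the slide's deterministic example.

```python
import random

def bitflip_mutation(bits, rate, rng=random):
    """Flip each bit independently with probability `rate`."""
    return [b ^ 1 if rng.random() < rate else b for b in bits]

def flip_positions(bits, positions):
    """Flip only the bits at the given positions (for illustration)."""
    return [b ^ 1 if i in positions else b for i, b in enumerate(bits)]
```

With rate 1.0 every bit flips; flipping the first three positions of (0,1,0,0,1) reproduces the slide's (1,0,1,0,1).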
Slide 15
Mutation
Real-value mutation: a randomly created value is added to the variables with some predefined mutation rate. Both the mutation rate and the mutation step need to be defined. The mutation rate is typically inversely proportional to the number of variables (dimensions).
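A sketch of real-value mutation under the description above; `gaussian_mutation` is a hypothetical name, and treating the mutation step as the standard deviation of Gaussian noise is an assumption consistent with the step-size slides later on.

```python
import random

def gaussian_mutation(x, rate, step, rng=random):
    """With probability `rate` per variable, add N(0, step) noise
    (the mutation step is the standard deviation of the added value)."""
    return [v + rng.gauss(0.0, step) if rng.random() < rate else v
            for v in x]
```

A common default, matching the slide's remark, is rate = 1/n for n variables.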
Slide 16
Sources & References
Eiben, A.E. 2004, What is an Evolutionary Algorithm?, Available from: [29 August 2012].
Michalewicz, Z., Hinterding, R., and Michalewicz, M., Evolutionary Algorithms, Chapter 2 in Fuzzy Evolutionary Computation, W. Pedrycz (editor), Kluwer Academic, 1997.
Bäck, T., Hammel, U., and Schwefel, H.-P., Evolutionary computation: comments on the history and current state, IEEE Transactions on Evolutionary Computation 1(1), 1997.
Yao, X., Evolutionary computation: a gentle introduction, Evolutionary Optimization, 2002.
Whitley, D. 2001, An Overview of Evolutionary Algorithms: Practical Issues and Common Pitfalls, Information and Software Technology, vol. 43, pp. 817-831.
Slide 17
Self Adaptation
Don't know what values to assign to parameters? Let them evolve! The population consists of real vectors x = (x_1, x_2, ..., x_n). We produce offspring by adding random vectors to them.
Slide 18
Step Size Schwefel (1981): add vectors whose components are
Gaussian random variates with mean 0 What standard deviation should
be used? The standard deviation evolves as the algorithm is
running
Slide 19
Step Size
Represent entities as (x, s): the individual itself (x) and a step vector (s). We start by producing an offspring s' from s:
s'_i = s_i * exp(c * n^(-1/2) * N(0,1) + d * n^(-1/4) * N_i(0,1))
where n is the generation number and c, d > 0 are constants. We then produce an offspring x' from x:
x'_i = x_i + s'_i * N_i(0,1)
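The two updates above can be sketched in Python. `self_adaptive_step` is a hypothetical name, and the constants c, d and the use of the generation number in the exponents follow the slide's formula.

```python
import math
import random

def self_adaptive_step(x, s, gen, c=1.0, d=1.0, rng=random):
    """Produce one offspring (x', s'): first mutate the step vector s
    lognormally, then mutate x using the new step sizes."""
    # One shared N(0,1) draw common to all components ...
    common = c * gen ** -0.5 * rng.gauss(0.0, 1.0)
    s_new, x_new = [], []
    for xi, si in zip(x, s):
        # ... plus an independent N_i(0,1) draw per component:
        # s'_i = s_i * exp(c*n^(-1/2)*N(0,1) + d*n^(-1/4)*N_i(0,1))
        si_new = si * math.exp(common + d * gen ** -0.25 * rng.gauss(0.0, 1.0))
        s_new.append(si_new)
        # x'_i = x_i + s'_i * N_i(0,1)
        x_new.append(xi + si_new * rng.gauss(0.0, 1.0))
    return x_new, s_new
```

Because the step sizes are multiplied by an exponential, they always stay positive.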
Slide 20
Self Adaptation
Other parameters can evolve using similar ideas.
Sources:
Bäck, T., Hammel, U. & Schwefel, H.-P. 1997, Evolutionary computation: comments on the history and current state, IEEE Transactions on Evolutionary Computation, vol. 1, no. 1, pp. 3-17. Available from: IEEE Xplore Digital Library [23rd August 2012].
Beyer, H.-G. 1995, Toward a Theory of Evolution Strategies: Self-Adaptation, Evolutionary Computation, vol. 3, no. 3, pp. 311-348.
Saravanan, N., Fogel, D.B. & Nelson, K.M. 1995, A comparison of methods for self-adaptation in evolutionary algorithms, BioSystems, vol. 36, no. 2, pp. 157-166. Available from: Science Direct [26th August 2012].
Schwefel, H.-P. 1981, Numerical Optimization of Computer Models, Wiley, Chichester.
Slide 21
Recombination Produce offspring from 2 or more entities in the
original population Most easily addressed using bitstring
representations
Slide 22
One Point Crossover Entities are represented in the population
as bitstrings of length n Randomly select a crossover point p from
1 to n (inclusive) Take the substring formed by the first p bits of
the first string and append to it the last n-p bits of the second
string to give offspring
Slide 23
One Point Crossover
Bitstrings of length 8: 01011100 and 00001111. Choose a crossover point of 6. Take the first 6 bits from 01011100 and the last 2 bits from 00001111 to form the offspring 01011111.
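A sketch of one-point crossover as defined two slides back; `one_point_crossover` is a hypothetical name.

```python
import random

def one_point_crossover(a, b, p=None, rng=random):
    """Offspring = first p bits of `a` + last n-p bits of `b`."""
    if p is None:
        p = rng.randrange(1, len(a) + 1)  # random point in 1..n inclusive
    return a[:p] + b[p:]
```

With p = 6 on the strings above this reproduces the slide's offspring 01011111.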
Slide 24
Uniform Crossover
Form a new offspring from 2 parents by selecting bits from each parent with a particular probability. For example, given the strings 11001011 and 01010101, select bits from the first string with probability 1/2.
Slide 25
Uniform Crossover
Roll a die 8 times: 2, 3, 6, 6, 3, 1, 3, 5. Whenever the result is 3 or less, take a bit from the first string; otherwise take a bit from the second string:
rolls:     2 3 6 6 3 1 3 5
parents:   11001011 and 01010101
offspring: 11011011
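Uniform crossover can be sketched as below; `uniform_crossover` and `uniform_from_rolls` are hypothetical names, and the second function replays the slide's die-roll example.

```python
import random

def uniform_crossover(a, b, p_first=0.5, rng=random):
    """Take each bit from `a` with probability `p_first`, else from `b`."""
    return "".join(x if rng.random() < p_first else y for x, y in zip(a, b))

def uniform_from_rolls(a, b, rolls):
    """The slide's die-roll variant: a roll of 3 or less picks from `a`."""
    return "".join(x if r <= 3 else y for x, y, r in zip(a, b, rolls))
```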
Slide 26
Other Variants
Multiple crossover points. Multiple parents. Probabilistic application.
Source: Bäck, T., Hammel, U. & Schwefel, H.-P. 1997, Evolutionary computation: comments on the history and current state, IEEE Transactions on Evolutionary Computation, vol. 1, no. 1, pp. 3-17. Available from: IEEE Xplore Digital Library [23rd August 2012].
Slide 27
Selection How individuals and their offspring from one
generation are selected to fill the next generation May be
probabilistic or deterministic
Slide 28
Proportional Selection Probabilistic method Assume that fitness
f(x)>0 for every entity x in the population p(y) = f(y) / (sum
of f(x) for every x)
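The proportional rule above maps directly onto Python's built-in weighted sampling; `proportional_selection` is a hypothetical helper name.

```python
import random

def proportional_selection(pop, fitness, k, rng=random):
    """Draw k individuals with probability proportional to fitness;
    assumes f(x) > 0 for every x, as on the slide."""
    weights = [fitness(x) for x in pop]
    return rng.choices(pop, weights=weights, k=k)
```

In a population of {1, 100} with fitness f(x) = x, the individual 100 is selected about 100 times as often as 1.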
Slide 29
Tournament Selection Probabilistic method Select q individuals
randomly from the population with uniform probability The best
individual of this set goes into the next generation Repeat until
the next generation is filled
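The tournament procedure above is short enough to write in one line; `tournament_selection` is a hypothetical name.

```python
import random

def tournament_selection(pop, fitness, q, n_select, rng=random):
    """Fill the next generation by repeated q-way tournaments:
    sample q individuals uniformly, keep the best, repeat."""
    return [max(rng.sample(pop, q), key=fitness) for _ in range(n_select)]
```

When q equals the population size, every tournament returns the single fittest individual; smaller q gives weaker selection pressure.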
Slide 30
(μ,λ)-Selection
Deterministic method. From a generation of μ individuals, λ > μ offspring are produced. The next generation is the fittest μ individuals of the λ offspring. The fittest member of the next generation may not be as fit as the fittest member of the previous generation.
Slide 31
(μ+λ)-Selection
Deterministic method. From a generation of μ individuals, λ offspring are produced. The next generation is the fittest μ individuals from the μ+λ parents and offspring combined. The fittest members always survive.
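Both deterministic schemes above reduce to a sort and a truncation; `comma_selection` and `plus_selection` are hypothetical names.

```python
def comma_selection(parents, offspring, mu, fitness):
    """(mu,lambda)-selection: next generation from the offspring only,
    so the best fitness can decrease between generations."""
    return sorted(offspring, key=fitness, reverse=True)[:mu]

def plus_selection(parents, offspring, mu, fitness):
    """(mu+lambda)-selection: next generation from parents and
    offspring combined, so the fittest individuals always survive."""
    return sorted(parents + offspring, key=fitness, reverse=True)[:mu]
```

With parents {5} and offspring {1, 2, 3}, comma-selection discards the fit parent while plus-selection keeps it.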
Slide 32
Selection
The best method (or methods) will be problem specific.
Sources:
Bäck, T. 1994, Selective pressure in evolutionary algorithms: a characterization of selection mechanisms, Proceedings of the First IEEE Conference on Evolutionary Computation, pp. 57-62. Available from: IEEE Xplore Digital Library [28th August 2012].
Bäck, T., Hammel, U. & Schwefel, H.-P. 1997, Evolutionary computation: comments on the history and current state, IEEE Transactions on Evolutionary Computation, vol. 1, no. 1, pp. 3-17. Available from: IEEE Xplore Digital Library [23rd August 2012].
Slide 33
Travelling Salesman Problem (TSP)
This is a hard problem (NP-hard, "at least as hard as the hardest problems in NP"). The dynamic-programming solution is O(n^2 * 2^n). Genes are a sequence (a permutation) representing the order in which the cities are visited. Example: [0 5 3 4 8 2 1 6 7 9]
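The fitness of such a permutation gene is the length of the tour it encodes. A minimal evaluation sketch (`tour_length` is a hypothetical name; cities are assumed to be 2-D points):

```python
import math

def tour_length(cities, order):
    """Total Euclidean length of the closed tour that visits the
    cities (given as (x, y) points) in the given order."""
    total = 0.0
    for i in range(len(order)):
        x1, y1 = cities[order[i]]
        x2, y2 = cities[order[(i + 1) % len(order)]]  # wrap back to start
        total += math.hypot(x2 - x1, y2 - y1)
    return total
```

On the unit square, visiting the corners in order gives the optimal tour of length 4, while crossing the diagonals is longer.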
Slide 34
Crossover in TSP
A possible crossover: the greedy crossover. "Greedy crossover selects the first city of one parent, compares the cities leaving that city in both parents, and chooses the closer one to extend the tour. If one city has already appeared in the tour, we choose the other city. If both cities have already appeared, we randomly select a non-selected city."
Source: J. J. Grefenstette, R. Gopal, B. Rosmaita, and D. Van Gucht, Genetic Algorithms for the Traveling Salesman Problem, in Proceedings of an International Conference on Genetic Algorithms and Their Applications, pages 160-168, 1985.
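The quoted procedure can be sketched as follows; `greedy_crossover` is a hypothetical name, cities are assumed to be 2-D points, and parents are permutations of city indices.

```python
import math
import random

def greedy_crossover(p1, p2, cities, rng=random):
    """Greedy crossover as quoted above: start from the first city of
    one parent; at each step compare the cities leaving the current
    city in both parents and extend the tour with the closer unvisited
    one; if both are visited, pick a random unvisited city."""
    n = len(p1)
    # Successor of each city in each parent's (cyclic) tour.
    succ1 = {p1[i]: p1[(i + 1) % n] for i in range(n)}
    succ2 = {p2[i]: p2[(i + 1) % n] for i in range(n)}
    tour, visited = [p1[0]], {p1[0]}
    while len(tour) < n:
        cur = tour[-1]
        candidates = [c for c in (succ1[cur], succ2[cur]) if c not in visited]
        if candidates:
            nxt = min(candidates,
                      key=lambda c: math.dist(cities[cur], cities[c]))
        else:
            nxt = rng.choice([c for c in range(n) if c not in visited])
        tour.append(nxt)
        visited.add(nxt)
    return tour
```

By construction the child is always a valid permutation: each step appends exactly one unvisited city.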
Slide 35
Mutation in TSP
A possible mutation: the greedy-swap. "The basic idea of greedy-swap is to randomly select two cities from one chromosome and swap them if the new (swapped) tour length is shorter than the old one."
Source: S. J. Louis and R. Tang, Interactive Genetic Algorithms for the Travelling Salesman Problem, Genetic Adaptive Systems Lab, University of Nevada, Reno, 1999.
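A sketch of the quoted greedy-swap; `greedy_swap` and `tour_length` are hypothetical names, with `tour_length` repeated here so the sketch stands alone.

```python
import math
import random

def tour_length(cities, order):
    """Closed-tour Euclidean length (repeated to stay self-contained)."""
    return sum(math.dist(cities[order[i]],
                         cities[order[(i + 1) % len(order)]])
               for i in range(len(order)))

def greedy_swap(tour, cities, rng=random):
    """Swap two randomly chosen cities, keeping the swap only if the
    new tour is shorter than the old one."""
    i, j = rng.sample(range(len(tour)), 2)
    new = list(tour)
    new[i], new[j] = new[j], new[i]
    if tour_length(cities, new) < tour_length(cities, tour):
        return new
    return list(tour)
```

Because a swap is kept only when it shortens the tour, this mutation never makes an individual worse.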
Slide 36
Applications
EAs are a very powerful computational tool. EAs find application in bioinformatics, phylogenetics, computational science, engineering, economics, chemistry, manufacturing, mathematics, physics, and other fields.
Slide 37
Applications
Computer-automated design. Automotive design: design composite materials and aerodynamic shapes to give faster, lighter, more fuel-efficient and safer vehicles, without spending time in labs working with physical models. Engineering design: optimise the design of many tools and components, e.g. turbines.
Slide 38
Applications
Game playing: a sequence of actions can be learnt to win a game. Encryption and code breaking. Telecommunications. DP problems: the travelling salesman problem (planning efficient routes and schedules for travel planners) and the knapsack problem.
Slide 39
Sources
Bäck, T., Hammel, U., and Schwefel, H.-P., Evolutionary computation: comments on the history and current state, IEEE Transactions on Evolutionary Computation 1(1), 1997.
Yao, X., Evolutionary computation: a gentle introduction, Evolutionary Optimization, 2002.