Application of Genetic Algorithms for training of Neural Networks
by
Gautham Anil (04305013), Kuldeep Gharat (04305010), Vamshi Krishna (04305015)
under the guidance of Dr. Pushpak Bhattacharyya
Agenda of today's seminar
■ Genetic Algorithms
■ Evolving weights of Neural Network using Genetic Algorithm
■ Genetic Algorithms and local minima
■ Using Genetic Algorithms for evolving neural network topologies
What are Genetic Algorithms?
■ Formally introduced in the 1970s by John Holland.
■ Mimics the processes in nature that led to the evolution of intelligent organisms.
■ Components of a GA are
➔ a population of candidate solutions encoded as chromosomes, with a fitness function to score them
➔ genetic operators: selection, crossover, and mutation
Basic Genetic Algorithm
■ The basic Genetic Algorithm: initialize a random population, then repeatedly select fit parents, apply crossover and mutation to produce children, and replace the population until a good solution is found.
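The basic loop can be sketched on a toy problem: maximize the number of 1-bits in a bitstring ("OneMax"). All names, parameters, and the tournament selection scheme here are illustrative, not from the slides.

```python
import random

random.seed(0)

GENOME_LEN, POP_SIZE, GENERATIONS = 20, 30, 60

def fitness(genome):
    # OneMax: fitness is simply the number of 1-bits.
    return sum(genome)

def tournament(pop):
    # Pick two individuals at random; the fitter one becomes a parent.
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    # Single-point crossover: copy p1 up to the point, p2 after it.
    point = random.randrange(1, GENOME_LEN)
    return p1[:point] + p2[point:]

def mutate(genome, rate=0.02):
    # Flip each bit independently with a small probability.
    return [bit ^ 1 if random.random() < rate else bit for bit in genome]

pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
       for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    pop = [mutate(crossover(tournament(pop), tournament(pop)))
           for _ in range(POP_SIZE)]
best = max(pop, key=fitness)
```

After a few dozen generations the population converges on genomes that are nearly all ones.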
Crossover of chromosomes
■ Crossover mixes the genomes of two parents in the hope of producing a child better than either parent.
■ The end points of the copied segment are called crossover points.
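Single-point crossover can be sketched as follows; the crossover point (index 4) is fixed by hand here purely for illustration.

```python
def single_point_crossover(parent1, parent2, point):
    # The child copies parent1 up to the crossover point,
    # and parent2 from the crossover point onward.
    return parent1[:point] + parent2[point:]

p1 = [1, 1, 1, 1, 1, 1, 1, 1]
p2 = [0, 0, 0, 0, 0, 0, 0, 0]
child = single_point_crossover(p1, p2, 4)
# child is [1, 1, 1, 1, 0, 0, 0, 0]
```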
Mutation of a parent
■ Mutation alters the genome randomly.
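For a binary genome, the usual random alteration is an independent bit-flip; a minimal sketch (the mutation rate here is illustrative):

```python
import random

random.seed(1)

def mutate(genome, rate):
    # Flip each bit independently with probability `rate`.
    return [bit ^ 1 if random.random() < rate else bit for bit in genome]

parent = [0] * 10
child = mutate(parent, rate=0.3)
```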
Applying GA to Neural Networks
■ Potential parameters for optimization:
➔ connection weights
➔ network topology
➔ learning parameters (e.g. the learning rate)
■ Analyze the neural network to decide which of these to evolve.
Encoding weights
■ The weights of the neural network are encoded as a chromosome.
■ The fitness of a chromosome is the mean squared error (MSE) of the network on the test patterns (lower MSE means higher fitness).
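As a sketch, the weights of a tiny 2-2-1 feedforward network can be flattened into one chromosome and scored by MSE on a set of test patterns. The weight layout and the XOR data set are assumptions made for illustration.

```python
import math

def forward(chromosome, x):
    # Assumed gene layout: 4 hidden weights, 2 hidden biases,
    # 2 output weights, 1 output bias -> 9 genes total.
    w = chromosome
    h0 = math.tanh(w[0] * x[0] + w[1] * x[1] + w[4])
    h1 = math.tanh(w[2] * x[0] + w[3] * x[1] + w[5])
    return math.tanh(w[6] * h0 + w[7] * h1 + w[8])

def fitness(chromosome, patterns):
    # Mean squared error over the test patterns; a GA minimizes this.
    return sum((forward(chromosome, x) - t) ** 2
               for x, t in patterns) / len(patterns)

patterns = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
mse = fitness([0.0] * 9, patterns)
# An all-zero chromosome outputs 0 everywhere, giving MSE 0.5 on XOR.
```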
Selection of crossover points
Crossover
Mutation of weights
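For real-valued weight chromosomes, mutation is typically additive noise rather than bit-flipping. A sketch, where the mutation rate and noise scale are illustrative choices:

```python
import random

random.seed(2)

def mutate_weights(weights, rate=0.2, scale=0.1):
    # Perturb each weight with small Gaussian noise,
    # independently with probability `rate`.
    return [w + random.gauss(0, scale) if random.random() < rate else w
            for w in weights]

weights = [0.5, -0.3, 0.8, 0.1]
mutated = mutate_weights(weights)
```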
Evolving Initial weights (hybrid evolution)
■ Disadvantage of the previous algorithm: the GA alone is slow at fine-tuning weights compared to gradient-based training.
Hybrid Algorithm
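A hybrid scheme can be sketched on a toy 1-D error surface: the GA evolves starting points, and each candidate is scored by the error reached after a few steps of gradient descent (standing in for backpropagation). The loss function, rates, and population size below are all illustrative assumptions.

```python
import math
import random

random.seed(3)

def loss(w):
    # Toy 1-D error surface with several local minima,
    # standing in for a network's training error.
    return (w - 2.0) ** 2 + math.sin(5 * w)

def refine(w, steps=20, lr=0.05):
    # Crude finite-difference gradient descent, standing in for BP.
    for _ in range(steps):
        grad = (loss(w + 1e-4) - loss(w - 1e-4)) / 2e-4
        w -= lr * grad
    return w

# GA over *initial* weights: each candidate is scored by the error
# reached after local refinement, not by its raw error.
pop = [random.uniform(-4, 4) for _ in range(20)]
for _ in range(15):
    parents = sorted(pop, key=lambda w: loss(refine(w)))[:10]
    pop = parents + [random.choice(parents) + random.gauss(0, 0.5)
                     for _ in range(10)]
best = refine(min(pop, key=lambda w: loss(refine(w))))
```

The GA places starting points into good basins of attraction; gradient descent then does the fine-tuning it is fast at.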
BP trapped in local minima
■ Backpropagation can get stuck in a local minimum of the error surface.
Escape from local minima
■ A GA trapped in a local minimum can escape.
How does GA handle local minima?
■ This GA might not escape from wider local minima.
■ Crossover helps escape from local minima.
■ The probability of escaping local minima can be increased by increasing the number of individuals in the population.
Local Minima and Evolution
■ Relevance of local minima to Evolution.
■ Need to rethink “Survival of the fittest”
Encoding topology of a Neural Network
■ Evolving weights alone means the topology stays fixed.
■ The topology must be represented as a genome.
■ An evaluation function is needed to determine the quality of a topology.
■ Common properties of such evaluation functions
■ Constructive or pruning methods might not find optimal topologies.
Direct encoding of topology
■ Direct encoding stores the full connectivity information; it can lead to the permutation problem (different genomes encoding functionally identical networks).
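Direct encoding can be sketched as a binary adjacency matrix flattened into the genome. The node ordering (2 inputs, 2 hidden, 1 output) is an illustrative assumption.

```python
n_nodes = 5

# connections[i][j] == 1 means node i feeds node j.
connections = [
    [0, 0, 1, 1, 0],   # input 0 -> both hidden units
    [0, 0, 1, 1, 0],   # input 1 -> both hidden units
    [0, 0, 0, 0, 1],   # hidden 0 -> output
    [0, 0, 0, 0, 1],   # hidden 1 -> output
    [0, 0, 0, 0, 0],   # output has no outgoing edges
]
genome = [bit for row in connections for bit in row]  # length n_nodes**2

# The permutation problem: relabeling the two hidden nodes produces a
# different genome that encodes a functionally identical network.
```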
Indirect Encoding of topology
■ Indirect Encoding – Only partial information about the topology is stored.
Evolving topology of a Neural Network
■ The GA for evolving the topology of the Neural Network
Issues while using Genetic Algorithms
■ As of now, many orders of magnitude slower than backpropagation (when BP does not get stuck in a local minimum).
■ Might take a long time to escape from some particularly bad local minima.
■ In the current version of the GA, genetic diversity is hard to maintain.
■ We never know when the global minimum has been found.
Conclusion
■ Escapes from local minima quickly in most cases.
■ Does not need gradient information of the Evaluation function. So, the evaluation function could be nondifferentiable (like network topology evaluation functions).
■ Can do a complete search of the solution space.
■ Proper use of crossover encourages development of modular neural networks.
■ Has the potential to be a fully autonomous problem-solving algorithm.
References
■ Yao, X., "Evolving artificial neural networks", Proceedings of the IEEE, vol. 87, no. 9, pp. 1423–1447, 1999. http://citeseer.ist.psu.edu/yao99evolving.html
■ van Rooij, A. J. F., Jain, L. C., and Johnson, R. P., "Neural Network Training using Genetic Algorithms", World Scientific, 1996.
■ Heitkötter, J. and Beasley, D., "The Hitchhiker's Guide to Evolutionary Computation", 2000.
■ Adeli, H. and Hung, S.-L., "Machine Learning: Neural Networks, Genetic Algorithms, and Fuzzy Systems", 1995.
■ http://lancet.mit.edu/~mbwall/presentations/IntroToGAs/