
Research Article (Open Access)

Effect of Hill Climbing in GA After Reproduction For Solving Optimization Problems

Girdhar Gopal*, Rakesh Kumar, Naveen Kumar and Ishan Jawa
Department of Computer Science and Application, Kurukshetra University, Haryana 136119, India

Gopal et al. 2015. International J Ext Res. 3:79-86. http://www.journalijer.com

*Corresponding author e-mail: [email protected]

International Journal of Extensive Research. e-Print ISSN: 2394-0301

Copyright © G Gopal et al. 2015. Licensee IJER 2014. All rights reserved. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly credited.

Abstract
Genetic algorithms (GA) are effective at finding optimum solutions for a broad class of problems. Hill climbing is one of the oldest local search algorithms for solving optimization problems. These two traditional techniques can be combined to get the best of both. In this paper, the hill climbing technique is applied after the reproduction operators of a genetic algorithm, and the resulting algorithm shows clear improvement over the simple genetic algorithm. Experiments have been conducted on four different benchmark functions, with the implementation carried out in MATLAB. The results show the improvement over the simple genetic algorithm.

Keywords: Genetic algorithm, hill climbing, crossover, mutation, reproduction.

Introduction
Genetic algorithms are adaptive heuristic search algorithms based on the evolutionary ideas of natural selection and natural genetics; they were proposed by John Holland (1975) and later developed and popularized by David Goldberg (Goldberg, 1989). They are powerful optimization techniques that employ concepts of evolutionary biology to evolve optimal solutions to a given problem. A genetic algorithm works with a population of individuals represented by chromosomes. Each chromosome is evaluated by its fitness value, computed from the objective function of the problem. The population is transformed by three primary genetic operators – selection, crossover and mutation – which form the new generation of the population. This process continues until the optimal solution is reached. The general structure of a genetic algorithm is:

Procedure GA(fnx, n, r, m, ngen)
  // fnx is the fitness function used to evaluate individuals in the population
  // n is the population size in each generation (say 10)
  // r is the fraction of the population generated by crossover (say 0.7)
  // m is the mutation rate (say 0.01)
  // ngen is the total number of generations
  P := generate n individuals at random      // initial generation is generated randomly
  nogen := 1                                 // denotes the current generation
  while nogen <= ngen do {
    // Selection step:
    L := Select(P, n, nogen)                 // n/2 individuals of P selected using any selection method
    // Crossover step:
    S := Crossover(L, n)                     // generates n chromosomes using arithmetic crossover
    // Mutation step:
    Mutation(S, m)                           // inversion of chromosomes with mutation rate m
    // Replacement step:
    P := S
    pb(nogen) := min(fnx(P))                 // store the best individual of this generation
    nogen := nogen + 1
  }
  best := min(pb)                            // best individual over all generations
end proc
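As a concrete illustration of this loop, a minimal MATLAB sketch of a real-coded simple GA for minimization is given below. It is not the authors' implementation: the helper functions rouletteSelect, arithmeticCrossover and uniformMutation are illustrative stand-ins (sketches of them appear later, next to the parameter settings), and lb and ub are assumed to be scalar bounds on every gene.

% Minimal real-coded simple GA (minimization), mirroring the pseudocode above.
% rouletteSelect, arithmeticCrossover and uniformMutation are illustrative helpers.
function [best, bestVal] = simpleGA(fnx, n, dim, lb, ub, m, ngen)
  P = lb + (ub - lb) .* rand(n, dim);          % random initial population
  pb = zeros(ngen, 1);                         % best fitness of each generation
  best = P(1, :);  bestVal = Inf;
  for nogen = 1:ngen
    f = evalPop(fnx, P);                       % fitness of every individual
    L = rouletteSelect(P, f);                  % selection step (n/2 parents)
    S = arithmeticCrossover(L, n, 0.35);       % crossover step
    S = uniformMutation(S, m, lb, ub);         % mutation step
    P = S;                                     % replacement step
    f = evalPop(fnx, P);
    [pb(nogen), idx] = min(f);                 % store best of this generation
    if pb(nogen) < bestVal
      bestVal = pb(nogen);  best = P(idx, :);
    end
  end
end

function f = evalPop(fnx, P)
  % Evaluate the objective for every row (individual) of P.
  f = zeros(size(P, 1), 1);
  for i = 1:size(P, 1)
    f(i) = fnx(P(i, :));
  end
end

For example, simpleGA(@(x) sum(x.^2), 20, 10, -5.12, 5.12, 0.001, 100) would run this sketch on a 10-dimensional sphere function with the smallest parameter setting used later in the experiments (N = 20, 100 generations, 0.1% mutation).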

Methods

Hybridization in Genetic Algorithms
A genetic algorithm is a population-based search technique. It searches the state space by exploiting only the coding and the objective function in each generation. Introducing problem-specific information into any of the genetic operators yields a hybrid algorithm. One way of mixing problem-specific knowledge into a GA is the memetic algorithm. Memetic algorithms are evolutionary algorithms that combine genetic algorithms with local search to refine individuals. They are inspired by Richard Dawkins's concept of the meme (Dawkins, 1976) and are also called hybrid evolutionary algorithms (Moscato and Cotta, 2003). Memetic algorithms can blend the functioning of a genetic algorithm with other heuristic search techniques such as hill climbing and tabu search.

Starting with the research contributed by Bosworth (Bosworth et al. 1972; Bethke, 1980; Brady, 1985), there is a long tradition of hybridizing evolutionary algorithms with other optimization methods such as hill climbing, simulated annealing, or tabu search (Sinha and Goldberg, 2003). A comprehensive review on this topic has been provided by Grosan and Abraham (2007). Interesting research work focusing directly on memetic algorithms has been contributed by, amongst many others, Blesa et al. (2001), Buriol et al. (2004), Radcliffe and Surry (1994), Digalakis and Margaritis (2002), and Krasnogor and Smith (2005). Further early work on genetic algorithm hybridization can be found in Goldberg (1989), Gorges-Schleuter (1989), and Brown et al. (1989). The success of a genetic algorithm depends strongly on its initial population (Martinez et al. 2004); in that work, Beam Search (BS) was integrated with a genetic algorithm to seed the initial population: BS is run first to find the best product-line design, which is inserted as one member of the initial population, the remaining N-1 members are generated at random, and the genetic algorithm then proceeds as usual until the stopping condition is met.

Hill Climbing
Hill Climbing (HC) is one of the oldest and simplest local search and optimization algorithms for a single objective function f. The hill climbing procedure usually starts either from a random point in the search space or from a point chosen according to some problem-specific criterion. Then, in a loop, the currently known best individual p is modified (with the mutate() operator) to produce one offspring pnew. If this new individual is better than its parent, it replaces it; otherwise, it is discarded. This procedure is repeated until the termination criterion is met. The major problem of hill climbing is premature convergence (Weise, 2011).

HillClimbing(p)
  while (termination criterion not met) {
    pnew ← mutate(p)
    if f(pnew.x) is better than f(p.x)
      p ← pnew
  }
  return p
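A matching MATLAB sketch of this procedure for real-valued minimization is given below. The Gaussian mutate step, the step size sigma and the fixed iteration budget are illustrative assumptions; the paper does not specify them.

% Hill climbing for minimization of fnx, starting from point p.
% mutate(p) is realized here as a small Gaussian perturbation kept inside [lb, ub].
function p = hillClimb(fnx, p, lb, ub, maxIter, sigma)
  fp = fnx(p);
  for k = 1:maxIter                                        % termination criterion: iteration budget
    pnew = min(max(p + sigma .* randn(size(p)), lb), ub);  % mutate(p), clipped to the bounds
    fnew = fnx(pnew);
    if fnew < fp                                           % keep pnew only if it is better
      p = pnew;  fp = fnew;
    end                                                    % otherwise pnew is discarded
  end
end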

Proposed Memetic Algorithm
A genetic algorithm starts from a set of randomly generated solutions and then progresses slowly towards the optimum. If the new chromosomes produced by reproduction in each generation are refined with a heuristic local search such as hill climbing, the resulting algorithm can reach a better optimum, and reach it in less time. The proposed memetic algorithm is defined as:

Procedure MA(fnx, n, r, m, ngen)
  // parameters as in Procedure GA above
  P := generate n individuals at random      // initial generation is generated randomly
  nogen := 1                                 // denotes the current generation
  while nogen <= ngen do {
    // Selection step:
    L := Select(P, n, nogen)                 // n/2 individuals of P selected using any selection method
    // Crossover step:
    S := Crossover(L, n)                     // generates n chromosomes using arithmetic crossover
    // Mutation step:
    Mutation(S, m)                           // inversion of chromosomes with mutation rate m
    // Apply hill climbing on the new chromosomes after crossover and mutation:
    HillS := HillClimbing(S)
    // Replacement step:
    P := HillS
    pb(nogen) := min(fnx(P))                 // store the best individual of this generation
    nogen := nogen + 1
  }
  best := min(pb)                            // best individual over all generations
end proc
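In code, the only difference from the simple GA sketch given in the Introduction is one extra refinement step between mutation and replacement. The sketch below assumes the hillClimb function from the previous section; the local-search budget of 20 iterations and the step size of 5% of the variable range are illustrative choices, not values reported by the authors.

% Memetic refinement: apply hill climbing to every offspring produced by
% crossover and mutation, and use the refined set as the next population.
function P = refineOffspring(fnx, S, lb, ub)
  for i = 1:size(S, 1)
    S(i, :) = hillClimb(fnx, S(i, :), lb, ub, 20, 0.05 * (ub - lb));
  end
  P = S;                                     % replacement step of the memetic algorithm
end

Inside the generation loop of simpleGA, the replacement line P = S; would then become P = refineOffspring(fnx, S, lb, ub);.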

Results and Discussion

(i) Experimental Set-up
In this paper, four different functions are examined in order to compare the performance of the simple genetic algorithm and the proposed memetic algorithm. Table 1 lists the four test functions: their names and types.


Table 1. Description of functions used in the experiments

Function   Name                    Type
F1         Sphere Function         Unimodal
F2         Rosenbrock's Function   Unimodal
F3         Rastrigin's Function    Multimodal
F4         Schwefel's Function     Multimodal

The first two functions are De Jong's unimodal functions (a single optimum), whereas the other two are multimodal functions (containing many local optima but only one global optimum). Sphere [F1] is a simple quadratic function: smooth, unimodal, strongly convex and symmetric (Digalakis and Margaritis, 2001; Salomon, 1996).

Rosenbrock [F2] is considered difficult because it has a very narrow ridge. The tip of the ridge is very sharp, and it runs around a parabola; the global optimum lies inside a long, narrow, parabola-shaped flat valley (Digalakis and Margaritis, 2001; Salomon, 1996).

Schwefel's function [F4] is deceptive in that the global minimum is geometrically distant, over the parameter space, from the next best local minima. Search algorithms are therefore potentially prone to converging in the wrong direction (Digalakis and Margaritis, 2001; Salomon, 1996).

F1(x) = \sum_{i=1}^{n} x_i^2
-5.12 ≤ x(i) ≤ 5.12; global minimum: F1(x) = 0 at x(i) = 0, i = 1:n

F2(x) = \sum_{i=1}^{n-1} [ 100 (x_{i+1} - x_i^2)^2 + (x_i - 1)^2 ]
-2.048 ≤ x(i) ≤ 2.048; global minimum: F2(x) = 0 at x(i) = 1, i = 1:n


F3(x) = 10n + \sum_{i=1}^{n} ( x_i^2 - 10 \cos(2\pi x_i) )
-5.12 ≤ x(i) ≤ 5.12; global minimum: F3(x) = 0 at x(i) = 0, i = 1:n

F4(x) = -\sum_{i=1}^{n} x_i \sin( \sqrt{|x_i|} )
-500 ≤ x(i) ≤ 500; global minimum: F4(x) = -n · 418.9829 at x(i) = 420.9687, i = 1:n
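For reference, the four benchmarks can be written directly in MATLAB as anonymous functions of a row vector x; these follow the formulas and bounds above and are not taken from the authors' code.

% Benchmark functions F1-F4; each takes a row vector x and returns a scalar.
F1 = @(x) sum(x.^2);                                                      % Sphere,     -5.12 <= xi <= 5.12
F2 = @(x) sum(100*(x(2:end) - x(1:end-1).^2).^2 + (x(1:end-1) - 1).^2);   % Rosenbrock, -2.048 <= xi <= 2.048
F3 = @(x) 10*numel(x) + sum(x.^2 - 10*cos(2*pi*x));                       % Rastrigin,  -5.12 <= xi <= 5.12
F4 = @(x) -sum(x .* sin(sqrt(abs(x))));                                   % Schwefel,   -500 <= xi <= 500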

The following parameters are used in this implementation:
• Population size (N): 20, 50, 100
• Number of generations (ngen): 100 and 500
• Selection method: Roulette Wheel Selection (RWS)
• Crossover operator: arithmetic crossover (alpha = 0.35)
• Mutation: uniform, with mutation probability 0.1%
• Algorithm ending criterion: execution stops on reaching ngen generations
• Fitness function: objective value of the function
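The three operators with these settings could be sketched in MATLAB as below. These are illustrative helpers for the simpleGA sketch in the Introduction, not the authors' code; in particular, the inverse-fitness weighting used to adapt roulette wheel selection to minimization is an assumed convention, since the paper does not state it.

% Roulette wheel selection for minimization: lower fitness gets a larger slice.
function L = rouletteSelect(P, f)
  w = 1 ./ (f - min(f) + eps);              % turn minimization fitness into positive weights
  edges = cumsum(w / sum(w));               % cumulative probability edges
  k = floor(size(P, 1) / 2);                % keep n/2 parents, as in the pseudocode
  L = zeros(k, size(P, 2));
  for i = 1:k
    L(i, :) = P(find(rand() <= edges, 1, 'first'), :);
  end
end

% Arithmetic crossover: each offspring is a convex combination of two random parents.
function S = arithmeticCrossover(L, n, alpha)   % alpha = 0.35 in these experiments
  S = zeros(n, size(L, 2));
  for i = 1:n
    idx = randi(size(L, 1), 1, 2);          % pick two parents (possibly identical)
    S(i, :) = alpha * L(idx(1), :) + (1 - alpha) * L(idx(2), :);
  end
end

% Uniform mutation: each gene is reset to a uniform random value with probability m.
function S = uniformMutation(S, m, lb, ub)  % m = 0.001 for the 0.1% rate listed above
  mask = rand(size(S)) < m;
  S(mask) = lb + (ub - lb) .* rand(nnz(mask), 1);
end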

Table 2. Minimum values for F1

                    N = 20       N = 50       N = 100
Gen = 100   SGA     0.012        0.0016       1.03E-08
            MA      0.00883      3.72E-09     3.82E-08
Gen = 500   SGA     0.057        3.41E-06     5.13E-10
            MA      3.42E-07     1.41E-08     5.37E-15




(ii) Experimental Results
The average and minimum values of each function were recorded and examined for further analysis.

[Two figures (not reproduced): minimum fitness of each generation from the MATLAB simulation runs.]

Table 3. Minimum values for F2

                    N = 20       N = 50       N = 100
Gen = 100   SGA     1.15E+00     2.90E-01     0.45
            MA      2.90E-01     5.60E-03     0.0096
Gen = 500   SGA     0.56         0.45         0.024
            MA      0.0011       0.0048       0.0005

[Two figures (not reproduced): minimum fitness of each generation from the MATLAB simulation runs.]


Table 4. Minimum values for F3

                    N = 20       N = 50       N = 100
Gen = 100   SGA     3.90E-02     3.89E-02     1.87E-04
            MA      3.75E-06     5.32E-08     1.79E-08
Gen = 500   SGA     0.0049       1.98E-04     7.38E-07
            MA      2.90E-07     1.39E-08     6.32E-10

Table 5. Minimum values for F4

                    N = 20       N = 50       N = 100
Gen = 100   SGA     -5.45E+02    -4.78E+02    -3.86E+02
            MA      -7.50E+02    -7.78E+02    -7.90E+02
Gen = 500   SGA     -656         -8.34E+02    -8.10E+02
            MA      -8.12E+02    -7.99E+02    -8.16E+02

[Two figures (not reproduced): minimum fitness of each generation from the MATLAB simulation runs.]



Conclusions
While solving any optimization problem, one must keep in mind that a single method for all optimization problems may never be found; the solution has to be tuned to the nature of the problem at hand. Sometimes local search is better, while global search has fewer chances of getting stuck in local optima. A combination of the two schemes therefore gives a balanced search method that has the speed of local search and the accuracy of global search. When a genetic algorithm generates new children for the next generation, tuning them within their local neighbourhood is a promising addition to the genetic algorithm's life cycle, and this incorporation turns the GA into a memetic algorithm. In this paper, the memetic algorithm shows better results in comparison with the simple GA, so further exploitation of this hybridization may lead to even better solutions.

Conflict of interests
The authors declare that they have no conflict of interests.

References
1. Holland J. Adaptation in Natural and Artificial Systems. University of Michigan Press, Ann Arbor, 1975.
2. Goldberg D. E. Genetic Algorithms in Search, Optimization, and Machine Learning. Addison Wesley Longman, Inc., ISBN 0-201-15767-5, 1989.
3. Dawkins R. The Selfish Gene. Oxford University Press, Oxford, 1976.
4. Moscato P. and Cotta C. "A gentle introduction to memetic algorithms". Handbook of Metaheuristics, 2003, pp. 105-144.
5. Bosworth Jack, Foo Norman, and Zeigler Bernard P. "Comparison of Genetic Algorithms with Conjugate Gradient Methods". Technical Report 00312-1-T, University of Michigan, Ann Arbor, MI, USA, February 1972.
6. Bethke Albert Donally. "Genetic Algorithms as Function Optimizers". PhD thesis, University of Michigan, Ann Arbor, MI, USA, 1980.
7. Brady R. M. "Optimization Strategies Gleaned from Biological Evolution". Nature, 317(6040): 804-806, October 31, 1985. doi: 10.1038/317804a0.
8. Sinha Abhishek and Goldberg D. E. "A Survey of Hybrid Genetic and Evolutionary Algorithms". IlliGAL Report 2003-2004, Illinois Genetic Algorithms Laboratory (IlliGAL), Department of Computer Science, Department of General Engineering, University of Illinois at Urbana-Champaign, Urbana-Champaign, IL, USA, January 2003.
9. Grosan Crina and Abraham Ajith. "Hybrid Evolutionary Algorithms: Methodologies, Architectures, and Reviews", pages 1-17. Springer-Verlag, Berlin/Heidelberg, 2007.
10. Blesa Maria J., Moscato Pablo, and Xhafa Fatos. "A Memetic Algorithm for the Minimum Weighted k-Cardinality Tree Subgraph Problem". 4th Metaheuristics International Conference, pages 85-90, 2001.
11. Buriol Luciana, França Paulo M., and Moscato Pablo. "A New Memetic Algorithm for the Asymmetric Traveling Salesman Problem". Journal of Heuristics, 10(5): 483-506, September 2004.
12. Radcliffe Nicholas J. and Surry Patrick David. "Formal Memetic Algorithms". International Workshop on Evolutionary Computing, Selected Papers, pages 1-16, 1994.
13. Digalakis J. G. and Margaritis K. G. "An Experimental Study of Benchmarking Functions for Genetic Algorithms". International Journal of Computer Mathematics, 79(4): 403-416, 2002.
14. Krasnogor Natalio and Smith James E. "A Tutorial for Competent Memetic Algorithms: Model, Taxonomy, and Design Issues". IEEE Transactions on Evolutionary Computation (IEEE-EC), October 2005.
15. Gorges-Schleuter Martina. "ASPARAGOS: An Asynchronous Parallel Genetic Optimization Strategy". Proceedings of the 3rd International Conference on Genetic Algorithms, pages 422-427, 1989.
16. Brown Donald E., Huntley Christopher L., and Spillane Andrew R. "A Parallel Genetic Heuristic for the Quadratic Assignment Problem". Proceedings of the 3rd International Conference on Genetic Algorithms, pages 406-415, 1989.
17. Martinez-Estudillo A., Hervás-Martínez C., Martínez-Estudillo F., and García-Pedrajas N. "Hybrid method based on clustering for evolutionary algorithms with local search". IEEE Transactions on Systems, Man and Cybernetics, 2004.
18. Weise Thomas. Global Optimization Algorithms: Theory and Application. 3rd Edition, 2011.
19. Digalakis Jason and Margaritis Konstantinos. "A Parallel Memetic Algorithm for Solving Optimization Problems". 4th Metaheuristics International Conference, pages 121-125, 2001.
20. Salomon R. "Re-evaluating genetic algorithm performance under coordinate rotation of benchmark functions: A survey of some theoretical and practical aspects of genetic algorithms". BioSystems, 39: 263-278, Elsevier, 1996.

Article Information
Received: 12 February 2015. Accepted: 18 March 2015. Online published: 25 March 2015.

Cite this article as: G Gopal et al. 2015. Effect of hill climbing in GA after reproduction for solving optimization problems. International Journal of Extensive Research. Vol. 3: 79-86.

