Free Search Differential Evolution

Mahamed G. H. Omran and Andries P. Engelbrecht

Manuscript received September 20, 2008. M. G. Omran is with the Department of Computer Science, Gulf University for Science and Technology, Kuwait (phone: +965-886644; e-mail: [email protected]). A. P. Engelbrecht is with the Department of Computer Science, University of Pretoria, Pretoria, South Africa (e-mail: [email protected]).

Abstract— Free Search Differential Evolution (FSDE) is a new,

population-based meta-heuristic algorithm that is a hybrid of

concepts from Free Search (FS), Differential Evolution (DE)

and opposition-based learning. The performance of the

proposed approach is investigated and compared with DE and

one of the recent variants of DE when applied to ten

benchmark functions. The experiments conducted show that

FSDE provides excellent results with the added advantage of no

parameter tuning.

I. INTRODUCTION

POPULATION-based algorithms have been successfully

applied to a wide range of optimization problems, for

example, image processing, pattern recognition, scheduling, and engineering design [1], [2].

Free Search (FS) [3] and Differential Evolution (DE) [4]

are two stochastic, population-based optimization methods.

The performance of these methods is greatly influenced by

their control parameters. The performance of FS is affected

by the boundaries of the frame of a neighborhood space and

the number of steps per exploration walk. Furthermore, FS is

adversely affected by the presence of noise. The

performance of DE, on the other hand, is influenced mainly

by the scale parameter and the probability of recombination.

Although recommendations for values of these parameters

have been made in the literature (based on empirical studies)

[5], these values are not universally applicable. The best

values for DE control parameters remain problem

dependent, and need to be fine tuned for each problem.


This paper proposes an effective population-based meta-

heuristic approach integrating concepts from FS, DE and

opposition-based learning [6]. The proposed algorithm is

called Free Search Differential Evolution (FSDE). FSDE

requires no parameter tuning (except for the population size

and maximum number of iterations). FSDE is compared

against DE and one of its recent variants [7] on ten

benchmark functions.

The remainder of the paper is organized as follows: FS is

summarized in Section II. DE is discussed in Section III.

Opposition based learning (OBL) is briefly reviewed in

Section IV. FSDE is presented in Section V. Section VI

presents and discusses the results of the experiments.

Finally, Section VII concludes the paper.

II. FREE SEARCH

Free Search (FS) is a recent, population-based method

proposed in [3]. FS mimics the behavior of animals in

nature and their daily exploration for good conditions. In FS,

a population of search agents, called animals, starts random walks from random (or predefined) locations in the search space.

During the walk, animals remember the best location they

have found. Each animal has a sense and uses it to choose

the starting location for the next walk. Different animals

have different senses and the sense varies during the

optimization process. The concept of the sense is one of the

main peculiarities of FS that distinguishes it from other

population-based methods. The sense enables FS to

distinguish between the search agents and the found

solutions. It enables animals to orient themselves within the

search space. The animal’s sense is influenced by the

experience (of the animal itself or other animals) of the

previous iterations but it does not restrict the animal’s ability

to explore any other location. In FS, animals use previous

experience as a guide but not as a rule.


According to [3], FS generally performs better than DE

and Particle Swarm Optimization (PSO) [8] when applied to

five problems with different characteristics and

complexities.

However, there are some drawbacks of FS, one of which

is that FS requires the user to specify two important

parameters (in addition to the population size and the

number of iterations). These parameters are the boundaries

of the frame of a neighborhood space and the number of

steps per exploration walk. These parameters affect the

performance of FS. In addition, the performance of FS

degrades in the presence of noise [3]. Animals also

perform random walks without utilizing the diversity of the

population. Moreover, FS generally has slower convergence

than DE [3]. Furthermore, FS is not easy to implement: the authors attempted to implement the method, but the results were too poor to use. When contacted, Dr. Kalin Penev stated that implementing FS requires key knowledge not available in the conventional literature, and he did not agree to share his code; this is why FSDE is not compared with FS in this paper. Finally, the performance of FS, when applied to real

engineering problems, needs to be investigated.

III. DIFFERENTIAL EVOLUTION

Differential evolution (DE) is an evolutionary population-

based algorithm proposed in [4]. Although DE has some

similarities with other evolutionary algorithms (EA), it

differs significantly in the sense that distance and direction

information from the current population is used to guide the

search process. DE uses the differences between randomly

chosen vectors (individuals) as the source of random

variations for a third vector (individual), referred to as the

target vector. Trial solutions are generated by adding

weighted difference vectors to the target vector. This process

is referred to as the mutation operator where the target

vector is mutated. A recombination, or crossover step is then

applied to produce an offspring which is only accepted if it

improves on the fitness of the parent individual.
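To make these operators concrete, the following is a minimal MATLAB sketch of one DE/rand/1/bin generation; the names P (one column per individual), F, Cr and the fitness handle fit are illustrative assumptions for this sketch, not notation from [4].

function P = de_generation(P, fit, F, Cr)
% One generation of DE/rand/1/bin for a D-by-N population (minimization).
% Assumes N >= 4.
[D, N] = size(P);
for j = 1:N
    % pick three mutually different individuals, all different from j
    r = randperm(N, 4);
    r(r == j) = [];
    r = r(1:3);
    v = P(:,r(1)) + F * (P(:,r(2)) - P(:,r(3)));  % mutation: weighted difference
    u = P(:,j);
    jrand = randi(D);           % guarantees at least one mutated component
    for i = 1:D
        if rand < Cr || i == jrand
            u(i) = v(i);        % binomial (uniform) crossover
        end
    end
    if fit(u) <= fit(P(:,j))    % greedy selection: keep offspring only if no worse
        P(:,j) = u;
    end
end
end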

Empirical studies have shown that DE performance is

sensitive to its control parameters [9],[10]. Performance can

be greatly improved if parameter values are optimized. A

number of DE strategies have been developed where values

for control parameters adapt dynamically [11]-[15]. Another

problem with DE is that it is not rotationally invariant (i.e.

the performance of DE depends on the orientation of the

coordinate system in which the objective function is

evaluated). The main reason for this problem is the

crossover. Only when the crossover rate Cr = 1 (i.e. mutation only, with no crossover) does DE become rotationally invariant [2]. However, using Cr = 1 generally degrades the

performance of DE. Because DE is not rotationally

invariant, empirical results show that it has significant

difficulty on non-separable functions [16]. Moreover, if all individuals that take part in the reproduction process lie on a flat surface, DE stagnates: fitness will not improve [3].

For recent advances in DE, the reader is referred to [17].

IV. OPPOSITION-BASED LEARNING

Opposition-based learning (OBL) was first introduced by [6]

and was successfully applied to several problems [18].

Opposite numbers are defined as follows:

Let $x \in [a, b]$; then the opposite number $x'$ is defined as

$$x' = a + b - x$$

The above definition extends to higher dimensions as follows. Let $P = (x_1, x_2, \ldots, x_n)$ be an $n$-dimensional vector, where $x_i \in [a_i, b_i]$, $i = 1, 2, \ldots, n$. The opposite vector of $P$ is defined by $P' = (x_1', x_2', \ldots, x_n')$, where

$$x_i' = a_i + b_i - x_i$$
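As a quick numeric illustration (a sketch; the values are examples, not taken from the paper):

% opposite of a point x in the box [a, b], computed element-wise
% per the definition above
a = [-5; -5; -5]; b = [5; 5; 5];
x = [1; -2; 4];
x_opp = a + b - x;   % gives [-1; 2; -4]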

V. FREE SEARCH DIFFERENTIAL EVOLUTION

In this paper, a new population-based optimization approach

is proposed. The new approach, called Free Search

Differential Evolution (FSDE), is based on concepts from FS

(the concept of the sense), DE (its mutation operator) and

OBL. FSDE addresses the drawbacks of FS and DE without

requiring any extra parameters.


FSDE starts with a set of random locations whose fitness is evaluated. These initial locations are marked as the best solutions found so far. The senses of a set

of animals (the same size as the locations set) are randomly

chosen. Animals can conduct a local or global search (as will

be explained in detail below). In local search, the animal exploits its marked location using a small step size. In global

search, each animal selects a marked location (if its fitness is

better than the animal’s sense) and explores around the

marked location. In both cases, the animal updates the

marked location if it finds a better location. Based on the

current iteration number, the algorithm works such that

global search is favored at the beginning of the run while

local search is favored at the end of the search. This process

is repeated until a stopping criterion is met.

The FSDE algorithm is described in more detail below:

1. A population, X, of n potential solutions, called locations, is randomly chosen,

$$x_{i,j} = LB_j + r_j (UB_j - LB_j)$$

where $r_j \sim U(0,1)$, $x_{i,j} \in [LB_j, UB_j]$, $i = 1, 2, \ldots, n$, $j = 1, 2, \ldots, N_d$, and $N_d$ is the dimension of the problem.

2. Calculate the fitness of the locations, $f(x_i)$, $i = 1, 2, \ldots, n$.

3. Set the marked locations to the initial solutions, $p_i = x_i$ and $f(p_i) = f(x_i)$, $i = 1, 2, \ldots, n$.

4. Assuming minimization, normalize the fitness values of each marked location,

$$nf(i) = \frac{f(i) - f_{min}}{f_{max} - f_{min}}$$

where $i = 1, 2, \ldots, n$, $f_{min}$ is the fitness of the best marked location and $f_{max}$ is the fitness of the worst marked location, so that $nf$ is close to 0 for high-quality locations. This step is performed only when $f_{max} \neq f_{min}$.

5. The senses of the n animals are randomly chosen between 0 and 1,

$$s(i) \sim U(0,1), \quad i = 1, 2, \ldots, n$$

6. For each animal, $i = 1, 2, \ldots, n$, a location, $x_i$, is explored (see the MATLAB fragment after this list),

$$x_i = \begin{cases} p_i + N(0,1), & \text{if } f_{min} = f_{max} \text{ or rand} < \dfrac{g}{G} \\[4pt] p_k + \ln\!\left(\dfrac{1}{\text{rand}}\right)\left(x_l - x_m\right), & \text{otherwise} \end{cases}$$

where $k$ is the index of a marked location with a normalized fitness better than the sense of the current animal (i.e. $nf(k) \leq s(i)$), $l, m \sim U(1,n)$ with $l \neq m \neq k$, $g$ is the current iteration, $G$ is the maximum number of iterations, and rand $\sim U(0,1)$. Note that $N(0,1)$ represents a single scalar sample of Gaussian noise.

7. For each marked location $i$, replace $p_i$ with $x_i$ if $f(x_i) \leq f(p_i)$; otherwise, keep $p_i$ intact.

8. Find the worst marked location, $p_b$ (the one with the largest objective value), and find its random opposite,

$$x_{b,j} = LB_j + UB_j - r_j\, p_{b,j}$$

where $r_j \sim U(0,1)$.

9. Replace $p_b$ with $x_b$ if $f(x_b) \leq f(p_b)$.

10. Repeat steps 4 to 9 G times.
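The core of Step 6, as realized in the appendix implementation, is the following MATLAB fragment (EQUAL flags the case $f_{min} = f_{max}$, and w plays the role of rand in the exploration branch):

if rand < g/G || EQUAL == 1
    % local search: small Gaussian step around the animal's own marked location
    x(:,j) = p_x(:,j) + randn;
else
    % global search: DE-style weighted difference added to an acceptable
    % marked location p_x(:,k), with weight ln(1/w), w ~ U(0,1), w > 0
    x(:,j) = p_x(:,k) + (x(:,i1) - x(:,i2)) * log(1/w);
end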

In Step 5, the senses of the animals are randomly chosen

such that animals with a large sensibility value (i.e. close to

1) can select marked locations with low quality, while

animals with a small sensibility value (i.e. close to 0) explore high-quality marked locations. Note that the animals’ senses

are chosen randomly for each iteration. Thus, as in FS, an

animal can select any marked location that suits its sense.
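In the appendix implementation, this sense-based choice is a simple rejection loop: a marked location k is drawn repeatedly until its normalized fitness is acceptable to animal j.

S(j) = rand;
while (true)
    k = ceil(N*rand());
    if (nf(k) <= S(j))
        break;
    end
end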

In Step 6, FSDE performs either a local search (i.e. exploitation) or a global search (i.e. exploration). Local search is conducted if the fitness of all marked locations is the same or if a random number is less than g/G. The value of g/G is small (close to zero) at the beginning of the run and close to 1 at the end of the run. Thus, exploration is favored at the beginning of the search while exploitation is preferred at the end. For example, with G = 1000, an animal performs local search with probability 0.1 at iteration g = 100 but with probability 0.9 at g = 900. Exploitation is achieved by adding/subtracting a small random number to the current animal’s marked location. Exploration, on the other hand, is done by adding the weighted difference between two randomly chosen current animal locations to a marked location whose fitness is better than the sense of the current animal. Furthermore, local search is performed when all the


marked locations are the same. This local search is useful when the landscape of the function to be optimized is relatively flat (as in the Rosenbrock function described in the next section). In DE, if all of the individuals are located on a flat area of the search space, they will all remain in that area: an offspring is just a linear combination of three individuals and a discrete recombination with a fourth, so there is no way for the individuals to escape the flat surface. Thus, the local search addresses one of the drawbacks of DE (i.e. dealing with flat functions).

The rationale behind Step 8 is the basic idea of opposition-based learning: if a random guess is very far away from the existing solution (in the worst case, in the opposite location), then the algorithm should search in the opposite direction. For FSDE, the guess that is “very far away from the existing solution” is the worst marked location. However, rather than going directly to the opposite location, a random number is used to explore the area between the worst marked location and its opposite.
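In the appendix implementation, Step 8 reads:

% find the worst marked location (b is its index) and probe a random
% point between it and its opposite
[bf, b] = max(p_f);
x(:,b) = LB + UB - rand*p_x(:,b);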

Note that FSDE does not require any parameter to be

specified by the user (except for the population size and the

maximum number of iterations). In addition, it addresses the

drawbacks of FS and DE mentioned in Sections II and III. The MATLAB code of FSDE is shown in the Appendix.

VI. EXPERIMENTAL RESULTS

This section compares the performance of FSDE with that of

DE (DE/rand/1/bin) and one recent variant of DE, Barebones

DE (BBDE) [7]. The BBDE algorithm combines the

strengths of both PSO and DE to form a new, efficient, almost parameter-free hybrid optimization algorithm. Omran et al. [7] show that BBDE generally performed better than

DE and two variants of barebones PSO [19] when applied to

a set of benchmark functions. For DE, F = 0.5 and Cr = 0.9,

as suggested in [5]. For all algorithms used in this section,

n = 50. All functions were implemented in 30 dimensions.

The initial population was generated from a uniform

distribution in the ranges specified below.

The following functions have been used to compare the

performance of FSDE with that of DE and BBDE. These

benchmark functions provide a balance of unimodal,

multimodal, separable, non-separable and noisy functions.

For each of these functions, the goal is to find the global

minimizer, formally defined as

Given $f: \mathbb{R}^{N_d} \rightarrow \mathbb{R}$, find $\mathbf{x}^* \in \mathbb{R}^{N_d}$ such that $f(\mathbf{x}^*) \leq f(\mathbf{x}), \ \forall \mathbf{x} \in \mathbb{R}^{N_d}$.

The following functions were used:

A. Sphere function, defined as

$$f(\mathbf{x}) = \sum_{i=1}^{N_d} x_i^2$$

where $\mathbf{x}^* = \mathbf{0}$ and $f(\mathbf{x}^*) = 0$ for $-100 \leq x_i \leq 100$.

B. Rosenbrock function, defined as

$$f(\mathbf{x}) = \sum_{i=1}^{N_d - 1} \left( 100\left(x_{i+1} - x_i^2\right)^2 + \left(x_i - 1\right)^2 \right)$$

where $\mathbf{x}^* = \mathbf{1}$ and $f(\mathbf{x}^*) = 0$ for $-2 \leq x_i \leq 2$.

C. Rotated hyper-ellipsoid function, defined as

$$f(\mathbf{x}) = \sum_{i=1}^{N_d} \left( \sum_{j=1}^{i} x_j \right)^2$$

where $\mathbf{x}^* = \mathbf{0}$ and $f(\mathbf{x}^*) = 0$ for $-100 \leq x_i \leq 100$.

D. Step function, defined as

$$f(\mathbf{x}) = \sum_{i=1}^{N_d} \left( \lfloor x_i + 0.5 \rfloor \right)^2$$

where $\mathbf{x}^* = \mathbf{0}$ and $f(\mathbf{x}^*) = 0$.

E. Rastrigin function, defined as

$$f(\mathbf{x}) = \sum_{i=1}^{N_d} \left( x_i^2 - 10\cos(2\pi x_i) + 10 \right)$$

where $\mathbf{x}^* = \mathbf{0}$ and $f(\mathbf{x}^*) = 0$.

F. Normalized Schwefel function, defined as

$$f(\mathbf{x}) = \frac{-\sum_{i=1}^{N_d} x_i \sin\left(\sqrt{|x_i|}\right)}{N_d}$$

where $\mathbf{x}^* = (420.9687, \ldots, 420.9687)$ and $f(\mathbf{x}^*) = -418.982887$ for $-512 \leq x_i \leq 512$.


G. Griewank function, defined as

$$f(\mathbf{x}) = \frac{1}{4000}\sum_{i=1}^{N_d} x_i^2 - \prod_{i=1}^{N_d} \cos\left(\frac{x_i}{\sqrt{i}}\right) + 1$$

where $\mathbf{x}^* = \mathbf{0}$ and $f(\mathbf{x}^*) = 0$.

H. Salomon function, defined as

$$f(\mathbf{x}) = -\cos\left(2\pi \sqrt{\sum_{i=1}^{N_d} x_i^2}\right) + 0.1\sqrt{\sum_{i=1}^{N_d} x_i^2} + 1$$

where $\mathbf{x}^* = \mathbf{0}$ and $f(\mathbf{x}^*) = 0$ for $-100 \leq x_i \leq 100$.

I. Norwegian function, defined as

$$f(\mathbf{x}) = \prod_{i=1}^{N_d} \cos\left(\pi x_i^3\right) \left( \frac{99 + x_i}{100} \right)$$

where $\mathbf{x}^* = \mathbf{1}$ and $f(\mathbf{x}^*) = -1$ for $-1.1 \leq x_i \leq 1.1$.

This function has a local optimum at -0.98.

J. Quartic function (i.e. the noisy quartic), defined as

$$f(\mathbf{x}) = \sum_{i=1}^{N_d} i\, x_i^4 + \text{random}[0, 1)$$

where $\mathbf{x}^* = \mathbf{0}$ and $f(\mathbf{x}^*) = 0$ for $-1.28 \leq x_i \leq 1.28$.

Sphere, Rosenbrock and Rotated hyper-ellipsoid are

unimodal, while the Step function is a discontinuous

unimodal function. Rastrigin, Normalized Schwefel,

Griewank, Salomon and Norwegian functions are difficult

multimodal functions. Finally, the Quartic function is a

noisy function. Rosenbrock, Rotated hyper-ellipsoid and

Salomon functions are non-separable functions.
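For reference, a few of the benchmarks above can be coded as MATLAB anonymous functions; this is a sketch for illustration (the paper's own test harness is not listed):

% a column vector x of dimension Nd is assumed; quartic is the noisy benchmark
sphere  = @(x) sum(x.^2);
rosen   = @(x) sum(100*(x(2:end) - x(1:end-1).^2).^2 + (x(1:end-1) - 1).^2);
salomon = @(x) -cos(2*pi*sqrt(sum(x.^2))) + 0.1*sqrt(sum(x.^2)) + 1;
quartic = @(x) sum((1:numel(x))' .* x.^4) + rand;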

The results reported in this section are averages and

standard deviations over 30 simulations. Each simulation

was allowed to run for 50 000 evaluations of the objective

function. The statistically significant best solutions have

been shown in bold (using the z-test with α = 0.05).

Table I summarizes the results obtained by applying DE,

BBDE and FSDE to the benchmark functions. The results

show that FSDE outperformed DE in seven out of ten

benchmark functions. In the remaining three functions, no

statistically significant difference exists between DE and

FSDE. On the other hand, FSDE outperformed BBDE on

eight out of ten benchmark functions. No statistically

significant difference exists in the remaining two functions.

The results of the Quartic function show that FSDE works

well in the presence of noise. The likely reason for the relatively poor performance of DE when applied to the

Rosenbrock function is that this function has a relatively

smooth (flat) landscape where it is difficult to differentiate

the optimum from similar neighborhood locations. Thus,

Rosenbrock requires precise local search [3]. FSDE provides this local search, which explains its excellent result. For the Norwegian function, both

DE and BBDE were trapped in a local optimum while FSDE

reached the vicinity of the global optimum.

The number of function evaluations (FEs) required to

reach an error value less than $10^{-6}$ (subject to the maximum budget of 50,000 FEs) was recorded over the 30 runs, and the mean and standard deviation of the FEs are shown in Table I in square brackets. FEs can be used to

compare the convergence speed of the different methods. A

smaller FE count means faster convergence. An FE count equal to 50,000, on the other hand, indicates that the algorithm did not converge to the global optimum. Table I shows that FSDE generally reached good solutions at least as fast as DE and BBDE on all the benchmark functions (except for the Sphere function, where BBDE was

the fastest approach). Figure 1 illustrates results for selected

functions. The figure generally shows that FSDE reached

good solutions faster than both DE and BBDE.

VII. CONCLUSIONS

This paper investigated a new population-based meta-

heuristic algorithm, called FSDE, as a hybrid of Free Search,

Differential Evolution and opposition-based learning that

requires no parameter tuning. The approach was tested on

ten benchmark functions. These benchmark functions

provide a balance of unimodal, multimodal, separable, non-

separable and noisy functions. The results show that FSDE

generally outperformed Differential Evolution and one of its

recent variants.

Future work will investigate the performance of FSDE

when applied to real engineering optimization problems.


REFERENCES

[1] A. Engelbrecht, Fundamentals of Computational Swarm Intelligence, Wiley & Sons, 2005.

[2] K. Price, R. Storn and J. Lampinen, Differential Evolution: A Practical Approach to Global Optimization, Springer, 2005.

[3] K. Penev and G. Littlefair, "Free search - a comparative analysis," Information Sciences, vol. 172, pp. 173-193, 2005.

[4] R. Storn and K. Price, "Differential evolution - a simple and efficient adaptive scheme for global optimization over continuous spaces," Technical Report TR-95-012, International Computer Science Institute, 1995.

[5] R. Storn and K. Price, "Differential evolution - a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341-359, 1997.

[6] H. Tizhoosh, "Opposition-based learning: a new scheme for machine intelligence," in Proceedings of the International Conference on Computational Intelligence for Modelling, Control and Automation, vol. I, pp. 695-701, 2005.

[7] M. Omran, A. Engelbrecht and A. Salman, "Bare bones differential evolution," European Journal of Operational Research, vol. 196, no. 1, pp. 128-139, 2009.

[8] J. Kennedy and R. C. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942-1948, 1995.

[9] J. Liu and J. Lampinen, "A fuzzy adaptive differential evolution algorithm," in Proceedings of the IEEE International Region 10 Conference, pp. 606-611, 2002.

[10] R. Gämperle, S. Müller and P. Koumoutsakos, "A parameter study for differential evolution," in Proceedings of the WSEAS International Conference on Advances in Intelligent Systems, Fuzzy Systems, Evolutionary Computation, pp. 293-298, 2002.

[11] H. Abbass, R. Sarker and C. Newton, "PDE: a Pareto-frontier differential evolution approach for multi-objective optimization problems," in Proceedings of the IEEE Congress on Evolutionary Computation, vol. 2, pp. 971-978, 2001.

[12] M. Omran, A. Salman and A. Engelbrecht, "Self-adaptive differential evolution," in Lecture Notes in Artificial Intelligence, vol. 3801, pp. 192-199, 2005.

[13] A. Qin and P. Suganthan, "Self-adaptive differential evolution algorithm for numerical optimization," in Proceedings of the IEEE Congress on Evolutionary Computation, vol. 2, pp. 1785-1791, 2005.

[14] J. Rönkkönen, S. Kukkonen and K. V. Price, "Real-parameter optimization with differential evolution," in Proceedings of the IEEE Congress on Evolutionary Computation, vol. 1, pp. 506-513, 2005.

[15] J. Zhang and A. Sanderson, "JADE: self-adaptive differential evolution with fast and reliable convergence performance," in Proceedings of the IEEE Congress on Evolutionary Computation, pp. 2251-2258, 2007.

[16] A. Sutton, M. Lunacek and L. Whitley, "Differential evolution and non-separability: using selective pressure to focus search," in Proceedings of GECCO '07, pp. 1428-1435, 2007.

[17] U. Chakraborty (Ed.), Advances in Differential Evolution, Springer, Berlin, 2008.

[18] S. Rahnamayan, H. Tizhoosh and M. Salama, "Opposition-based differential evolution," IEEE Transactions on Evolutionary Computation, vol. 12, no. 1, pp. 107-125, 2008.

[19] J. Kennedy, "Bare bones particle swarms," in Proceedings of the IEEE Swarm Intelligence Symposium, pp. 80-87, 2003.


TABLE I
MEAN AND STANDARD DEVIATION (±SD) OF THE FUNCTION OPTIMIZATION RESULTS; MEAN (±SD) FES TO REACH AN ERROR BELOW 10^-6 IN SQUARE BRACKETS

Function | DE | BBDE | FSDE
Sphere | 0(0) [33640.0(900.899933)] | 0(0) [9353.333333(354.024481)] | 0(0) [18770.4(2616.040422)]
Rosenbrock | 26.074928(1.363842) [50000(0)] | 25.827062(0.229061) [50000(0)] | 0.006032(0.007199) [50000(0)]
Rotated hyper-ellipsoid | 21.042148(13.992104) [50000(0)] | 11.432795(12.574429) [50000(0)] | 0(0) [35651.766667(13844.341490)]
Step | 0(0) [15368.333333(1789.600386)] | 0(0) [4113.333333(181.437428)] | 0(0) [2715.6(642.727464)]
Rastrigin | 157.344667(19.904304) [50000(0)] | 35.307758(24.740874) [49608.333333(1386.019066)] | 0(0) [15443.5(2974.330811)]
Normalized Schwefel | -387.100406(79.247509) [50000(0)] | -293.658641(16.720209) [50000(0)] | -372.784395(21.654014) [50000(0)]
Griewank | 0.002217(0.004811) [38261.666667(6081.907643)] | 0.001396(0.003811) [15283.333333(13863.872259)] | 0(0) [15543.8(2585.328651)]
Salomon | 0.237346(0.053333) [50000(0)] | 0.156540(0.050401) [50000(0)] | 0(0) [49503.133333(2722.772937)]
Norwegian | -0.778475(0) [50000(0)] | -0.774357(0.000509) [50000(0)] | -0.999785(0.000245) [50000(0)]
Quartic | 0.013377(0.005454) [50000(0)] | 0.003246(0.000903) [50000(0)] | 0.000633(0.000541) [50000(0)]

Fig. 1. Performance comparison of DE, BBDE and FSDE when applied to selected functions.


APPENDIX

FSDE IN MATLAB

function [best_x, best_f, max_FEs, FSDE_run] = FSDE(fun, D, N, LB, UB, G, opt_f)
% fun is the function to be optimized
% D is the function's dimension
% N is the population size
% LB and UB are the lower and upper bounds of the search space
% G is the maximum number of iterations
% opt_f is the global optimum of fun
% (func, called below, is an external helper that evaluates fun at a point)

% initialize the population
x = LB + rand(D,N) .* (UB - LB);
f = zeros(1,N);

% calculate fitness
for j = 1:1:N
    f(j) = func(x(:,j), fun);
end

p_x = zeros(size(x));   % best (marked) locations
p_f = zeros(size(f));   % best location fitness
S = zeros(size(f));     % sense
nf = zeros(size(f));    % normalized fitness

FSDE_run = zeros(1,G);  % best fitness per iteration (an FSDE run)
FEs = 0;
max_FEs = 0;

% initialize the best locations
p_x = x;
p_f = f;

% start FSDE
for g = 1:1:G
    if max(p_f) == min(p_f)
        EQUAL = 1;
    else
        EQUAL = 0;
    end

    if EQUAL == 0
        for j = 1:1:N
            % normalize p_f
            nf(j) = (p_f(j) - min(p_f))/(max(p_f) - min(p_f));
        end
    end

    for j = 1:1:N
        if rand < g/G || EQUAL == 1
            % local search around the animal's own marked location
            x(:,j) = p_x(:,j) + randn;
        else
            S(j) = rand;
            % draw a marked location acceptable to the animal's sense
            while (true)
                k = ceil(N*rand());
                if (nf(k) <= S(j))
                    break;
                end %if
            end %while
            % select 2 mutually different locations
            while (true)
                i1 = ceil(N*rand());
                i2 = ceil(N*rand());
                if (i1 ~= k) && (i1 ~= i2) && (i2 ~= k)
                    break;
                end %if
            end %while
            w = rand;
            while w == 0
                w = rand;
            end
            % create a new location
            x(:,j) = p_x(:,k) + (x(:,i1) - x(:,i2)) * log(1/w);
        end %if

        % check for constraint violations
        for i = 1:1:D
            if x(i,j) > UB
                x(i,j) = UB;
            end
            if x(i,j) < LB
                x(i,j) = LB;
            end
        end %i

        f(j) = func(x(:,j), fun);
        FEs = FEs + 1;

        if f(j) <= p_f(j)
            p_x(:,j) = x(:,j);
            p_f(j) = f(j);
        end %if

        if FEs >= (G*N)
            break;
        end
    end %j

    if (abs(opt_f - min(f)) < 1e-6) && (max_FEs == 0)
        max_FEs = FEs;
    end

    FSDE_run(g) = min(p_f);

    % find the worst location: bf is the fitness value of the worst
    % location and b is its index
    [bf, b] = max(p_f);

    % probe a random point between the worst location and its opposite
    x(:,b) = LB + UB - rand*p_x(:,b);
    f(b) = func(x(:,b), fun);
    FEs = FEs + 1;

    if FEs >= (G*N)
        FSDE_run(g+1:1:G) = min(p_f);
        break;
    end

    % update the worst location
    if f(b) <= p_f(b)
        p_x(:,b) = x(:,b);
        p_f(b) = f(b);
    end %if
end %g

[best_f, bi] = min(p_f);
best_x = p_x(:,bi);

if max_FEs == 0
    max_FEs = FEs;
end
end %FSDE
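The helper func(x, fun) called above is not listed in the paper; a minimal stand-in consistent with the benchmark definitions of Section VI might look as follows (the string labels are assumptions for this sketch, not the authors' mapping):

function y = func(x, fun)
% evaluate benchmark fun (a name string, assumed for this sketch) at point x
switch fun
    case 'sphere'
        y = sum(x.^2);
    case 'rosenbrock'
        y = sum(100*(x(2:end) - x(1:end-1).^2).^2 + (x(1:end-1) - 1).^2);
    case 'rastrigin'
        y = sum(x.^2 - 10*cos(2*pi*x) + 10);
    case 'quartic'   % noisy quartic
        y = sum((1:numel(x))' .* x.^4) + rand;
    otherwise
        error('Unknown benchmark: %s', fun);
end
end

An illustrative call matching the setup of Section VI (30 dimensions, n = 50, and G = 1000 so that roughly G*N = 50,000 evaluations are used) would then be:

[best_x, best_f, max_FEs, run] = FSDE('sphere', 30, 50, -100, 100, 1000, 0);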
