
Constrained Evolutionary Optimization

Yong Wang, Associate Professor, PhD

School of Information Science and Engineering, Central South University

ywang@csu.edu.cn    http://ist.csu.edu.cn/YongWang.htm

The 1st Chinese Workshop on Evolutionary Computation and Learning

Outline of My Talk

• Constrained Optimization Problems
• Constraint-handling Techniques
• Solving Constrained Optimization Problems by Evolutionary Algorithms
• Dynamic Constrained Optimization Problems
• Conclusion


Constrained Optimization Problems (1/2)

• Constrained optimization problems (COPs) can be formulated as follows:

    minimize    f(x),  x = (x_1, ..., x_D) ∈ S
    subject to  g_j(x) ≤ 0,  j = 1, ..., l        (inequality constraints)
                h_j(x) = 0,  j = l+1, ..., m      (equality constraints)

• The degree of constraint violation of an individual x on the jth constraint is defined as:

    G_j(x) = max{0, g_j(x)},         j = 1, ..., l
    G_j(x) = max{0, |h_j(x)| − δ},   j = l+1, ..., m

  where δ is a positive tolerance value for the equality constraints.

• The degree of constraint violation of the individual x:

    G(x) = Σ_{j=1}^{m} G_j(x)
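To make these definitions concrete, the short sketch below evaluates G(x) for a toy problem; the constraint functions, the tolerance value, and all names are illustrative assumptions rather than anything from the talk.

    DELTA = 1e-4  # assumed positive tolerance value for the equality constraints

    def constraint_violation(x, inequality_constraints, equality_constraints):
        # G_j(x) = max{0, g_j(x)} for the inequality constraints
        g_viol = [max(0.0, g(x)) for g in inequality_constraints]
        # G_j(x) = max{0, |h_j(x)| - delta} for the equality constraints
        h_viol = [max(0.0, abs(h(x)) - DELTA) for h in equality_constraints]
        # G(x) = sum over all constraints
        return sum(g_viol) + sum(h_viol)

    # toy COP: g1(x) = x_1 + x_2 - 1 <= 0 and h1(x) = x_1 - x_2 = 0
    g1 = lambda x: x[0] + x[1] - 1.0
    h1 = lambda x: x[0] - x[1]
    print(constraint_violation([0.8, 0.5], [g1], [h1]))   # about 0.6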


Constrained Optimization Problems (2/2)

• An example

  [Figure: the search space and feasible region of an example COP, with the optimal solution marked]

• Remark: the purposes of solving COPs are
  1) to approach the feasible region promptly, and
  2) to find the optimal solution.


Constraint-handling Techniques (1/2)

• Methods based on penalty functions
  – handle the objective function f(x) and the constraints simultaneously by combining them into a single fitness (a small sketch follows below):

      fitness(x) = f(x) + Σ_{j=1}^{m} r_j · G_j(x),  where the r_j are penalty factors

• Methods based on preference of feasible solutions over infeasible solutions
  – handle the objective function f(x) and the constraint violation G(x) separately

• Methods based on multiobjective optimization concepts
  – treat the objective function f(x) and the constraint violation G(x) as two objectives
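As a minimal illustration of the penalty-function family, the sketch below adds weighted per-constraint violations to the objective value; the penalty factor and toy functions are assumptions for the example only.

    def penalty_fitness(f_value, violations, penalty_factors):
        # fitness(x) = f(x) + sum_j r_j * G_j(x)
        return f_value + sum(r * G for r, G in zip(penalty_factors, violations))

    # toy usage with a single inequality constraint g1(x) = x_1 + x_2 - 1 <= 0
    f  = lambda x: x[0] ** 2 + x[1] ** 2
    g1 = lambda x: x[0] + x[1] - 1.0
    x  = [0.8, 0.5]
    print(penalty_fitness(f(x), [max(0.0, g1(x))], penalty_factors=[100.0]))   # 0.89 + 30.0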


Constraint-handling Techniques (2/2)

– The main aim of constraint-handling techniques is to determine the criterion used to compare the individuals in the parent and offspring populations.

– The core of constraint-handling techniques is to make a tradeoff between the objective function f(x) and the constraint violation G(x).
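One widely used comparison criterion of this kind is the set of feasibility rules popularized by Deb; the sketch below is a generic illustration of such a criterion, not the specific rule used by any of the methods presented later.

    def better(f_a, G_a, f_b, G_b):
        # Feasibility rules: (1) a feasible solution beats an infeasible one,
        # (2) two feasible solutions are compared by f(x),
        # (3) two infeasible solutions are compared by G(x).
        if G_a == 0.0 and G_b == 0.0:
            return f_a < f_b
        if G_a == 0.0 or G_b == 0.0:
            return G_a == 0.0
        return G_a < G_b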


Our Main Work

• We have developed several methods

– CW and CMODE

– HCOEA and DyHF

– ATM and (μ+λ)-CDE


CW (1/4)

• Motivation
  – The current constraint-handling techniques usually employ a biased comparison criterion

• The main idea
  – Multiobjective optimization techniques can be used to solve the transformed biobjective optimization problem (minimizing f(x) and G(x) simultaneously)

Z. Cai and Y. Wang, "A multiobjective optimization-based evolutionary algorithm for constrained optimization," IEEE Transactions on Evolutionary Computation, vol. 10, no. 6, pp. 658-675, 2006.


CW (2/4)

• The difference between the transformed biobjective optimization problem and general multiobjective optimization problems
  – The biobjective problem would degenerate into a single-objective optimization problem within the feasible region (because in this case G(x) = 0)

  [Figure (a): graph representation in the G–f plane, showing the Pareto front (the solid segment at G = 0) and the global optimum]


CW (3/4)

  [Figure, panels (e) and (f): "." denotes a parent, "^" denotes an offspring, and "o" denotes a nondominated individual in the offspring population]

• Pareto dominance in the (f, G) space: x_a Pareto dominates x_b (denoted x_a ≺ x_b) if f(x_a) ≤ f(x_b) and G(x_a) ≤ G(x_b), and at least one of the two inequalities is strict
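A one-function sketch of this dominance check (names are illustrative):

    def pareto_dominates(f_a, G_a, f_b, G_b):
        # x_a dominates x_b in the (f, G) space: no worse in both objectives
        # and strictly better in at least one
        return f_a <= f_b and G_a <= G_b and (f_a < f_b or G_a < G_b)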

CW (4/4)

• Archiving and replacement
  – At each generation, the infeasible solution with the lowest degree of constraint violation is stored in an archive (Arc) and is used to replace an individual x in the population (pop)

• The advantage of archiving and replacement

  [Figure: the interaction between the archive Arc and the population pop]


CMODE (1/4)

• Motivation
  – CW includes some problem-dependent control parameters, such as the population size and the expanding factor in the simplex crossover

• The main ideas
  – Use differential evolution (DE) to generate new solutions
  – A novel infeasible solution replacement mechanism

Y. Wang and Z. Cai, "Combining multiobjective optimization with differential evolution to solve constrained optimization problems," IEEE Transactions on Evolutionary Computation, vol. 16, no. 1, pp. 117-134, 2012.

CMODE (2/4)

• The infeasible solution replacement mechanism
  – Two parts: the deterministic replacement and the random replacement, both of which move solutions from the archive Arc into the population pop
  – Aims: to enhance the quality, feasibility, and diversity of the population simultaneously


CMODE (3/4)

• The deterministic replacement
  – the strength value:

      s(x_i) = #{x_j | x_j ∈ P_t ∧ x_i ≺ x_j},   i = 1, ..., NP

  – the rank value:

      R_1(x_i) = Σ_{x_j ∈ P_t, x_j ≺ x_i} s(x_j),   i = 1, ..., NP

  – the rank value R_2(x_i) based on the degree of constraint violations
  – the final fitness function:

      F(x_i) = R_1(x_i) / max_{j=1,...,NP} R_1(x_j) + R_2(x_i) / max_{j=1,...,NP} R_2(x_j),   i = 1, ..., NP
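Below is a compact sketch of this fitness assignment; treating R_2 as the rank obtained by sorting the population on its degree of constraint violation is an assumption (the slide only names that quantity), and all identifiers are illustrative.

    def dominates(a, b):
        # a, b are (f(x), G(x)) pairs; Pareto dominance as defined for CW
        return a[0] <= b[0] and a[1] <= b[1] and (a[0] < b[0] or a[1] < b[1])

    def deterministic_fitness(pop):
        # pop: list of (f(x_i), G(x_i)) pairs
        NP = len(pop)
        # strength value s(x_i): number of individuals that x_i dominates
        s = [sum(dominates(pop[i], pop[j]) for j in range(NP)) for i in range(NP)]
        # rank value R1(x_i): sum of the strengths of the individuals dominating x_i
        R1 = [sum(s[j] for j in range(NP) if dominates(pop[j], pop[i])) for i in range(NP)]
        # rank value R2(x_i): assumed to be the rank after sorting by constraint violation
        R2 = [0] * NP
        for rank, i in enumerate(sorted(range(NP), key=lambda k: pop[k][1])):
            R2[i] = rank
        # final fitness F(x_i): normalized R1 plus normalized R2 (smaller is better)
        m1, m2 = max(R1) or 1, max(R2) or 1   # guard against division by zero
        return [R1[i] / m1 + R2[i] / m2 for i in range(NP)]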


CMODE (4/4)

• An example of the deterministic replacement

  [Figure: the individuals x_1, ..., x_10 plotted in the G–f plane, each annotated with its rank values "( )" = R_1(x_i) and "[ ]" = R_2(x_i); the five individuals selected according to equation (11) will be replaced]


HCOEA (1/3)

• Motivation
  – COEAs can be generalized as constraint-handling techniques plus EAs, i.e., a proper constraint-handling technique needs to be considered in conjunction with an appropriate search algorithm

• The main ideas
  – HCOEA adopts multiobjective optimization techniques to handle constraints
  – HCOEA combines the global and local search models

Y. Wang, Z. Cai, G. Guo, and Y. Zhou, "Multiobjective optimization and hybrid evolutionary algorithm to solve constrained optimization problems," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 37, no. 3, pp. 560-575, 2007.


HCOEA (2/3)

• The local search model

  [Figure: schematic diagram illustrating the local search model]

HCOEA (3/3)

• The implementation of the local search model

  [Figure: implementation of the local search model in the G–f plane, showing one subpopulation, the best infeasible individual, a parent and an offspring, Pareto dominance, and random replacement within the population]


DyHF (1/2)

• Motivation
  – At different stages of evolution, different probabilities for the local and global search may be required to achieve the best performance

• Main ideas
  – Make use of differential evolution (DE) to generate the offspring population during both the global and local search models
  – Dynamically implement the global and local search models

Y. Wang and Z. Cai, "A dynamic hybrid framework for constrained evolutionary optimization," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 42, no. 1, pp. 203-217, 2012.


DyHF (2/2)

• Dynamically implementing the global and local search models
  (rand is a uniformly distributed random number between 0 and 1, NP is the population size, and NF denotes the number of feasible solutions)

    If rand < (NP − NF)/NP
        Implement the local search model
    Else
        Implement the global search model
    End If
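A minimal sketch of this switching rule, with the two search models left as placeholder callables since their internals are not shown here:

    import random

    def dyhf_step(population, is_feasible, local_search, global_search):
        NP = len(population)                                   # population size
        NF = sum(1 for x in population if is_feasible(x))      # number of feasible solutions
        # the fewer feasible solutions in the population, the more likely
        # the local search model is executed in this step
        if random.random() < (NP - NF) / NP:
            return local_search(population)
        return global_search(population)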


ATM (1/5)

• Motivation
  – During the evolutionary process, the population may inevitably experience the following three situations:
    • The infeasible situation: the population contains only infeasible solutions
    • The semi-feasible situation: the population consists of a combination of feasible and infeasible solutions
    • The feasible situation: the population is entirely composed of feasible solutions

• Main idea
  – ATM (Adaptive Tradeoff Model) designs one tradeoff strategy for each situation (divide and conquer)

Y. Wang, Z. Cai, Y. Zhou, and W. Zeng, "An adaptive tradeoff model for constrained evolutionary optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 1, pp. 80-93, 2008.


ATM (2/5)

• The tradeoff strategy for the infeasible situation

  [Two figures (left and right) illustrating the tradeoff strategy for the infeasible situation]


ATM (3/5)

• The tradeoff strategy for the semi-feasible situation
  – The population Z is divided into the feasible group Z1 and the infeasible group Z2, according to the feasibility of each individual
  – The best and worst feasible solutions are then identified
  – The objective function value of each individual is converted, making use of φ, the feasibility proportion of the last population


ATM (4/5)

• The tradeoff strategy for the semi-feasible situation
  – Each converted objective function value is then normalized
  – Similarly, the constraint violations are normalized
  – A final fitness function is obtained by adding the normalized objective function value and the normalized constraint violation together
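The normalization formulas themselves did not survive on this slide, so the sketch below only mirrors the description above with a generic min-max normalization; the exact expressions used by ATM may differ, and every name here is an assumption.

    def semi_feasible_fitness(f_converted, G):
        # f_converted: converted objective values; G: degrees of constraint violation
        def min_max(values):
            lo, hi = min(values), max(values)
            return [0.0 if hi == lo else (v - lo) / (hi - lo) for v in values]
        f_nor, G_nor = min_max(f_converted), min_max(G)
        # final fitness: normalized objective plus normalized constraint violation
        return [fn + gn for fn, gn in zip(f_nor, G_nor)]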


ATM (5/5)

• The tradeoff strategy for the feasible situation
  – In this case, the comparison of individuals is based only on their objective function values, since the evolution in this phase is equivalent to that of unconstrained optimization

(μ+λ)-CDE (1/5)

• Motivation
  – ATM uses a simple (μ,λ)-ES as the search engine

• Main idea
  – (μ+λ)-DE is adopted as the search engine
  – An improved ATM serves as the constraint-handling technique

Y. Wang and Z. Cai, "Constrained evolutionary optimization by means of (μ+λ)-differential evolution and improved adaptive trade-off model," Evolutionary Computation, vol. 19, no. 2, pp. 249-285, 2011.

(μ+λ)-CDE (2/5)

• (μ+λ)-DE
  – μ parents generate λ offspring, and the μ individuals with higher quality among them survive into the next population
  – In order to enhance the search ability, current-to-best/1 has been improved in (μ+λ)-DE
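For reference, the classic DE/current-to-best/1 mutation that this improvement starts from is sketched below; the scale factor F and the random-index handling are standard DE conventions, and the improved variant itself is not reproduced here.

    import random

    def current_to_best_1(pop, i, best, F=0.5):
        # v_i = x_i + F * (x_best - x_i) + F * (x_r1 - x_r2), with r1, r2 distinct from i
        r1, r2 = random.sample([k for k in range(len(pop)) if k != i], 2)
        return [xi + F * (xb - xi) + F * (a - b)
                for xi, xb, a, b in zip(pop[i], pop[best], pop[r1], pop[r2])]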

(μ+λ)-CDE (3/5)

• Two criteria are employed to compute the degree of constraint violation of an individual
  – When the constraint violations have largely different scales, each constraint is normalized by its maximum violation in the population:

      G_j^max = max_{i=1,...,(μ+λ)} G_j(x_i),   j ∈ {1, ..., m}

      G_nor(x_i) = (1/m) Σ_{j=1}^{m} G_j(x_i)/G_j^max,   i ∈ {1, ..., (μ+λ)}

  – When the differences among the constraints are not significant, the violations are simply summed:

      G(x_i) = Σ_{j=1}^{m} G_j(x_i),   i ∈ {1, ..., (μ+λ)}
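A small sketch of the two criteria, operating on a matrix whose rows hold the per-constraint violations G_j(x_i) of each individual; the names and toy data are illustrative.

    def violation_sum(G_rows):
        # second criterion: G(x_i) = sum_j G_j(x_i)
        return [sum(row) for row in G_rows]

    def violation_normalized(G_rows):
        # first criterion: divide each G_j(x_i) by the maximum violation of
        # constraint j in the population, then average over the m constraints
        m = len(G_rows[0])
        G_max = [max(row[j] for row in G_rows) or 1.0 for j in range(m)]   # guard for all-zero columns
        return [sum(row[j] / G_max[j] for j in range(m)) / m for row in G_rows]

    G_rows = [[0.0, 2.0], [1.0, 0.0], [0.5, 4.0]]
    print(violation_sum(G_rows), violation_normalized(G_rows))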

(μ+λ)-CDE (4/5)

• Improved ATM: the infeasible situation

  [Figure: in the infeasible situation, the parent population P_t and the offspring population Q_t are combined into a temporary population P_temp, from which the next population P_t+1 is selected]

(μ+λ)-CDE (5/5)

• Improved ATM: the semi-feasible situation
  – The same as ATM, except that the final fitness function is obtained by the following equation:

      f_final(x_i) = f_nor(x_i) + G(x_i),      if the first criterion is used
      f_final(x_i) = f_nor(x_i) + G_nor(x_i),  if the second criterion is used

• Improved ATM: the feasible situation
  – The same as ATM


Dynamic Constrained Optimization Problems (1/4)

• 56 practical dynamic optimization problems were chosen by Dr. T. T. Nguyen from the papers published from 2006 to 2008
  – 29 of them are dynamic combinatorial applications, and 73% of these applications are dynamic constrained optimization problems

T. T. Nguyen, Continuous dynamic optimisation using evolutionary algorithms, Ph.D. dissertation, School of Computer Science, University of Birmingham, 2011.

Dynamic Constrained Optimization Problems (2/4)

• Among the same 56 practical dynamic optimization problems
  – 27 of them are dynamic continuous applications, and 74% of these applications are dynamic constrained optimization problems

Dynamic Constrained Optimization Problems (3/4)

• Dynamic constrained optimization problems (DCOPs) can be formulated as follows (a toy instance is sketched below):

    min_{x ∈ S}  f(x, t)                              (objective function)
    subject to   g_i(x, t) ≤ 0,  i = 1, ..., l        (inequality constraints)
                 h_j(x, t) = 0,  j = 1, ..., m        (equality constraints)

• The characteristics of DCOPs
  – Dynamic unconstrained optimization
  – Static constrained optimization
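To make the formulation concrete, here is a toy DCOP in which both the objective function and one inequality constraint move with the time parameter t; the specific functions below are invented purely for illustration and do not come from the talk.

    import math

    def f(x, t):
        # time-dependent objective: the unconstrained optimum moves on a circle
        return (x[0] - math.sin(t)) ** 2 + (x[1] - math.cos(t)) ** 2

    def g1(x, t):
        # time-dependent inequality constraint: x_1 + x_2 <= 1 + 0.5*sin(t)
        return x[0] + x[1] - (1.0 + 0.5 * math.sin(t))

    def violation(x, t):
        # degree of constraint violation at time t (single inequality constraint)
        return max(0.0, g1(x, t))

    print(f([0.5, 0.5], 0.0), violation([0.9, 0.9], 0.0))   # 0.5  0.8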

Dynamic Constrained Optimization Problems (4/4)

• DCOPs can be divided into three categories
  – Objective function is dynamic and constraints are static
  – Objective function is static and constraints are dynamic
  – Both objective function and constraints are dynamic

Evolutionary Algorithms for DCOPs (1/4)

• Evolutionary algorithms (EAs) for dynamic unconstrained optimization in the last twenty years
  – Introducing/maintaining diversity
  – Memory
  – Multipopulation
  – …

• Evolutionary algorithms (EAs) for static constrained optimization in the last twenty years
  – Penalty functions
  – Preferring feasible solutions to infeasible solutions
  – Multiobjectivization
  – …

Evolutionary Algorithms for DCOPs (2/4)

• However, very few attempts have been made to investigate EAs for DCOPs (nearly 10 papers in journals and conferences)

• Therefore, solving DCOPs by EAs is still in its infancy

• It may rapidly become one of the hot topics in the evolutionary computation community

T. T. Nguyen and X. Yao, "Continuous dynamic constrained optimization—The challenges," IEEE Transactions on Evolutionary Computation, vol. 16, no. 6, pp. 769-786, 2012.

Evolutionary Algorithms for DCOPs (3/4)

• The hypothesis: the change of the environment is not totally random

• The main aims of solving DCOPs by EAs
  – To find the feasible optimal solution in the current environment as soon as possible
  – To track the feasible optimal solution in the next environment

Evolutionary Algorithms for DCOPs (4/4)

• The open issues when solving DCOPs by EAs
  – Test functions: the current test functions are too simple and not scalable
  – Change detection mechanisms: only the best individual is used for change detection
  – Approaches: only about three simple approaches have been proposed
  – Performance indicators: online/offline error


Conclusion

• We have proposed several methods for solving constrained optimization problems

• We have proposed a set of dynamic constrained optimization test functions
