Fitness Inheritance in BOA
Martin Pelikan, Dept. of Math and CS, Univ. of Missouri at St. Louis
Kumara Sastry, Illinois GA Lab, Univ. of Illinois at Urbana-Champaign

Fitness inheritance in the Bayesian optimization algorithm


This paper describes how fitness inheritance can be used to estimate fitness for a proportion of newly sampled candidate solutions in the Bayesian optimization algorithm (BOA). The goal of estimating fitness for some candidate solutions is to reduce the number of fitness evaluations for problems where fitness evaluation is expensive. Bayesian networks used in BOA to model promising solutions and generate the new ones are extended to allow not only for modeling and sampling candidate solutions, but also for estimating their fitness. The results indicate that fitness inheritance is a promising concept in BOA, because population-sizing requirements for building appropriate models of promising solutions lead to good fitness estimates even if only a small proportion of candidate solutions is evaluated using the actual fitness function. This can lead to a reduction of the number of actual fitness evaluations by a factor of 30 or more.


Page 1: Fitness inheritance in the Bayesian optimization algorithm

Fitness Inheritance in BOA

Martin Pelikan
Dept. of Math and CS
Univ. of Missouri at St. Louis

Kumara Sastry
Illinois GA Lab
Univ. of Illinois at Urbana-Champaign

Page 2: Fitness inheritance in the Bayesian optimization algorithm

Motivation

Bayesian optimization algorithm (BOA)
- Scales up on decomposable problems
- O(n) to O(n^2) evaluations until convergence

Expensive evaluations
- Real-world evaluations can be complex: FEA, simulation, ...
- O(n^2) is often not enough

This paper
- Extend the probabilistic model to include fitness info
- Use the model to evaluate part of the population

Page 3: Fitness inheritance in the Bayesian optimization algorithm

Outline

- BOA basics
- Fitness inheritance in BOA
  - Extend Bayesian networks with fitness
  - Use the extended model for evaluation
- Experiments
- Future work
- Summary and conclusions

Page 4: Fitness inheritance in the Bayesian optimization algorithm

Bayesian Optimization Algorithm (BOA)

Pelikan, Goldberg, and Cantu-Paz (1998)
Similar to genetic algorithms (GAs), but replaces mutation + crossover by:
- Build a Bayesian network to model selected solutions.
- Sample the Bayesian network to generate new candidate solutions.

Page 5: Fitness inheritance in the Bayesian optimization algorithm

BOA

[Diagram: current population -> selection -> Bayesian network -> new population, incorporated using restricted tournament replacement]

Page 6: Fitness inheritance in the Bayesian optimization algorithm

Bayesian Networks (BNs)

2 components:

Structure
- Directed acyclic graph
- Nodes = variables (string positions)
- Edges = dependencies between variables

Parameters
- Conditional probabilities p(X|PX), where
  - X is a variable
  - PX are the parents of X (the variables that X depends on)

Page 7: Fitness inheritance in the Bayesian optimization algorithm

BN example

Structure: B -> A -> C

B | p(B)
0 | 0.25
1 | 0.75

A B | p(A|B)
0 0 | 0.10
0 1 | 0.60
1 0 | 0.90
1 1 | 0.40

C A | p(C|A)
0 0 | 0.80
0 1 | 0.55
1 0 | 0.20
1 1 | 0.45
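The structure and conditional probability tables of this example determine every joint probability. A minimal sketch in Python (the function name is illustrative; the values are transcribed from the slide):

```python
# Example network B -> A -> C with the CPTs from this slide.
def joint_prob(a, b, c):
    """Joint probability p(A=a, B=b, C=c), factorized along the network
    as p(B) * p(A|B) * p(C|A)."""
    p_b = {0: 0.25, 1: 0.75}
    p_a_given_b = {(0, 0): 0.10, (0, 1): 0.60, (1, 0): 0.90, (1, 1): 0.40}
    p_c_given_a = {(0, 0): 0.80, (0, 1): 0.55, (1, 0): 0.20, (1, 1): 0.45}
    return p_b[b] * p_a_given_b[(a, b)] * p_c_given_a[(c, a)]
```

For instance, p(A=1, B=1, C=0) = 0.75 * 0.40 * 0.55 = 0.165, and the eight joint probabilities sum to 1.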

Page 8: Fitness inheritance in the Bayesian optimization algorithm

Extending BNs with fitness info

Basic idea
- Don't work only with conditional probabilities
- Also add fitness info for fitness estimation

A B | p(A|B) | f(A|B)
0 0 | 0.10 | -0.5
0 1 | 0.60 | +0.5
1 0 | 0.90 | +0.3
1 1 | 0.40 | -0.3

The fitness info attached to p(X|PX) is denoted by f(X|PX): the contribution of X restricted by PX,

f(X = x | PX = px) = f(X = x, PX = px) - f(PX = px)

where f(X = x, PX = px) is the average fitness of solutions with X = x and PX = px, and f(PX = px) is the average fitness of solutions with PX = px.
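A minimal sketch of how such a contribution could be computed from evaluated solutions (the function name and the sample format are assumptions for illustration, not from the paper; a single parent is used for simplicity):

```python
# Contribution f(X=x | Px=px): average fitness of evaluated solutions
# matching (X=x, Px=px) minus average fitness of those matching Px=px.
def contribution(samples, x_idx, x_val, parent_idx, parent_val):
    """samples: list of (bitstring, fitness) pairs with known fitness."""
    with_parent = [f for s, f in samples if s[parent_idx] == parent_val]
    with_both = [f for s, f in samples
                 if s[parent_idx] == parent_val and s[x_idx] == x_val]
    avg = lambda fs: sum(fs) / len(fs)
    return avg(with_both) - avg(with_parent)
```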

Page 9: Fitness inheritance in the Bayesian optimization algorithm

Estimating fitness

Equation:

f(X_1, X_2, ..., X_n) = f_avg + Σ_{i=1}^{n} f(X_i | PX_i)

In words:
- Fitness = avg. fitness + avg. contribution of each bit
- Avg. contributions taken w.r.t. context from the BN
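The estimation equation can be sketched directly; here the table layout `contrib[i][(xi, pxi)]`, the `parents` map, and the function name are illustrative assumptions:

```python
# Inherited fitness = average fitness of the evaluated sample plus the
# stored contribution f(Xi | PXi) for each bit, looked up in the model.
def estimate_fitness(solution, f_avg, contrib, parents):
    """contrib[i][(xi, pxi)] holds f(Xi=xi | PXi=pxi);
    parents[i] gives the indices of Xi's parents in the network."""
    total = f_avg
    for i, xi in enumerate(solution):
        pxi = tuple(solution[j] for j in parents[i])
        total += contrib[i][(xi, pxi)]
    return total
```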

Page 10: Fitness inheritance in the Bayesian optimization algorithm

BNs with decision trees

Local structures in BNs
- A more efficient representation for p(X|PX)

Example for p(A | B, C):

[Decision tree: the root splits on B; the branch B=0 leads to a single leaf p(A | B=0); the branch B=1 splits on C, leading to leaves p(A | B=1, C=0) and p(A | B=1, C=1)]
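A sketch of how such a tree collapses contexts: when B=0, the value of C is irrelevant, so one leaf covers both C values. The leaf probabilities below are made-up placeholders, not values from the paper:

```python
# Decision-tree representation of p(A=1 | B, C): three leaves instead of
# four table rows, because the branch B=0 does not split on C.
def p_a1_given_bc(b, c):
    if b == 0:
        return 0.3                     # leaf p(A=1 | B=0), shared by both C values
    return 0.7 if c == 0 else 0.1      # leaves p(A=1|B=1,C=0) and p(A=1|B=1,C=1)
```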

Page 11: Fitness inheritance in the Bayesian optimization algorithm

BNs with decision trees + fitness

Same idea
- Attach fitness info to each probability

[Decision tree: the root splits on B; the branch B=0 leads to a leaf holding p(A | B=0) and f(A | B=0); the branch B=1 splits on C, leading to leaves holding p(A | B=1, C=0) with f(A | B=1, C=0) and p(A | B=1, C=1) with f(A | B=1, C=1)]

Page 12: Fitness inheritance in the Bayesian optimization algorithm

Estimating fitness again

Same as before, because both BNs represent the same equation:

f(X_1, X_2, ..., X_n) = f_avg + Σ_{i=1}^{n} f(X_i | PX_i)

In words:
- Fitness = avg. fitness + avg. contribution of each bit
- Avg. contributions taken w.r.t. context from the BN

Page 13: Fitness inheritance in the Bayesian optimization algorithm

Where to learn fitness from?

- Evaluate the entire initial population
- Choose an inheritance proportion p_i

After that
- Evaluate a (1 - p_i) proportion of the offspring
- Use evaluated parents + evaluated offspring to learn
- Estimate the fitness of the remaining proportion p_i

Sample for learning: N(1 - p_i) to N + N(1 - p_i); often 2N(1 - p_i)
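The evaluation schedule above might be sketched as follows. The estimator here is a stand-in (the mean of the evaluated fitnesses); in BOA the extended Bayesian network supplies the estimates, and all names are illustrative:

```python
# Split one generation of offspring: a (1 - p_i) proportion gets real
# fitness evaluations, the remaining p_i proportion gets estimates.
def evaluate_generation(offspring, p_inherit, real_fitness):
    evaluated, estimated = [], []
    n_real = round(len(offspring) * (1.0 - p_inherit))
    for idx, sol in enumerate(offspring):
        if idx < n_real:
            evaluated.append((sol, real_fitness(sol)))   # real evaluation
        else:
            estimated.append(sol)                        # to be estimated
    f_avg = sum(f for _, f in evaluated) / len(evaluated)
    estimated = [(sol, f_avg) for sol in estimated]      # placeholder estimate
    return evaluated, estimated
```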

Page 14: Fitness inheritance in the Bayesian optimization algorithm

Simple example: Onemax

Onemax:

f(X_1, X_2, ..., X_n) = Σ_{i=1}^{n} X_i

What happens?
- Average fitness grows (as predicted by theory)
- No context is necessary
- Fitness contributions stay constant:

f(X_i = 1) = +0.5, f(X_i = 0) = -0.5
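The constant contribution claimed above can be verified by brute force over all n-bit strings: the average onemax fitness of strings with X_i = 1 exceeds the overall average by exactly 0.5, for any i and n. A small sketch (function name illustrative):

```python
from itertools import product

# Contribution of Xi=1 for onemax over the full space of n-bit strings:
# average fitness with Xi=1 minus the overall average fitness.
def onemax_contribution(n, i):
    strings = list(product((0, 1), repeat=n))
    f = lambda s: sum(s)                       # onemax counts the ones
    overall = sum(f(s) for s in strings) / len(strings)
    with_one = [f(s) for s in strings if s[i] == 1]
    return sum(with_one) / len(with_one) - overall
```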

Page 15: Fitness inheritance in the Bayesian optimization algorithm

Experiments

Problems
- 50-bit onemax
- 10 traps of order 4
- 10 traps of order 5

Settings
- Inheritance proportion from 0 to 0.999
- Minimum population size for reliable convergence
- 300 runs for each setting

Output
- Speed-up (in terms of real fitness evaluations)

Page 16: Fitness inheritance in the Bayesian optimization algorithm

Onemax

[Figure: speed-up w.r.t. no inheritance (0-35) vs. proportion inherited (0-1)]

Page 17: Fitness inheritance in the Bayesian optimization algorithm

Trap-4

[Figure: speed-up w.r.t. no inheritance (0-35) vs. proportion inherited (0-1)]

Page 18: Fitness inheritance in the Bayesian optimization algorithm

Trap-5

[Figure: speed-up w.r.t. no inheritance (0-60) vs. proportion inherited (0-1)]

Page 19: Fitness inheritance in the Bayesian optimization algorithm

Discussion

Inheritance proportion
- High proportions of inheritance work great.

Speed-up
- Optimal speed-up of 30-53
- High speed-up for almost any setting
- The tougher the problem, the better the speed-up

Why so good?
- Learning the probabilistic model is difficult, so accurate fitness info can be added at little extra cost.

Page 20: Fitness inheritance in the Bayesian optimization algorithm

Conclusions

- Fitness inheritance works great in BOA
- Theory now exists (Sastry et al., 2004) that explains these results
- High proportions of inheritance lead to high speed-ups
- Challenging problems allow much speed-up
- Useful for practitioners with computationally complex fitness functions

Page 21: Fitness inheritance in the Bayesian optimization algorithm

Contact

Martin Pelikan
Dept. of Math and Computer Science, 320 CCB
University of Missouri at St. Louis
8001 Natural Bridge Rd.
St. Louis, MO 63121

E-mail: [email protected]
WWW: http://www.cs.umsl.edu/~pelikan/