
Self-Adaptive Semi-Autonomous Parent Selection (SASAPAS)

• Each individual has an evolving mate selection function

• Two ways to pair individuals:
– Democratic approach
– Dictatorial approach

Democratic Approach


Dictatorial Approach

Self-Adaptive Semi-Autonomous Dictatorial Parent Selection (SASADIPS)

• Each individual has an evolving mate selection function
• First parent selected in a traditional manner
• Second parent selected by first parent – the dictator – using its mate selection function

Mate selection function representation

• Expression tree as in GP
• Set of primitives – pre-built selection methods
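A minimal Python sketch of what such a representation could look like: leaves of the expression tree call pre-built selection primitives and internal nodes combine their children's picks. The primitive names (random_mate, best_of_tournament, worst_of_tournament) and the combiner nodes are illustrative assumptions, not the exact primitive set used here.

import random

# Hypothetical pre-built selection primitives (leaf nodes of the tree).
# Each takes the current population and returns one candidate mate.
def random_mate(population):
    return random.choice(population)

def best_of_tournament(population, t=2):
    return max(random.sample(population, t), key=lambda ind: ind.fitness)

def worst_of_tournament(population, t=2):
    return min(random.sample(population, t), key=lambda ind: ind.fitness)

class MateSelectionNode:
    """GP-style expression tree: leaves call a pre-built primitive,
    internal nodes keep the fitter (or less fit) of their children's picks."""
    def __init__(self, primitive=None, children=(), combiner=max):
        self.primitive = primitive      # set on leaf nodes
        self.children = list(children)  # set on internal nodes
        self.combiner = combiner        # e.g. max or min over child results

    def select(self, population):
        if self.primitive is not None:
            return self.primitive(population)
        picks = [child.select(population) for child in self.children]
        return self.combiner(picks, key=lambda ind: ind.fitness)

# Example: a tree that returns the fitter of a random pick and a tournament winner.
example_mate_fn = MateSelectionNode(children=[
    MateSelectionNode(primitive=random_mate),
    MateSelectionNode(primitive=best_of_tournament),
])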

Mate selection function evolution

• Let F be a fitness function defined on a candidate solution. Let improvement(x) = F(x) – max{F(p1), F(p2)}
• Max fitness plot; slope at generation i is s(gi)

Mate selection function evolution

• IF improvement(offspring) > s(gi-1)
– Copy first parent’s mate selection function (single-parent inheritance)
• Otherwise
– Recombine the two parents’ mate selection functions using standard GP crossover (multi-parent inheritance)
– Apply a mutation chance to the offspring’s mate selection function
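A sketch of this inheritance rule in Python, assuming gp_crossover and gp_mutate stand in for standard GP variation operators defined elsewhere, that the mate selection tree supports copy(), and that mutation_chance (here 0.1) is an assumed, unspecified parameter.

import random

def improvement(offspring, p1, p2, F):
    """improvement(x) = F(x) - max{F(p1), F(p2)}, as defined above."""
    return F(offspring) - max(F(p1), F(p2))

def inherit_mate_selection(offspring, p1, p2, slope_prev, F,
                           gp_crossover, gp_mutate, mutation_chance=0.1):
    """slope_prev is s(g_{i-1}), the slope of the max-fitness plot at the
    previous generation."""
    if improvement(offspring, p1, p2, F) > slope_prev:
        # Single-parent inheritance: copy the first parent's function.
        offspring.mate_fn = p1.mate_fn.copy()
    else:
        # Multi-parent inheritance: recombine both parents' functions ...
        offspring.mate_fn = gp_crossover(p1.mate_fn, p2.mate_fn)
        # ... and apply a mutation chance to the offspring's function.
        if random.random() < mutation_chance:
            offspring.mate_fn = gp_mutate(offspring.mate_fn)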

Experiments

• Counting ones
• 4-bit deceptive trap
– If 4 ones => fitness = 8
– If 3 ones => fitness = 0
– If 2 ones => fitness = 1
– If 1 one => fitness = 2
– If 0 ones => fitness = 3
• SAT
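The two non-SAT test functions are simple to state in code. The sketch below assumes the deceptive trap is applied block-wise over consecutive 4-bit blocks of the genome, which the slide does not spell out.

def trap_block_fitness(block):
    """4-bit deceptive trap values taken from the slide above."""
    ones = sum(block)
    return {4: 8, 3: 0, 2: 1, 1: 2, 0: 3}[ones]

def deceptive_trap_fitness(genome):
    """Assumed decomposition: sum the trap value over consecutive 4-bit blocks."""
    return sum(trap_block_fitness(genome[i:i + 4])
               for i in range(0, len(genome), 4))

def counting_ones_fitness(genome):
    """Counting ones (OneMax): fitness is the number of 1 bits."""
    return sum(genome)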

Counting ones results

Highly evolved mate selection function

SAT results

4-bit deceptive trap results

SASADIPS shortcomings

• Steep fitness increase in the early generations may lead to premature convergence to suboptimal solutions
• Good mate selection functions hard to find
• Provided mate selection primitives may be insufficient to build a good mate selection function
• New parameters were introduced
• Only semi-autonomous

Greedy Population Sizing (GPS)

|P1| = 2|P0|, …, |Pi+1| = 2|Pi|

The parameter-less GA

[Diagram: populations P0, P1, P2, … plotted against fitness evals]

• Evolve an unbounded number of populations in parallel
• Smaller populations are given more fitness evaluations
• Terminate smaller pop. whose avg. fitness is exceeded by a larger pop.

Greedy Population Sizing

[Diagram: populations P0–P5 plotted against fitness evals F1–F4]

• Evolve exactly two populations in parallel
• Equal number of fitness evals. per population
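A compact sketch of the GPS-EA control loop described above. The helpers make_population, run_evals (evolve a population for a fixed number of fitness evaluations), and avg_fitness are assumed placeholders, and the overall evaluation budget is an added assumption.

def gps_ea(initial_size, total_budget, evals_per_round,
           make_population, run_evals, avg_fitness):
    smaller = make_population(initial_size)        # P_i
    larger = make_population(2 * initial_size)     # |P_{i+1}| = 2|P_i|
    used = 0
    while used < total_budget:
        # Exactly two populations evolve in parallel, each receiving an
        # equal number of fitness evaluations per round.
        run_evals(smaller, evals_per_round)
        run_evals(larger, evals_per_round)
        used += 2 * evals_per_round
        # When the larger population's average fitness catches up with the
        # smaller one's, the smaller population expires and a new, doubled
        # population is created.
        if avg_fitness(larger) >= avg_fitness(smaller):
            smaller = larger
            larger = make_population(2 * len(larger))
    return smaller, larger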

GPS-EA vs. parameter-less GA

[Diagram: total fitness evaluations spent before reaching a population of size N – parameter-less GA: 2F1 + 2F2 + … + 2Fk + 3N; GPS-EA: F1 + F2 + … + Fk + 2N]

GPS-EA vs. the parameter-less GA, OPS-EA and TGA

[Plots: MBF (% of maximum fitness) and best solution found (% of maximum fitness) vs. problem size (100, 500, 1000) for OPS-EA, GPS-EA, TGA, and the parameter-less GA]

• GPS-EA < parameter-less GA
• TGA < GPS-EA < OPS-EA

GPS-EA finds overall better solutions than the parameter-less GA

Deceptive Problem

Limiting Cases

[Plots: Avg. Pop. Fitness vs. Fitness Evals for populations P3 and P4; % of runs falling into limiting vs. non-limiting cases for problem sizes 100, 500, 1000]

• Favg(Pi+1) < Favg(Pi)
• No larger populations are created
• No fitness improvements until termination
• Approx. 30% of runs are limiting cases
• Large std. dev., but lower MBF
• Automatic detection of the limiting cases is needed

GPS-EA Summary

• Advantages
– Automated population size control
– Finds high quality solutions
• Problems
– Limiting cases
– Restart of evolution each time a new population is created

Estimated Learning Offspring Optimizing Mate Selection (ELOOMS)

Traditional Mate Selection

[Diagram: individuals with fitness values 25, 3, 8, 2, 4, 5; tournament winners paired as mates, e.g. (5, 8) and (5, 4)]

• t – tournament selection
• t is user-specified
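For contrast with ELOOMS, a minimal sketch of the traditional scheme: both mates are winners of independent t-tournaments, with t chosen by the user.

import random

def tournament_select(population, t):
    """Standard t-tournament selection: sample t individuals uniformly
    at random and return the fittest."""
    return max(random.sample(population, t), key=lambda ind: ind.fitness)

def traditional_pairing(population, t):
    """Pair two independently chosen tournament winners as mates."""
    return tournament_select(population, t), tournament_select(population, t)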

ELOOMS

[Diagram: candidate pairs answer YES/NO to each other; only mutually accepting individuals become mates]

Mate Acceptance Chance (MAC)

• Individual j asks: how much do I like candidate mate k?
• k’s bit string: b1 b2 b3 … bL
• j’s desired features: d1 d2 d3 … dL
• MAC(j, k) = Σi=1..L [ bi·di + (1 – bi)·(1 – di) ] / L

Desired Features

• Individual j keeps desired features d1 d2 d3 … dL for the mate bits b1 b2 b3 … bL
• di = (# times past mates’ bi = 1 was used to produce fit offspring) / (# times past mates’ bi was used to produce offspring)
• Build a model of the desired potential mate
• Update the model for each encountered mate
• Similar to Estimation of Distribution Algorithms
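A sketch of the ELOOMS model in Python, using the MAC reconstruction from the previous slide. The fallback value of 0.5 for never-seen features and the offspring_was_fit predicate are assumptions; the slides do not define precisely what counts as fit offspring.

def mate_acceptance_chance(desired, mate_bits):
    """MAC(j, k) = sum_i [ b_i*d_i + (1 - b_i)*(1 - d_i) ] / L."""
    L = len(mate_bits)
    return sum(b * d + (1 - b) * (1 - d)
               for b, d in zip(mate_bits, desired)) / L

class DesiredMateModel:
    """Per-individual model d_1..d_L of the desired potential mate:
    d_i = (# times past mates' b_i = 1 produced fit offspring) /
          (# times past mates' b_i was used to produce offspring)."""
    def __init__(self, L):
        self.fit_counts = [0] * L    # numerators
        self.used_counts = [0] * L   # denominators

    def update(self, mate_bits, offspring_was_fit):
        # Called once for each encountered mate.
        for i, b in enumerate(mate_bits):
            self.used_counts[i] += 1
            if b == 1 and offspring_was_fit:
                self.fit_counts[i] += 1

    def desired_features(self):
        # Features never seen default to 0.5 (no preference) -- an assumption.
        return [f / u if u else 0.5
                for f, u in zip(self.fit_counts, self.used_counts)]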

ELOOMS vs. TGA

[Plots: Easy Problem – L=500 with mutation, L=1000 with mutation]

ELOOMS vs. TGA

[Plots: Deceptive Problem, L=100 – without mutation and with mutation]

Why ELOOMS works on Deceptive Problem

• More likely to preserve optimal structure
• 1111 0000 will equally like:
– 1111 1000
– 1111 1100
– 1111 1110
• But will dislike individuals not of the form:
– 1111 xxxx

Why ELOOMS does not work as well on Easy Problem

• High fitness – short distance to optimal
• Mating with high fitness individuals – closer to optimal offspring
• Fitness – good measure of good mate
• ELOOMS – approximate measure of good mate

ELOOMS computational overhead

• L – solution length
• μ – population size
• T – avg. # mates evaluated per individual
• Update stage:
– 6L additions
• Mate selection stage:
– 2L·T·μ additions

ELOOMS Summary

• Advantages
– Autonomous mate pairing
– Improved performance (some cases)
– Natural termination condition
• Disadvantages
– Relies on competition selection pressure
– Computational overhead can be significant

GPS-EA + ELOOMS Hybrid

Expiration of population Pi

• If Favg(Pi+1) < Favg(Pi)
– Limiting cases possible
• If no mate pairs in Pi (ELOOMS)
– Detection of the limiting cases
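A sketch of the combined expiration test, under the assumption that “no mate pairs in Pi” means no two individuals mutually accept each other at a given acceptance level; avg_fitness and mac are supplied elsewhere.

def population_expired(pop_i, pop_next, avg_fitness, mac, accept=0.5):
    # GPS-EA rule: P_i expires once the larger population P_{i+1}
    # matches or exceeds its average fitness.
    if avg_fitness(pop_next) >= avg_fitness(pop_i):
        return True
    # ELOOMS rule: if no two individuals in P_i will pair up, P_i has
    # stagnated -- this is what detects the limiting cases.
    for j in pop_i:
        for k in pop_i:
            if j is not k and mac(j, k) >= accept and mac(k, j) >= accept:
                return False
    return True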

[Plots: % of runs falling into limiting vs. non-limiting cases for problem sizes 100, 500, 1000; Avg. Pop. Fitness vs. Fitness Evals for populations P3 and P4]

Comparing the Algorithms

[Plots: Deceptive Problem, L=100 – without mutation and with mutation]

GPS-EA + ELOOMS vs. parameter-less GA and TGA

[Plots: Deceptive Problem, L=100 – without mutation and with mutation]

GPS-EA + ELOOMS vs. parameter-less GA and TGA

[Plots: Easy Problem, L=500 – without mutation and with mutation]

GPS-EA + ELOOMS Summary

• Advantages
– No population size tuning
– No parent selection pressure tuning
– No limiting cases
– Superior performance on deceptive problem
• Disadvantages
– Reduced performance on easy problem
– Relies on competition selection pressure

NC-LAB’s current AutoEA research

• Make λ a dynamic derived variable by self-adapting each individual’s desired offspring size
• Promote “birth control” by penalizing fitness based on “child support” and use fitness-based survival selection
• Make μ a dynamic derived variable by giving each individual its own survival chance
• Make individuals mortal by having them age and making an individual’s survival chance dependent on its age as well as its fitness