Iterative Improvement Algorithm 2012/03/20




Page 1

Iterative Improvement Algorithm

2012/03/20

Page 2

Outline

• Local Search Algorithms
• Hill-Climbing Search
• Simulated Annealing Search
• Local Beam Search
• Genetic Algorithms

Page 3

Iterative Improvement Algorithm

• General idea of local search
  – start with a complete configuration
  – make modifications to improve its quality

• Domains: e.g., 8-queens, VLSI layout, etc.
  – the state description itself is the solution
  – the path to a solution is irrelevant

• Approaches
  – Hill-Climbing (f measures quality; maximize f)
  – Gradient Descent (f measures cost; minimize f)
  – Simulated Annealing

Page 4

Local Search Algorithm and Optimization Problems

• Uninformed search
  – Looking for a solution, where a solution is a path from start to goal
  – At each intermediate point along a path, we have no prediction of the value of the path

• Informed search
  – Again, looking for a path from start to goal
  – This time, we have insight regarding the value of intermediate solutions

Page 5

Local Search Algorithm and Optimization Problems (cont.)

• What if the path is not important, just the goal?
  – The goal state itself is unknown in advance
  – The path to the goal need not be solved
  – State space = set of complete configurations
  – Find configurations satisfying constraints

• Examples
  – What quantities of quarters, nickels, and dimes add up to $17.45 and minimize the total number of coins?
  – 8-Queens problem

Page 6

Local Search Algorithm

• Local search does not keep track of previous solutions
  – It operates using a single current state (rather than multiple paths) and generally moves only to neighbors of that state

• Advantages
  – Uses a small amount of memory (usually a constant amount)
  – Can often find reasonable (note: we are not saying optimal) solutions in large or infinite search spaces

Page 7

Optimization Problems

• To find the best state according to an Objective Function

• Example
  – f(q, d, n) = 1,000,000  if 0.25q + 0.10d + 0.05n ≠ 17.45
     f(q, d, n) = q + d + n  otherwise
  – Goal: minimize f
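
A minimal sketch of this objective function in Python (runnable; the example calls and the floating-point tolerance are illustrative additions, while the 1,000,000 penalty comes from the slide):

def f(q, d, n):
    # Penalize any combination whose value is not exactly $17.45;
    # otherwise the objective is simply the total number of coins.
    # A small tolerance guards against floating-point rounding.
    if abs(q * 0.25 + d * 0.10 + n * 0.05 - 17.45) > 1e-9:
        return 1_000_000
    return q + d + n

print(f(1, 1, 1))   # 1000000 -- only adds up to $0.40
print(f(69, 2, 0))  # 71 -- 69 quarters + 2 dimes = $17.45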

Page 8

Looking for Global Maximum (or Minimum)

[Figure: a one-dimensional state-space landscape. The x-axis is the state space and the y-axis is the objective function; labeled features are the current state, a local maximum, a "flat" local maximum, a shoulder, and the global maximum.]

Page 9

Hill-Climbing Search

• “Like climbing Everest in thick fog with amnesia”
• Only records the state and its evaluation instead of maintaining a search tree

function HILL-CLIMBING( problem ) returns a state that is a local maximum
  inputs: problem, a problem
  local variables: current, a node
                   neighbor, a node

  current ← MAKE-NODE( INITIAL-STATE[ problem ])
  loop do
    neighbor ← a highest-valued successor of current
    if VALUE[ neighbor ] ≤ VALUE[ current ] then return STATE[ current ]
    current ← neighbor
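
A minimal runnable transcription of this pseudocode in Python (a sketch: successors and value are assumed problem-specific callables, not part of the slides):

def hill_climbing(initial, successors, value):
    # Steepest-ascent hill climbing: move to the highest-valued
    # neighbor until no neighbor improves on the current state.
    current = initial
    while True:
        neighbors = successors(current)
        if not neighbors:
            return current
        best = max(neighbors, key=value)
        if value(best) <= value(current):
            return current  # local maximum reached
        current = best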

Page 10

Hill-Climbing Search (cont.-1)

• Variations
  – choose any successor with a higher value than current
  – choose any successor with value[next] ≥ value[current]

• Problems
  – Local Maxima: search halts prematurely
  – Plateaux: search conducts a random walk
  – Ridges: search oscillates with slow progress

• Solution: Random-Restart Hill-Climbing (see the sketch below)
  – start from randomly generated initial states
  – save the best result so far
  – finds the optimal solution eventually if enough iterations are allowed
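
Random-restart hill climbing is a thin wrapper around the hill_climbing sketch from the previous slide (random_state is an assumed generator of random initial configurations):

def random_restart_hill_climbing(random_state, successors, value, restarts=25):
    # Run hill climbing from several random initial states and
    # keep the best local maximum found across all runs.
    best = None
    for _ in range(restarts):
        result = hill_climbing(random_state(), successors, value)
        if best is None or value(result) > value(best):
            best = result
    return best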

Page 11

Hill-Climbing Search (cont.-2)

• A ridge creates a sequence of local maxima that are not directly connected to each other

• From each local maximum, all the available actions point downhill

Page 12

Hill-Climbing Search (cont.-3)

• 8-queens problem

– successor function = all states generated by moving a single queen to another square in the same column (8 × 7 = 56 successors)

– h = # of pairs of queens that are attacking each other, either directly or indirectly

– h = 17 for the state shown in the figure
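
As an illustration, h can be computed from a board encoded as a tuple rows, where rows[c] is the row of the queen in column c (the encoding and function name are assumptions for this sketch):

def attacking_pairs(rows):
    # h for n-queens: count queen pairs that attack each other along
    # a row or a diagonal (columns are distinct by construction).
    h = 0
    n = len(rows)
    for c1 in range(n):
        for c2 in range(c1 + 1, n):
            same_row = rows[c1] == rows[c2]
            same_diagonal = abs(rows[c1] - rows[c2]) == c2 - c1
            if same_row or same_diagonal:
                h += 1
    return h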

Page 13

Hill-Climbing Search (cont.-4)

• 8-queens problem

– a local minimum with h = 1
– every successor has a higher cost

[Figure: an 8×8 board (files a–h, ranks 1–8) showing an 8-queens state that is a local minimum with h = 1]

Page 14

The K-Means Algorithm

1. Choose a value for K, the total number of clusters.
2. Randomly choose K points as cluster centers.
3. Assign the remaining instances to their closest cluster center.
4. Calculate a new cluster center for each cluster.
5. Repeat steps 3 and 4 until the cluster centers do not change.
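
A minimal K-means sketch in Python following these five steps (runnable; the function and variable names are illustrative):

import math
import random

def k_means(points, k, max_iters=100):
    # Step 2: randomly choose k points as the initial cluster centers.
    centers = random.sample(points, k)
    for _ in range(max_iters):
        # Step 3: assign each instance to its closest cluster center.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: math.dist(p, centers[i]))
            clusters[nearest].append(p)
        # Step 4: recompute each center as the mean of its cluster.
        new_centers = []
        for i, cluster in enumerate(clusters):
            if cluster:
                new_centers.append(tuple(sum(xs) / len(xs) for xs in zip(*cluster)))
            else:
                new_centers.append(centers[i])  # keep an empty cluster's old center
        # Step 5: stop once the cluster centers no longer change.
        if new_centers == centers:
            break
        centers = new_centers
    return centers, clusters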

Page 15


Table 3.6 • K-Means Input Values

Instance    X      Y
1           1.0    1.5
2           1.0    4.5
3           2.0    1.5
4           2.0    3.5
5           3.0    2.5
6           5.0    6.0

Page 16

[Figure: scatter plot of the six input instances from Table 3.6; x-axis: x (0–6), y-axis: f(x) (0–7)]

Page 17


Table 3.7 • Several Applications of the K-Means Algorithm (K = 2)

Outcome    Cluster Centers    Cluster Points    Squared Error

1          (2.67, 4.67)       2, 4, 6           14.50
           (2.00, 1.83)       1, 3, 5

2          (1.5, 1.5)         1, 3              15.94
           (2.75, 4.125)      2, 4, 5, 6

3          (1.8, 2.7)         1, 2, 3, 4, 5     9.60
           (5, 6)             6
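
The squared-error column can be checked directly: summing each instance's squared distance to its assigned center reproduces the 9.60 of outcome 3 (a sketch using the Table 3.6 data):

import math

instances = [(1.0, 1.5), (1.0, 4.5), (2.0, 1.5), (2.0, 3.5), (3.0, 2.5), (5.0, 6.0)]

# Outcome 3: instances 1-5 belong to center (1.8, 2.7), instance 6 to (5, 6).
error = sum(math.dist(p, (1.8, 2.7)) ** 2 for p in instances[:5])
error += math.dist(instances[5], (5.0, 6.0)) ** 2
print(round(error, 2))  # 9.6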

Page 18

[Figure: scatter plot of the six instances after clustering; x-axis: x (0–6), y-axis: f(x) (0–7)]

Page 19

General Considerations

• Requires real-valued data.
• We must select the number of clusters present in the data.
• Works best when the clusters in the data are of approximately equal size.
• Attribute significance cannot be determined.
• Lacks explanation capabilities.

Page 20

Simulated Annealing

• idea: escape local maxima by allowing some bad moves but gradually decrease their frequency

function SIMULATED-ANNEALING( problem, schedule ) returns a solution state
  inputs: problem, a problem
          schedule, a mapping from time to “temperature”
  local variables: current, next, a node
                   T, a “temperature” controlling the probability of downward steps

  current ← MAKE-NODE( INITIAL-STATE[ problem ])
  for t ← 1 to ∞ do
    T ← schedule[ t ]
    if T = 0 then return current
    next ← a randomly selected successor of current
    ΔE ← VALUE[ next ] − VALUE[ current ]
    if ΔE > 0 then current ← next
    else current ← next only with probability e^(ΔE/T)
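
A minimal runnable transcription in Python (a sketch; successors, value, and schedule are assumed caller-supplied functions):

import math
import random

def simulated_annealing(initial, successors, value, schedule):
    # Always accept uphill moves; accept a downhill move with
    # probability e^(dE/T), which shrinks as T cools or dE worsens.
    current = initial
    t = 1
    while True:
        T = schedule(t)
        if T <= 0:
            return current
        nxt = random.choice(successors(current))
        dE = value(nxt) - value(current)
        if dE > 0 or random.random() < math.exp(dE / T):
            current = nxt
        t += 1

For example, a simple linear cooling schedule is schedule = lambda t: max(0.0, 1.0 - t / 10_000).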

Page 21

Simulated Annealing (cont.-1)

• A term borrowed from metalworking
• We want metal molecules to find a stable location relative to neighbors
• Heating causes metal molecules to move around and to take on undesirable locations
• During cooling, molecules reduce their movement and settle into a more stable position
• Annealing is the process of heating metal and letting it cool slowly to lock in the stable locations of the molecules

Page 22

Simulated Annealing (cont.-2)

• Select a random move at each iteration
• Move to the selected node if it is better than the current node
• The probability of moving to a worse node decreases exponentially with the badness of the move, i.e., e^(ΔE/T)

• The temperature T changes according to a schedule

Page 23

Property of Simulated Annealing

• One can prove: If T decreases slowly enough, then simulated annealing search will find a global optimum with probability approaching 1

• Widely used in VLSI layout, airline scheduling, etc.

Page 24

Local Beam Search

• Keep track of k states rather than just one
• Begin with k randomly generated states
• At each iteration, all the successors of all k states are generated
• If any one is a goal state, halt; else select the k best successors from the complete list and repeat
• Useful information is passed among the k parallel search threads
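
A sketch of this loop in Python (random_state, successors, value, and is_goal are assumed problem-specific callables):

def local_beam_search(random_state, successors, value, is_goal, k=10, max_iters=1000):
    # Begin with k randomly generated states.
    states = [random_state() for _ in range(k)]
    for _ in range(max_iters):
        # Generate all successors of all k states.
        pool = [s for state in states for s in successors(state)]
        if not pool:
            break
        # Halt if any successor is a goal state.
        for s in pool:
            if is_goal(s):
                return s
        # Otherwise keep the k best successors from the complete list.
        states = sorted(pool, key=value, reverse=True)[:k]
    return max(states, key=value)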

Page 25

Genetic Algorithm (GA)

• A successor state is generated by combining two parent states
• Start with k randomly generated states (population)
• A state is represented as a string over a finite alphabet (often a string of 0s and 1s)
• Evaluation function (fitness function): higher values for better states
• Produce the next generation of states by selection, crossover, and mutation

Page 26

Genetic Algorithm (cont.)

Fitness function: # of non-attacking pairs of queens (min = 0, max = 8 × 7/2 = 28)

Probability of being selected for reproduction is proportional to fitness: 24/(24+23+20+11) = 31%, 23/(24+23+20+11) = 29%, etc.

[Figure: one generation of the genetic algorithm on 8-queens (initial population, fitness function, selection, crossover, mutation); e.g., parents 32752411 and 24748552 cross over to produce 32748552]
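
A sketch of the full loop for 8-queens in Python, reusing the attacking_pairs helper from the hill-climbing slides (population size, mutation rate, and the +1 weight smoothing are illustrative choices):

import random

def fitness(rows):
    # Non-attacking pairs of queens: 0 (worst) up to 28 (a solution).
    return 28 - attacking_pairs(rows)

def genetic_algorithm(pop_size=20, generations=1000, mutation_rate=0.1):
    population = [tuple(random.randrange(8) for _ in range(8))
                  for _ in range(pop_size)]
    for _ in range(generations):
        if any(fitness(s) == 28 for s in population):
            break
        # Selection: fitness-proportionate, as in 24/(24+23+20+11) = 31%.
        weights = [fitness(s) + 1 for s in population]  # +1 avoids all-zero weights
        next_gen = []
        for _ in range(pop_size):
            mom, dad = random.choices(population, weights=weights, k=2)
            # Crossover: splice the two parents at a random cut point.
            cut = random.randrange(1, 8)
            child = list(mom[:cut] + dad[cut:])
            # Mutation: occasionally move one queen to a random row.
            if random.random() < mutation_rate:
                child[random.randrange(8)] = random.randrange(8)
            next_gen.append(tuple(child))
        population = next_gen
    return max(population, key=fitness)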