Population Based Optimization for Variable Operating Points
Alan L. Jennings & Raúl Ordóñez, ajennings1, [email protected]
Electrical and Computer Engineering University of Dayton
Monday, June 6th, 2011, CEC2011-#274
The Challenge
• As a desired parameter changes,
• smoothly change other parameters in real time,
• while maintaining local optimality.

The Solution
• Global search/optimization of the input space,
• to form inverse functions in the output space,
• using particles and clusters.
• Change y: one dimension
• Adjust x: many dimensions
• Smoothly and quickly, maintaining optimality
[Figures: surface plots of the output y and the cost J over (x1, x2), an output/cost overlay, and swarm snapshots from Step 1 (60 particles, 0 clusters) to the final result (11 clusters).]
Problem Statement
• Find continuous functions x* = h_i(y_d) such that
  • J = g(x*) is a local minimum
  • y_d = f(x*)
  over an interval of y_d.

Assumptions
• Compact set in x
• g and f are C1, deterministic and time-invariant
• A change in x is easy to implement
• Adequately scaled
• Regions larger than a point where ∇f = 0 can result in an open domain of h_i
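A minimal sketch of the problem being posed, using a hypothetical output f and cost g with a known closed-form answer (these functions are illustrative, not the paper's):

```python
import numpy as np

# Hypothetical example: output f(x) = x1 + x2, cost g(x) = x1^2 + x2^2.
# The optimal inverse h(y_d) = argmin g(x) subject to f(x) = y_d
# is x* = (y_d/2, y_d/2) by symmetry.

def f(x):
    return x[0] + x[1]

def g(x):
    return x[0] ** 2 + x[1] ** 2

def h(y_d):
    # Closed-form optimal inverse for this particular f and g
    return np.array([y_d / 2.0, y_d / 2.0])

y_d = 0.8
x_star = h(y_d)
# f(x_star) == y_d, and g(x_star) is minimal among all x with this output
```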
[Figures: 3-D surfaces of the output y and the cost J over (x1, x2); the final swarm with 11 clusters; the resulting optimal inverse functions h_i and their cost J over y_d.]
Example Problems
• Thermostat
• Combining generators
• Optimal control trajectory: linear SISO system
Input: set point. Output: temperature. Cost: energy.
Input: servo position nodes in time. Output: crawl distance, jump height, …. Cost: energy, max torque, profile height, ….

This method is different from
• a nominal operating point
• a Pareto-optimal front
Method Overview
• Neural networks: universal approximators; converge in the gradient; simple to get the gradient
• Swarm optimization: agents cover n-space; simple motions; allow for clusters
• Spline interpolation: uses the known optimal points of clusters
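The spline-interpolation step can be sketched with SciPy's monotone piecewise-cubic interpolant (PCHIP), which matches the later note that a piecewise cubic can keep the interpolant monotonic; the cluster points below are made up for illustration:

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

# Hypothetical cluster of settled optimal points: (y_d, x*) pairs.
y_pts = np.array([0.2, 0.4, 0.6, 0.8])
x_pts = np.array([[0.10, 0.10],
                  [0.20, 0.20],
                  [0.30, 0.30],
                  [0.40, 0.40]])

# One monotone piecewise-cubic interpolant per input dimension yields
# a continuous inverse function h_i(y_d).
splines = [PchipInterpolator(y_pts, x_pts[:, j]) for j in range(x_pts.shape[1])]

def h_i(y_d):
    return np.array([s(y_d) for s in splines])
```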
Surrogate Function Creation
1. Sample function
2. Train network
3. Validate network

Swarm Optimization
1. Initialize population
2. Move agents: lower g(x), keep f(x)
3. Check for removal/settling conditions
4. Form clusters

Execution
1. Select cluster h_i
2. Get y_d
3. Evaluate h_i
4. Move x to x*
Particle Motion
• Output gradient: move in its null space
• Cost gradient: move opposite it (within the null space)
• Saturation: all gradients saturate; if gradients are large, the step length is fixed; if a gradient is small, the step size diminishes
• Boundary constraint reduces the step length
• Minimum step required for settling
• Particles close to another are removed, quickly reducing the population size
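The bullets above amount to a projected, saturated gradient step per particle. A sketch under assumed gradients (the step size and box bounds are illustrative, not the paper's values):

```python
import numpy as np

def particle_step(x, grad_f, grad_g, step_max=0.05, eps=1e-12):
    """One particle move: descend the cost in the null space of the
    output gradient, with a saturated step and a box boundary."""
    nf = grad_f / (np.linalg.norm(grad_f) + eps)
    # Remove the component of the cost gradient along the output gradient,
    # so the move lowers g(x) while, to first order, keeping f(x).
    d = -(grad_g - nf * (nf @ grad_g))
    norm_d = np.linalg.norm(d)
    if norm_d > step_max:             # large gradient -> fixed step length
        d *= step_max / norm_d        # small gradient -> step diminishes
    return np.clip(x + d, -1.0, 1.0)  # boundary constraint

# Example with f = x1 + x2 (gradient [1, 1]) and an arbitrary cost gradient
x = np.array([0.5, 0.5])
x_new = particle_step(x, np.array([1.0, 1.0]), np.array([1.0, 0.2]))
# f(x_new) = f(x): the output is preserved while the cost changes
```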
[Figure: a particle's output and cost plotted step by step.]
Cluster Formation
• Form a cluster from a settled particle
• Ascend/descend the output: form a new point, then apply gradient descent
• End-cluster conditions: the particle doesn't settle; the output decreases/increases; the particle settles too far away or too close
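One way to read the cluster-growth loop as code; the output/cost functions and step parameters below are stand-ins, not the paper's:

```python
import numpy as np

def grow_cluster(x0, f, grad_f, grad_g, d_y=0.05, n_max=20, settle_tol=1e-4):
    """Grow a cluster from settled particle x0: nudge the output up by ~d_y,
    re-descend the cost in the output's null space, and stop when the point
    fails to settle or the output stops increasing."""
    cluster = [np.asarray(x0, dtype=float).copy()]
    x = cluster[0].copy()
    for _ in range(n_max):
        gf = grad_f(x)
        x = x + d_y * gf / (np.linalg.norm(gf) ** 2 + 1e-12)  # ascend output
        for _ in range(200):  # null-space cost descent until settled
            gf, gg = grad_f(x), grad_g(x)
            nf = gf / (np.linalg.norm(gf) + 1e-12)
            d = -0.1 * (gg - nf * (nf @ gg))
            if np.linalg.norm(d) < settle_tol:
                break
            x = x + d
        else:
            break             # particle didn't settle -> end cluster
        if f(x) <= f(cluster[-1]):
            break             # output did not increase -> end cluster
        cluster.append(x.copy())
    return cluster

# With f = x1 + x2 and g = x1^2 + x2^2, the cluster marches up the diagonal.
pts = grow_cluster([0.1, 0.1],
                   lambda x: x[0] + x[1],
                   lambda x: np.array([1.0, 1.0]),
                   lambda x: 2.0 * np.asarray(x))
```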
Linear Output, Quadratic Cost
[Figure: swarm over the (x1, x2) plane.]
Simple Example
• Combination of generators: output is total power out; cost is a quadratic function
• Expected result: each generator carries half the load
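The expected result can be checked numerically for a hypothetical pair of identical generators with quadratic fuel curves (functions and demand value invented for illustration):

```python
import numpy as np

def total_power(x):   # output: total power out
    return x[0] + x[1]

def fuel_cost(x):     # cost: identical quadratic fuel curves
    return 0.5 * x[0] ** 2 + 0.5 * x[1] ** 2

# For a fixed demand y_d, scan the split between the two generators.
y_d = 60.0
splits = np.linspace(0.0, y_d, 601)
costs = [fuel_cost(np.array([s, y_d - s])) for s in splits]
best = splits[int(np.argmin(costs))]
# The minimum-cost split puts half the load on each generator.
```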
[Figures: swarm at Step 1 (60 particles, 0 clusters) and at Step 49 (0 particles, 1 cluster); the resulting optimal inverse functions h_i and cost J over y_d.]
Complex Examples
• Combination of functions: multiple extrema, saddle points; 2-dim for verification
• Expected result: clusters between output extrema
[Figures: four output surfaces (Peaks, x1^2 (x2 - c1)^2, (x1 - c1) x2^2, and x1 sin(c1 x2)), each paired with a quadratic, linear/quadratic, or periodic cost; final clusters found (5, 11, 9, and 7); and the resulting optimal inverse functions h_i with cost J over y_d.]
Cluster Evaluation
• Verify output accuracy: plot actual vs. desired output for test points
• Verify optimality: generate neighbors, plot cost vs. output, subtract the expected cost
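The output-accuracy check reduces to pushing test outputs through a candidate inverse and measuring |f(h_i(y_d)) - y_d|; f and h below are simple stand-ins, not the paper's functions:

```python
import numpy as np

def f(x):
    return x[0] + x[1]

def h(y_d):                  # candidate inverse for this f
    return np.array([y_d / 2.0, y_d / 2.0])

# Sample test outputs across the cluster's interval and measure the error.
y_test = np.linspace(0.1, 0.9, 9)
errors = [abs(f(h(y)) - y) for y in y_test]
max_error = max(errors)      # should be ~0 for an accurate inverse
```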
[Figures: cluster test points marked on the final swarm (5 clusters); for each tested cluster, output vs. desired, output error (on the order of 10^-3), cost of outputs, and the cost difference J - J(Y).]
5-Dim, Ill-Scaled Example
• 5 different generators
• Order-of-magnitude differences in the gradients
• Used an exact NN to eliminate that source of error
• Resulted in a single cluster that balances the incremental cost of all generators
• 0.1% full-range accuracy
Failure Methods
• 'Kill distance' may end other bifurcation branches
• A cluster may end prematurely
• Global optimization parameters may be insufficient
• Corners in clusters can impair cubic interpolation; piecewise cubics can make the interpolant monotonic
• Difficult to verify in high dimensions, though testing a cluster is reasonably simple
[Figure: final clusters (11), annotated: possible bifurcations are direction dependent; corners in a cluster interfere with interpolation; the gradient going to zero ends a cluster.]
Questions or Comments?
Global search, using particles and clusters, to find optimal, continuous output-inverse functions.

Tested to work on many difficult combinations.

Future: apply to developmental / resolution-increasing control.