Optimization Lecture Notes
VR&D
DESIGN OPTIMIZATION CONSTRAINED MINIMIZATION III
Functions of N Variables
Ranjith Dissanayake
Structures Laboratory, Dept. of Civil Engineering
Faculty of Engineering, University of Peradeniya
CONSTRAINED MINIMIZATION

• Find the Set of Design Variables that will
  – Minimize F(X), the Objective Function
  – Subject to:
    Inequality Constraints:  g_j(X) ≤ 0,  j = 1, M
    Equality Constraints:    h_k(X) = 0,  k = 1, L
    Side Constraints:        X_i^L ≤ X_i ≤ X_i^U,  i = 1, N
Kuhn-Tucker Conditions

• X* is Feasible
• λ_j g_j(X*) = 0,  j = 1, M
• ∇F(X*) + Σ_{j=1}^{M} λ_j ∇g_j(X*) + Σ_{k=M+1}^{M+L} λ_k ∇h_k(X*) = 0
• λ_j ≥ 0,  j = 1, M
• λ_k Unrestricted in Sign,  k = M+1, M+L
Problem Statement

• Find B and H to Minimize V = BHL
• Subject to:
  σ = Mc/I ≤ 700
  δ = PL³/(3EI) ≤ 2.54
  H/B ≤ 12
  1.0 ≤ B
  20.0 ≤ H ≤ 50.0
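As a concrete check of the bookkeeping, the limits above can be evaluated in the normalized g(X) ≤ 0 form used throughout these notes. This is an illustrative sketch only: it assumes the classic tip-loaded cantilever relations σ = Mc/I = 6PL/(BH²) and δ = PL³/(3EI) with I = BH³/12, and the default values of P, L and E are placeholders, not data from this slide.

```python
def beam_constraints(B, H, P=50_000.0, L=500.0, E=2.0e7,
                     sigma_max=700.0, delta_max=2.54):
    """Normalized inequality constraints g(X) <= 0 for the beam.

    Assumes a tip-loaded cantilever: sigma = 6*P*L/(B*H**2),
    delta = P*L**3/(3*E*I), I = B*H**3/12.  P, L, E are illustrative.
    """
    I = B * H**3 / 12.0
    sigma = 6.0 * P * L / (B * H**2)     # max bending stress at the root
    delta = P * L**3 / (3.0 * E * I)     # tip deflection
    return [sigma / sigma_max - 1.0,     # stress limit
            delta / delta_max - 1.0,     # deflection limit
            H / (12.0 * B) - 1.0]        # geometric limit H/B <= 12
```

A negative g_j means the constraint is satisfied with margin; zero means it is exactly active.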
Design Space

[Figure: the (B, H) design space, B = 3 to 7 and H = 35 to 60, with volume contours V = 5,000; 10,000; 15,000; 20,000, constraint boundaries σ = 700, H/B = 12 and H = 50, and the OPTIMUM marked.]
Sequential Linear Programming (SLP)

• Algorithm
  – Linearize the Objective and Constraints
  – Solve the Linear Approximation Using the Simplex or Other Good Algorithm
  – Iterate Until Convergence to an Optimum
• Linearization
  F(X) ≈ F(X⁰) + ∇F(X⁰)ᵀ δX
  g_j(X) ≈ g_j(X⁰) + ∇g_j(X⁰)ᵀ δX,  j = 1, M
• Move Limits are Essential
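The linearization step can be sketched as a small helper that builds the first-order Taylor model from forward finite differences. The function name `linearize`, the step size `h` and the list-based vectors are illustrative choices, not part of SLP itself.

```python
def linearize(f, x0, h=1e-6):
    """Return the first-order Taylor model
    f_lin(X) = f(X0) + grad_f(X0)^T (X - X0),
    with the gradient estimated by forward finite differences."""
    f0 = f(x0)
    grad = []
    for i in range(len(x0)):
        xp = list(x0)
        xp[i] += h                       # perturb one variable at a time
        grad.append((f(xp) - f0) / h)

    def f_lin(x):
        return f0 + sum(g * (xi - x0i) for g, xi, x0i in zip(grad, x, x0))
    return f_lin
```

In SLP, each iteration would pass these linear models of the objective and constraints (plus move limits) to an LP solver.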
One Stage of the Process

[Figure: one SLP stage in the (B, H) design space, showing the starting point X⁰, the TRUE OPTIMUM, and the APPROXIMATE OPTIMUM of the linearized problem.]
Why Move Limits are Needed

[Figure: linearized F and linearized g about X⁰ in the (X₁, X₂) space, with contours F = constant and the boundary g = 0; without move limits the linearized problem's optimum lies far from the TRUE OPTIMUM, while move limits box the APPROXIMATE OPTIMUM near X⁰.]
Sequential Linear Programming

• Features
  – Easy to Program
  – Move Limits Must be Reduced as the Optimization Progresses to Ensure a Solution for Those Cases Where There are Fewer Active Constraints Than Design Variables at the Optimum (Under-Constrained)
  – SLP is Not Considered to be a Good Method by the Theoreticians
  – Experience has Shown that SLP is Powerful and Reliable if Coded with Care
  – Good Method for Parallel Processor Applications
    • Does Not Use a One-Dimensional Search
The Method of Feasible Directions

• Originally Developed by Zoutendijk in 1960
• Contained in the CONMIN and ADS Programs
• Has the Feature that it will Rapidly Find a Near-Optimum Design
• Used for Inequality-Constrained Problems Only
Optimization Process

1. Begin with an Initial Candidate Design, X⁰. Set the Iteration Counter, q = 0
2. Call the Analysis to Evaluate F(X) and g_j(X), j = 1, M
3. Set q = q + 1. Call the Sensitivity Analysis to Evaluate ∇F(X) and ∇g_j(X), j ∈ J, where J is the Set of Active and Violated Constraints
   • g_j(X) is Active if g_j(X) ≥ CT (Typically CT = -0.05)
   • g_j(X) is Violated if g_j(X) > CTMIN (Typically CTMIN = 0.001)
Optimization Process – Cont.

4. Calculate the Search Direction, S^q
5. Perform the One-Dimensional Search in Direction S^q
   • This will Require Several Analyses
6. Check for Convergence to the Optimum. If Satisfied, Terminate. Else go to Step 3
Optimization Process Flow

[Flowchart: INPUT X⁰; q = 0 → ANALYSIS: Calculate F(X) and g_j(X), j = 1, M → Check for Convergence to the Optimum → Satisfied? YES: EXIT; NO: q = q + 1 → SENSITIVITY ANALYSIS: Identify Active and Violated Constraints → Calculate Search Direction, S^q → Perform the One-Dimensional Search, X^q = X^{q-1} + α* S^q → back to ANALYSIS.]
Active Constraint Strategy

• Constraint g_j(X) is Considered Active if g_j(X) ≥ CT
  – Initially, CT = -0.05 to “Trap” Almost-Active Constraints
  • CT is Reduced During the Optimization Until CT = -CTMIN
• Constraint g_j(X) is Considered Violated if g_j(X) > CTMIN
  – Usually, CTMIN = 0.001
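The thresholds above amount to a simple classification of the constraint values at each iteration. In this sketch (the function name and the split into two index sets are illustrative choices), a constraint counts as active when it lies in the band between CT and CTMIN, and as violated beyond CTMIN.

```python
def classify(g, CT=-0.05, CTMIN=0.001):
    """Split constraint values g_j into active and violated index sets,
    using the typical thresholds CT = -0.05 and CTMIN = 0.001."""
    active   = [j for j, gj in enumerate(g) if CT <= gj <= CTMIN]
    violated = [j for j, gj in enumerate(g) if gj > CTMIN]
    return active, violated
```

For example, classify([-0.2, -0.01, 0.05]) marks the second constraint active and the third violated; the first is safely inactive.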
Active Constraint Strategy

[Figure: the (X₁, X₂) design space split into FEASIBLE and INFEASIBLE regions by g_j(X) = 0, with the bands g_j(X) = CT and g_j(X) = CTMIN bracketing the active region.]
Gradient (Sensitivity) Calculations

• By Forward Finite Differences (e_i is the i-th coordinate direction)
  ∇F(X) ≈ { (F(X + δX₁ e₁) − F(X)) / δX₁
            (F(X + δX₂ e₂) − F(X)) / δX₂
            …
            (F(X + δX_N e_N) − F(X)) / δX_N }
• Central-Difference Gradients are More Reliable, but Twice as Expensive to Calculate
• If Analytic Gradients are Available, They Should Always be Used
Calculating Search Direction, S^q

• If No Constraints are Active or Violated
  – If q = 1, Use the Steepest Descent Direction
    S^q = −∇F(X^{q-1})
  – If q > 1, Use the Fletcher-Reeves Conjugate Direction
    S^q = −∇F(X^{q-1}) + β S^{q-1}
  • where
    β = |∇F(X^{q-1})|² / |∇F(X^{q-2})|²
• Restart with Steepest Descent Every N Iterations or When Progress is Slow
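The unconstrained recursion above can be exercised on a quadratic F(x) = ½ xᵀAx − bᵀx, where the exact line-search step has the closed form α = −gᵀS / (SᵀAS). This function and its periodic restart rule are a sketch of the slide's recipe, not CONMIN or DOT code.

```python
def fletcher_reeves(A, b, x, iters):
    """Minimize F(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    with Fletcher-Reeves conjugate directions, exact line search, and a
    steepest-descent restart every n = len(x) iterations."""
    n = len(x)
    mat = lambda M, v: [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    g = [gi - bi for gi, bi in zip(mat(A, x), b)]    # gradient A x - b
    s = [-gi for gi in g]                            # steepest descent start
    for q in range(iters):
        As = mat(A, s)
        alpha = -dot(g, s) / dot(s, As)              # exact step on a quadratic
        x = [xi + alpha * si for xi, si in zip(x, s)]
        g_new = [gi - bi for gi, bi in zip(mat(A, x), b)]
        # beta = |grad new|^2 / |grad old|^2, zeroed at each restart
        beta = 0.0 if (q + 1) % n == 0 else dot(g_new, g_new) / dot(g, g)
        s = [-gi + beta * si for gi, si in zip(g_new, s)]
        g = g_new
    return x

x_opt = fletcher_reeves([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0], [0.0, 0.0], 2)
# converges to the solution of A x = b, i.e. (1/11, 7/11)
```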
Calculating Search Direction, S^q

• If There are Active Constraints
  – Solve a Sub-Problem to Find the Components of S^q and the Value of β that will
  – Maximize β
  – Subject to:
    ∇F(X^{q-1})ᵀ S^q + β ≤ 0                (S^q is Usable)
    ∇g_j(X^{q-1})ᵀ S^q + θ_j β ≤ 0,  j ∈ J   (S^q is Feasible)
    S^qᵀ S^q ≤ 1                             (S^q is Bounded)
  – Where J is the Set of Active Constraints and θ_j is Called the Push-Off Factor
The Push-Off Factor θ_j

• As a Constraint Just Becomes Active, Allow the Search to Follow the Constraint
• As the Constraint Becomes More Active or Violated, Push Harder
• For g_j(X) ≥ CT
  θ_j = [1 − g_j(X^{q-1}) / CT]²
• Thus, θ_j is a Quadratic Function of the Constraint Value
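In code the factor is one line. The leading scale theta_0 is an assumption (some implementations expose such a tuning parameter); everything else follows the slide's formula.

```python
def push_off(g_j, CT=-0.05, theta_0=1.0):
    """Quadratic push-off factor: zero when the constraint is just
    becoming active (g_j = CT), one at g_j = 0, larger when violated."""
    return theta_0 * (1.0 - g_j / CT) ** 2
```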
Calculating Search Direction, S^q

• Note that  ∇Fᵀ S = |∇F| |S| cos θ
• And        ∇g_jᵀ S = |∇g_j| |S| cos θ
• Where θ is the Angle Between the Two Vectors
  – Therefore, for S to be Both Usable and Feasible, θ Must be Between 90° and 270°
• Solving for S^q is a Sub-Optimization Task
  – Details are Beyond the Scope of This Discussion
The Effect of θ_j on S^q

[Figure: at a design point on g₁ = 0, contours F = constant, the gradients ∇F and ∇g₁, and the search directions S obtained with θ₁ = 0 (tangent to the constraint) and θ₁ = 1 (pushed away from it).]
Calculating Search Direction, S^q

• If There are Violated Constraints
  – Solve a Sub-Problem to Find the Components of S^q and the Value of β that will
  – Minimize  ∇F(X)ᵀ S^q − Φ β              (S^q is Usable)
  – Subject to:
    ∇g_j(X^{q-1})ᵀ S^q + θ_j β ≤ 0,  j ∈ J   (S^q is Feasible)
    S^qᵀ S^q + β² ≤ 1                        (S^q is Bounded)
  – Where J is the Set of Active Constraints, θ_j is the Push-Off Factor and Φ is a Large Positive Number
Search Direction at Different Points in the Design Space

[Figure: FEASIBLE and INFEASIBLE regions in (X₁, X₂) with contours F(X) = constant and the boundary g_j = 0; the gradients ∇F and ∇g_j and the resulting direction S^q are shown at a feasible point, at a point on g_j = 0, and at an infeasible point, with the OPTIMUM marked.]
The Search Process

[Figure: the (B, H) design space of the beam example (volume contours V = 5,000 to 20,000; constraint boundaries σ = 700, H/B = 12, H = 50), with the search path X⁰ → S¹ → S² toward the optimum.]
Modified Method of Feasible Directions

• Very Similar to the Method of Feasible Directions
• Also Very Similar to the Generalized Reduced Gradient Method
  – Does not Push Away From Active Constraints
  • Follows Curved Constraints
• This Method is Used by the DOT Optimizer from VR&D
Calculating Search Direction, S^q

• If There are Active Constraints
  – Solve a Sub-Problem to Find the Components of S^q that will
  – Minimize  ∇F(X^{q-1})ᵀ S^q              (S^q is Usable)
  – Subject to:
    ∇g_j(X^{q-1})ᵀ S^q ≤ 0,  j ∈ J          (S^q is Feasible)
    S^qᵀ S^q ≤ 1                            (S^q is Bounded)
  – Where J is the Set of Active Constraints
Calculating Search Direction, S^q

• If There are Violated Constraints
  – Solve a Sub-Problem to Find the Components of S^q and the Value of β that will
  – Minimize  ∇F(X)ᵀ S^q − Φ β              (S^q is Usable)
  – Subject to:
    ∇g_j(X^{q-1})ᵀ S^q + θ_j β ≤ 0,  j ∈ J   (S^q is Feasible)
    S^qᵀ S^q + β² ≤ 1                        (S^q is Bounded)
  – Where J is the Set of Active Constraints, θ_j is the Push-Off Factor and Φ is a Large Positive Number
Search Direction at Different Points in the Design Space

[Figure: FEASIBLE and INFEASIBLE regions in (X₁, X₂) with contours F(X) = constant and the boundary g_j = 0; the directions S^q produced by the modified method are shown at a feasible point, at a point on g_j = 0, and at an infeasible point, with the OPTIMUM marked.]
The One-Dimensional Search

• Following Curved (Nonlinear) Constraints

[Figure: a move X² = X¹ + α* S¹ along the search direction leaves the curved constraint boundary g = 0; ∇g at X² points back toward the boundary.]
The One-Dimensional Search

• Following Curved (Nonlinear) Constraints
• Move Parallel to the Constraint Gradient Back to the Constraint Boundary
  – Minimize  δXᵀ δX
  – Subject to:  g_j(X) + ∇g_j(X)ᵀ δX = 0
  – This is a Simple Sub-Problem
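Solving that little sub-problem in closed form gives the minimum-length correction δX = −g_j(X) ∇g_j / (∇g_jᵀ∇g_j), which can be iterated Newton-Raphson style. The helper below is an illustrative sketch for a single constraint.

```python
def project_to_boundary(g, grad_g, x, tol=1e-10, max_iter=20):
    """Repeatedly step parallel to grad g back to the boundary g(X) = 0.

    Each step dX = -g(X) * grad_g(X) / (grad_g^T grad_g) is the
    minimum-length move that zeros the linearized constraint."""
    for _ in range(max_iter):
        gx = g(x)
        if abs(gx) < tol:
            break
        dg = grad_g(x)
        denom = sum(d * d for d in dg)
        x = [xi - gx * di / denom for xi, di in zip(x, dg)]
    return x
```

On g(X) = X₁² + X₂² − 1, starting from (2, 0), the iteration converges to the boundary point (1, 0).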
The Search Process

[Figure: the (B, H) design space of the beam example (volume contours V = 5,000 to 20,000; constraint boundaries σ = 700, H/B = 12, H = 50), with the search path X⁰ → S¹ → S² → S³ toward the optimum.]
Modified Method of Feasible Directions

• Features
  – Rapidly Finds a Near-Optimum Design
  – Deals With Equality Constraints by Using Two Equal and Opposite Inequality Constraints
  – Usually Obtains More Precise Constraint Satisfaction than the Method of Feasible Directions
    • Due to the Continual Newton-Raphson Iterations Back to the Constraint Boundaries
  – Widely Used in the DOT Optimizer
Sequential Quadratic Programming (SQP)

• Basic Concept
  – Create a Quadratic Approximation to the Lagrangian
  – Create Linear Approximations to the Constraints
  – Solve the Quadratic Problem for the Search Direction, S
  – Perform the One-Dimensional Search with Penalty Functions to Avoid Constraint Violations
The Search Direction, S^q

• Minimize
  Q(S) = F(X) + ∇F(X)ᵀ S + (1/2) Sᵀ B S
• Subject to:
  ∇g_j(X)ᵀ S + δ_j g_j(X) ≤ 0,  j = 1, M
  ∇h_k(X)ᵀ S + δ h_k(X) = 0,  k = 1, L
• Where, Typically, δ = 0.9 if the Constraint is Violated and δ = 1.0 Otherwise; δ is Used to Overcome Constraint Violations
The One-Dimensional Search

• Minimize the Exterior Penalty Function
  P(X + αS) = F(X + αS) + Σ_{j=1}^{M} λ_j Max[0, g_j(X + αS)] + R Σ_{k=1}^{L} h_k(X + αS)²
• Where λ_j are the Lagrange Multipliers from the Quadratic Sub-Problem and R is a Large Positive Constant
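Evaluating the penalty at a trial point is straightforward. Note the reconstruction of the equality term as R·h² is hedged (the slide's garbled formula permits other readings), and all names below are illustrative.

```python
def penalty(F, g_list, h_list, lam, R, x):
    """Exterior penalty for the SQP line search: the objective plus
    multiplier-weighted inequality violations and R-scaled squared
    equality residuals (hedged reconstruction of the slide's formula)."""
    P = F(x)
    P += sum(lj * max(0.0, gj(x)) for lj, gj in zip(lam, g_list))
    P += R * sum(hk(x) ** 2 for hk in h_list)
    return P
```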
The Hessian Matrix, B

• Initially Set B to the Identity Matrix, I
• Update B Using the BFGS Algorithm
  B_new = B − (B p pᵀ B) / (pᵀ B p) + (η ηᵀ) / (pᵀ η)
• Where
  p = X^q − X^{q-1}
  η = θ y + (1 − θ) B p
  y = ∇_X P(X^q, λ) − ∇_X P(X^{q-1}, λ)
  θ = 1.0  if pᵀ y ≥ 0.2 pᵀ B p
  θ = 0.8 pᵀ B p / (pᵀ B p − pᵀ y)  otherwise
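The damped update can be written directly from the formulas above. This sketch works on plain nested lists and is illustrative, not the DOT implementation.

```python
def bfgs_update(B, p, y):
    """Powell-damped BFGS update of the Hessian approximation B.

    When the curvature condition p^T y >= 0.2 p^T B p fails, y is blended
    with B p (via theta) so the updated B stays positive definite."""
    n = len(p)
    Bp = [sum(B[i][j] * p[j] for j in range(n)) for i in range(n)]
    pBp = sum(pi * bi for pi, bi in zip(p, Bp))
    py = sum(pi * yi for pi, yi in zip(p, y))
    theta = 1.0 if py >= 0.2 * pBp else 0.8 * pBp / (pBp - py)
    eta = [theta * yi + (1.0 - theta) * bi for yi, bi in zip(y, Bp)]
    peta = sum(pi * ei for pi, ei in zip(p, eta))
    return [[B[i][j] - Bp[i] * Bp[j] / pBp + eta[i] * eta[j] / peta
             for j in range(n)] for i in range(n)]
```

With theta = 1 (undamped case) the update satisfies the secant condition B_new p = y.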
The Algorithm

1. Initialize B = I
2. Calculate Gradients of the Objective and all Constraints
3. Solve the Quadratic Programming Sub-Problem
4. Calculate the Lagrange Multipliers
5. Search Using the Exterior Penalty Function
6. Update the B Matrix
7. Check for Convergence. If Satisfied, Exit. Else Repeat from Step 2
The Search Process

[Figure: the (B, H) design space of the beam example (volume contours V = 5,000 to 20,000; constraint boundaries σ = 700, H/B = 12, H = 50), with the SQP search path X⁰ → S¹ → S² toward the optimum.]
Sequential Quadratic Programming

• Features
  – Strong Theoretical Basis in the Kuhn-Tucker Conditions
  – Considered Best by the Theoreticians
  – May Cut Off the Feasible Region
• Modifications Required
  – Several Modifications Have Been Made to Improve Reliability for Engineering Problems
Testing For Convergence

• Termination Criteria
  – Maximum Number of Iterations, ITMAX
    • Any Iterative Process Must Have this Test
  – Satisfaction of the Kuhn-Tucker Conditions
    • No Usable-Feasible Search Direction can be Found
  – Diminishing Returns
    • Absolute Change in the Objective for ITRM Iterations
      |F(X^q) − F(X^{q-1})| ≤ DABOBJ
    • Relative Change in the Objective for ITRM Iterations
      |F(X^q) − F(X^{q-1})| / |F(X^{q-1})| ≤ DELOBJ
  – No Feasible Solution can be Found
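The diminishing-returns tests can be expressed as a check over the last ITRM objective changes. The function and its default tolerances are illustrative (DABOBJ and DELOBJ are the slide's parameter names).

```python
def diminishing_returns(history, ITRM=2, DABOBJ=1e-4, DELOBJ=1e-3):
    """True when, for each of the last ITRM iterations, the change in the
    objective met the absolute (DABOBJ) or relative (DELOBJ) tolerance."""
    if len(history) < ITRM + 1:
        return False                       # not enough iterations yet
    checks = []
    for q in range(-ITRM, 0):
        dF = abs(history[q] - history[q - 1])
        checks.append(dF <= DABOBJ or dF <= DELOBJ * abs(history[q - 1]))
    return all(checks)
```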
Example

• Ten-Variable Tapered Beam (Five Segments)
  – P = 50,000 N
  – L = 500 cm
  – E = 200 GPa
  – Rectangular B × H Cross-Section
  – σ ≤ 14,000 N/cm²
  – δ ≤ 2.54 cm
Optimization Results

• Method
  1. Augmented Lagrange Multiplier Method (ALM)
  2. Sequential Linear Programming (SLP)
  3. Method of Feasible Directions (MFD)
  4. Modified Method of Feasible Directions (MMFD)
  5. Sequential Quadratic Programming (SQP)
Optimization Results

  Method     Optimum   Iterations   Function Evaluations
  1 (ALM)    65,678    8            533
  2 (SLP)    65,398    10           110
  3 (MFD)    65,411    11           140
  4 (MMFD)   65,399    11           170
  5 (SQP)    65,410    8            106
Black Box Optimization

• Useful for > 90% of Everyday Design Tasks
• Approach Used by VisualDOC
  – Read Input and Write Output From/To ASCII Files
  – Use VisualScript to Couple Your Code with VisualDOC
  – Identify the Design Variables, Objective and Constraints
  – Perform Design Study
    • Gradient or Non-Gradient Based Optimization
    • Response Surface Optimization
    • Design of Experiments
  – Post-Process to Study Design Changes
Summary of General Optimization

• “Black Box” Optimization is Useful for Many Everyday Design Tasks
• No Special Knowledge is Needed to Use Modern Optimization Tools
  – Some Theoretical Understanding Helps to Make Effective Use of Optimization
• The Optimum Found is Only as Reliable as the Design Criteria and Analysis

Numerical Optimization is the Most Powerful Design Assistance Tool Available Today