Linear Programming Chapter 6.14-7.3 Björn Morén

Page 1: Linear Programming - Chapter 6.14-7

Linear Programming, Chapter 6.14-7.3

Björn Morén

Page 2: Linear Programming - Chapter 6.14-7

1 Simplex Method with Upper Bounds

   Optimality Conditions

   Simplex Method

2 Interior Point Methods

Page 4: Linear Programming - Chapter 6.14-7

Problem Formulation

1. Treat the upper bounds as explicit constraints

2. Handle the upper bounds implicitly within the simplex method
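
In standard notation (the symbols below are the usual textbook ones, not spelled out on the slide), the two options amount to:

    \min_x \; c^T x \quad \text{s.t.} \quad Ax = b,\; Ix \le u,\; x \ge l
    \qquad \text{versus} \qquad
    \min_x \; c^T x \quad \text{s.t.} \quad Ax = b,\; l \le x \le u

The second form keeps the basis at size m, whereas adding the upper bounds as explicit rows enlarges the constraint matrix and the basis.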

Page 5: Linear Programming - Chapter 6.14-7

Basic Solution

Partition the variables according to whether they are basic, at their lower bound, or at their upper bound: (xB, xL, xU).
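
A minimal sketch of how the corresponding basic solution is recovered from such a partition, assuming standard-form data (the names A, b, l, u and the index lists are placeholders, not from the slides):

    import numpy as np

    def basic_solution(A, b, l, u, B_idx, L_idx, U_idx):
        """Recover the full solution for a partition into basic (B_idx),
        at-lower-bound (L_idx) and at-upper-bound (U_idx) variables:
        nonbasic variables sit at their bounds, and the basic variables
        solve  B x_B = b - A_L l_L - A_U u_U."""
        x = np.zeros(A.shape[1])
        x[L_idx] = l[L_idx]                           # nonbasic at lower bound
        x[U_idx] = u[U_idx]                           # nonbasic at upper bound
        rhs = b - A[:, L_idx] @ x[L_idx] - A[:, U_idx] @ x[U_idx]
        x[B_idx] = np.linalg.solve(A[:, B_idx], rhs)  # basic variables
        return x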

Page 6: Linear Programming - Chapter 6.14-7

Dual Problem
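
The slide's own formulas are not captured in the text; one standard dual of the bounded-variable primal min cᵀx s.t. Ax = b, l ≤ x ≤ u is:

    \max_{\pi, v, w} \; b^T \pi + l^T v - u^T w
    \quad \text{s.t.} \quad A^T \pi + v - w = c, \quad v \ge 0,\; w \ge 0,\; \pi \ \text{free}

The multipliers v and w belong to the lower- and upper-bound constraints, which is where the sign conditions on the reduced costs on the following slides come from.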

Page 7: Linear Programming - Chapter 6.14-7

Optimality Conditions and Dual Feasibility

Reduced costs are defined as c̄_j = c_j − πᵀA_j, where π are the simplex multipliers (πᵀ = c_Bᵀ B⁻¹) and A_j is column j of A.
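
A small sketch of this computation (the names are placeholders, not from the slides):

    import numpy as np

    def reduced_costs(A, c, B_idx):
        """Simplex multipliers and reduced costs for the basis B_idx:
        pi solves B^T pi = c_B, then cbar_j = c_j - pi^T A_j."""
        pi = np.linalg.solve(A[:, B_idx].T, c[B_idx])  # simplex multipliers
        cbar = c - A.T @ pi                            # zero on the basic columns
        return pi, cbar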

Page 9: Linear Programming - Chapter 6.14-7

Updated Simplex Method

1. Check optimality conditions

2. Select entering variable

3. Minimum ratio test (and select exiting variable)

4. Do pivot step
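
A compact sketch of one such iteration for the bounded-variable case (minimization, standard-form data; the names and the pivoting rule are illustrative assumptions, not taken from the slides):

    import numpy as np

    def bounded_simplex_iteration(A, b, c, l, u, basis, at_upper, tol=1e-9):
        """One iteration of the simplex method with upper bounds (minimization).
        basis: list of m basic indices; at_upper: set of nonbasic indices sitting
        at their upper bound (all other nonbasic indices sit at their lower bound).
        Returns (status, basis, at_upper), status in {'optimal', 'unbounded', 'continue'}."""
        m, n = A.shape
        nonbasic = [j for j in range(n) if j not in basis]

        # Current basic solution: nonbasic at a bound, basic from B x_B = b - N x_N.
        x = np.array([u[j] if j in at_upper else l[j] for j in range(n)], dtype=float)
        B = A[:, basis]
        x[basis] = np.linalg.solve(B, b - A[:, nonbasic] @ x[nonbasic])

        # 1. Optimality check via reduced costs.
        pi = np.linalg.solve(B.T, c[basis])
        cbar = c - A.T @ pi
        viol = [j for j in nonbasic
                if (j not in at_upper and cbar[j] < -tol)   # at lower bound, wants to increase
                or (j in at_upper and cbar[j] > tol)]       # at upper bound, wants to decrease
        if not viol:
            return 'optimal', basis, at_upper

        # 2. Entering variable: most violating reduced cost (Dantzig-style rule).
        s = max(viol, key=lambda j: abs(cbar[j]))
        d = np.linalg.solve(B, A[:, s])                 # updated column of the entering variable
        sign = -1.0 if s in at_upper else 1.0           # x_s moves up from l_s or down from u_s

        # 3. Minimum ratio test: largest step theta keeping every variable within its bounds.
        theta, leave = u[s] - l[s], None                # x_s may simply flip to its other bound
        for i, k in enumerate(basis):
            delta = sign * d[i]                         # x_B[i] changes by -delta per unit step
            if delta > tol:
                ratio = (x[k] - l[k]) / delta           # basic variable drops to its lower bound
            elif delta < -tol:
                ratio = (u[k] - x[k]) / (-delta)        # basic variable rises to its upper bound
            else:
                continue
            if ratio < theta:
                theta, leave = ratio, i
        if not np.isfinite(theta):
            return 'unbounded', basis, at_upper

        # 4. Pivot: either a bound flip of x_s, or x_s replaces the blocking basic variable.
        if leave is None:
            at_upper = at_upper ^ {s}                   # x_s jumps to its opposite bound
        else:
            r = basis[leave]
            if sign * d[leave] < 0:                     # leaving variable exits at its upper bound
                at_upper.add(r)
            basis = basis[:leave] + [s] + basis[leave + 1:]
            at_upper.discard(s)
        return 'continue', basis, at_upper

Repeating this until 'optimal' is returned (with an anti-cycling rule in practice) gives the full method; the following slides spell out each step.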

Page 11: Linear Programming - Chapter 6.14-7

Optimality Conditions

The solution is optimal if

1. for every x_j at its lower bound (x_j = l_j), the reduced cost satisfies c̄_j ≥ 0

2. for every x_j at its upper bound (x_j = u_j), the reduced cost satisfies c̄_j ≤ 0

Page 12: Linear Programming - Chapter 6.14-7

Minimum Ratio

If the entering variable x_s is at its lower bound, it is increased by θ ≥ 0:

1. x_s = l_s + θ

2. x_B = g − θ Ā_s, where g is the current value of x_B and Ā_s = B⁻¹A_s

Page 13: Linear Programming - Chapter 6.14-7

Minimum Ratio

If the entering variable x_s is at its upper bound, it is decreased by θ ≥ 0:

1. x_s = u_s − θ

2. x_B = g + θ Ā_s
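
The step length θ itself is the largest value that keeps every variable within its bounds; with ā_is denoting the entries of Ā_s, a standard statement for the lower-bound case is (notation assumed, not copied from the slides):

    \theta = \min\left\{ \min_{\bar{a}_{is} > 0} \frac{g_i - l_{B_i}}{\bar{a}_{is}},\;\;
                         \min_{\bar{a}_{is} < 0} \frac{u_{B_i} - g_i}{-\bar{a}_{is}},\;\;
                         u_s - l_s \right\}

The upper-bound case is symmetric (the roles of positive and negative ā_is swap), and the third term corresponds to the entering variable simply flipping to its other bound without a basis change.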

Page 14: Linear Programming - Chapter 6.14-7

Phase I

To find a feasible solution, a standard Phase I method can be used.
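
One standard Phase I problem (a textbook sketch, not copied from the slides): introduce artificial variables a and minimize their sum,

    \min_{x,a} \; \mathbf{1}^T a \quad \text{s.t.} \quad Ax + a = b, \quad l \le x \le u, \quad a \ge 0

(with artificial columns negated on rows where the starting point would force a < 0). The original problem is feasible exactly when the optimal value is 0, and the optimal x is then a feasible starting point for Phase II.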

Page 15: Linear Programming - Chapter 6.14-7

1 Simplex Method with Upper Bounds

   Optimality Conditions

   Simplex Method

2 Interior Point Methods

Page 16: Linear Programming - Chapter 6.14-7

History

Dikin (1967) was first.

Khachiyan (1979): the ellipsoid algorithm, the first algorithm shown to solve linear programs in polynomial time.

Work by Karmarkar (1984) made interior point methods popular. His algorithm also solves linear programs in polynomial time and is much more efficient in practice than the ellipsoid algorithm.

Page 17: Linear Programming - Chapter 6.14-7

Interior Point

x is an interior point if every constraint is satisfied with strict inequality.

Page 19: Linear Programming - Chapter 6.14-7

Relative Interior Point

x is a relative interior point if every inequality constraint is satisfied strictly, while the equality constraints hold with equality.
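
The slide's constraint system is not captured in the text; as a standard illustration, for the standard-form feasible region {x : Ax = b, x ≥ 0} (when it contains a strictly positive point) the relative interior is {x : Ax = b, x > 0}, which is exactly where interior point methods keep their iterates.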

Page 21: Linear Programming - Chapter 6.14-7

Phase I - examples

Find a solution from the following Phase I problem
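
The slides' own example is not captured in the text; as an illustration with made-up data, a small Phase I problem can be set up and solved with scipy.optimize.linprog:

    import numpy as np
    from scipy.optimize import linprog

    # Hypothetical original constraints: find x with
    #   x1 + x2 = 3,  x1 - x2 = 1,  0 <= x1 <= 2,  0 <= x2 <= 2.
    A = np.array([[1.0, 1.0], [1.0, -1.0]])
    b = np.array([3.0, 1.0])

    # Phase I: add one artificial variable per row and minimize their sum.
    m, n = A.shape
    A_eq = np.hstack([A, np.eye(m)])                 # [A | I] for x and artificials a
    c = np.concatenate([np.zeros(n), np.ones(m)])    # objective: sum of artificials
    bounds = [(0, 2)] * n + [(0, None)] * m          # 0 <= x <= 2, a >= 0

    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=bounds, method="highs")
    print(res.fun)      # 0.0 -> the original problem is feasible
    print(res.x[:n])    # a feasible x (here x = [2, 1])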

Page 27: Linear Programming - Chapter 6.14-7

Rounding Procedure

Theorem 1 (7.1). If x is a feasible solution whose objective value is sufficiently close to the optimal objective value, then an optimal basic feasible solution can be found from x using the purification routine.

This is what makes interior point methods usable when a basic solution is required: the interior iterate, which is never basic, is rounded to an optimal BFS at the end.

Page 28: Linear Programming - Chapter 6.14-7

General Algorithm

1. Find search direction

1. Solving an approximation problem, e.g. the affine scaling method (sketched below) or Karmarkar's projective scaling method

2. Solving the optimality conditions, a system of nonlinear equations, e.g. primal-dual path-following IPMs

2. Determine step length
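
A minimal sketch of the first option, the primal affine scaling direction for min cᵀx s.t. Ax = b, x ≥ 0 (the names and the step-size factor gamma are illustrative assumptions, not from the slides):

    import numpy as np

    def affine_scaling_step(A, c, x, gamma=0.9):
        """One primal affine scaling step from a strictly feasible point x > 0.
        Direction: d = -X^2 (c - A^T pi) with X = diag(x) and pi solving
        (A X^2 A^T) pi = A X^2 c; then step a fraction gamma of the way to the boundary."""
        X2 = np.diag(x ** 2)
        pi = np.linalg.solve(A @ X2 @ A.T, A @ X2 @ c)   # dual estimate
        d = -X2 @ (c - A.T @ pi)                         # steepest descent in the scaled space
        if np.all(d >= 0):
            return x + d    # d == 0: x already optimal; d > 0 somewhere: LP unbounded below
        alpha = gamma * np.min(-x[d < 0] / d[d < 0])     # stay strictly inside x > 0
        return x + alpha * d

Repeating the step until the duality gap or the step size is small gives the basic affine scaling method; path-following methods instead take Newton steps on the perturbed optimality conditions.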

Page 30: Linear Programming - Chapter 6.14-7

Observations

• The number of iterations grows very slowly with problem size (almost constant)

• The time per iteration increases with problem size

• Extra steps are required to obtain a BFS (the purification routine)

• Useful for large-scale problems

• The simplex method is a good starting point for understanding

Page 31: Linear Programming - Chapter 6.14-7

Questions?

www.liu.se