Lecture 1: Introduction to Convex Optimization
Gan Zheng
University of Luxembourg
SnT Course: Convex Optimization with Applications
Fall 2012
Outline
• Motivation
• Examples
• Course schedule
• Course goals
• Further reading
Acknowledgement: Thanks to Prof. Stephen Boyd (Stanford) and Prof. Wing-Kin Ma (CUHK) for the course materials.
Motivation
• Optimization: finding the maxima or the minima of functions, subject to possible constraints.
• Interdisciplinary applications
– Electrical circuits
– Wireless networks
– Economics
– Transportation
• Examples:
– Device sizing in electronic design, which is the task of choosing the width and length of each device in an electronic circuit.
– Portfolio optimization, to seek the best way to invest some capital in a set of n assets.
– Travelling salesman problem: given a list of cities, find the shortest possible route that visits each city once and returns to the origin city.
Mathematical optimization
(mathematical) optimization problem
minimize f0(x)
subject to fi(x) ≤ bi, i = 1, . . . , m
• x = (x1, . . . , xn): optimization variables
• f0 : Rn → R: objective function
• fi : Rn → R, i = 1, . . . , m: constraint functions
optimal solution x⋆ has smallest value of f0 among all vectors that satisfy the constraints
Solving optimization problems
general optimization problem
• very difficult to solve
• methods involve some compromise, e.g., very long computation time, or not always finding the solution
exceptions: certain problem classes can be solved efficiently and reliably
• least-squares problems
• linear programming problems
• convex optimization problems
Least-squares
minimize ‖Ax − b‖₂²
solving least-squares problems
• analytical solution: x⋆ = (ATA)−1AT b
• reliable and efficient algorithms and software
• computation time proportional to n²k (A ∈ R^{k×n}); less if structured
• a mature technology
using least-squares
• least-squares problems are easy to recognize
• a few standard techniques increase flexibility (e.g., including weights, adding regularization terms)
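The analytical solution above can be sketched with numpy. This is a minimal illustration on random data; in practice a dedicated solver such as `np.linalg.lstsq` is preferred over forming the normal equations, which squares the condition number of A:

```python
import numpy as np

# Least-squares: minimize ||Ax - b||_2^2 for a tall matrix A (k >= n).
rng = np.random.default_rng(0)
k, n = 50, 5
A = rng.standard_normal((k, n))
b = rng.standard_normal(k)

# Analytical solution via the normal equations: x* = (A^T A)^{-1} A^T b.
x_normal = np.linalg.solve(A.T @ A, A.T @ b)

# Numerically preferred route: a dedicated least-squares solver.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

print(np.allclose(x_normal, x_lstsq))  # True: both recover the same minimizer
```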
Linear programming
minimize cTx
subject to aiTx ≤ bi, i = 1, . . . , m
solving linear programs
• no analytical formula for solution
• reliable and efficient algorithms and software
• computation time proportional to n²m if m ≥ n; less with structure
• a mature technology
using linear programming
• not as easy to recognize as least-squares problems
• a few standard tricks used to convert problems into linear programs (e.g., problems involving ℓ1- or ℓ∞-norms, piecewise-linear functions)
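One such trick, reducing an ℓ∞-norm problem to an LP, can be sketched as follows (synthetic data; `scipy.optimize.linprog` assumed available). Minimizing ‖Ax − b‖∞ is rewritten as: minimize t subject to −t ≤ aiTx − bi ≤ t for all i.

```python
import numpy as np
from scipy.optimize import linprog

# minimize ||Ax - b||_inf as an LP over the stacked variable z = [x, t].
rng = np.random.default_rng(1)
m, n = 30, 4
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

c = np.zeros(n + 1)
c[-1] = 1.0  # objective: minimize t
# a_i^T x - t <= b_i   and   -a_i^T x - t <= -b_i
A_ub = np.block([[A, -np.ones((m, 1))], [-A, -np.ones((m, 1))]])
b_ub = np.concatenate([b, -b])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * (n + 1))
x, t = res.x[:n], res.x[-1]
print(t, np.max(np.abs(A @ x - b)))
```

At the optimum, t equals the achieved ℓ∞ residual, which the final print confirms.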
Convex optimization problem
minimize f0(x)
subject to fi(x) ≤ bi, i = 1, . . . , m
• objective and constraint functions are convex:
fi(αx + βy) ≤ αfi(x) + βfi(y)
if α + β = 1, α ≥ 0, β ≥ 0
• includes least-squares problems and linear programs as special cases
solving convex optimization problems
• no analytical solution
• reliable and efficient algorithms
• computation time (roughly) proportional to max{n³, n²m, F}, where F is the cost of evaluating the fi's and their first and second derivatives
• almost a technology
using convex optimization
• often difficult to recognize
• many tricks for transforming problems into convex form
• surprisingly many problems can be solved via convex optimization
Two examples [1]
max Σ_{i=1}^n xi    (1)
s.t. xi² − xi = 0, i = 1, · · · , n;  xi xj = 0, ∀(i, j) ∈ Γ.
Complexity: no. of variables n = 256; brute force must enumerate 2^n = 2^256 ≈ 10^77 points, effectively ∞.
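Each xi is binary (xi² − xi = 0 forces xi ∈ {0, 1}), so the count quoted above can be checked in exact integer arithmetic:

```python
# x_i^2 - x_i = 0 forces x_i into {0, 1}, so exhaustive search over
# n = 256 binary variables would visit 2^256 candidate points.
n = 256
count = 2 ** n
print(len(str(count)))  # 78 decimal digits, i.e. on the order of 10^77
```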
maxx x0    (2)
s.t. λmin ⎡ x1               x1^l ⎤
          ⎢       ⋱           ⋮  ⎥
          ⎢            xm    xm^l ⎥
          ⎣ x1^l  · · ·  xm^l  x0 ⎦ ≥ 0,  l = 1, · · · , k,
     Σ_{j=1}^m aj xj^l = b^l,  l = 1, · · · , k,
     Σ_{i=1}^k xi = 1.    (3)
Complexity: for k = 6, m = 100, no. of variables = km + m + 1 = 701 ≈ 2.7 × 256, yet the problem can be solved in a few tens of seconds! Reason: (1) is non-convex while (2) is convex.
Example: Diet Problem
• xi is the quantity of food i
• Each unit of food i has a cost of ci
• One unit of food j contains an amount aij of nutrient i.
• We want nutrient i to be at least equal to bi.
• Problem: find the cheapest diet such that the minimum nutrient requirements are fulfilled.
This problem can be formulated as
min_{x ⪰ 0} cTx
s.t. Ax ⪰ b.
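A toy instance of this LP (made-up costs and nutrient contents, purely for illustration) can be solved with scipy; `linprog` uses ≤ constraints, so Ax ⪰ b is passed as −Ax ≤ −b:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: 3 foods, 2 nutrients.
c = np.array([2.0, 3.0, 4.0])           # cost per unit of each food
A = np.array([[1.0, 2.0, 2.0],          # a_ij: amount of nutrient i per unit of food j
              [3.0, 1.0, 2.0]])
b = np.array([6.0, 8.0])                # minimum requirement b_i for each nutrient

# minimize c^T x  s.t.  Ax >= b, x >= 0
res = linprog(c, A_ub=-A, b_ub=-b, bounds=[(0, None)] * 3)
print(res.x, res.fun)  # cheapest diet and its cost
```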
Example
m lamps illuminating n (small, flat) patches; lamp j has power pj, patch k receives illumination Ik; rkj and θkj denote the distance and angle between lamp j and patch k (figure omitted)
intensity Ik at patch k depends linearly on lamp powers pj:
Ik = Σ_{j=1}^m akj pj,   akj = rkj^−2 max{cos θkj, 0}
problem: achieve desired illumination Ides with bounded lamp powers
minimize maxk=1,...,n |log Ik − log Ides|
subject to 0 ≤ pj ≤ pmax, j = 1, . . . , m
how to solve?
1. use uniform power: pj = p, vary p
2. use least-squares:
minimize Σ_{k=1}^n (Ik − Ides)²
round pj if pj > pmax or pj < 0
3. use weighted least-squares:
minimize Σ_{k=1}^n (Ik − Ides)² + Σ_{j=1}^m wj (pj − pmax/2)²
iteratively adjust weights wj until 0 ≤ pj ≤ pmax
4. use linear programming:
minimize maxk=1,...,n |Ik − Ides|
subject to 0 ≤ pj ≤ pmax, j = 1, . . . , m
which can be solved via linear programming
of course these are approximate (suboptimal) ‘solutions’
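Heuristics 1 and 2 can be sketched on synthetic coefficients (random akj; all numbers hypothetical):

```python
import numpy as np

# Heuristics 1 and 2 for the illumination problem, on synthetic data.
rng = np.random.default_rng(2)
n, m = 10, 5                        # patches, lamps
A = rng.uniform(0.1, 1.0, (n, m))   # a_kj coefficients, so I = A p
I_des, p_max = 1.0, 1.0

# 1. uniform power: sweep a scalar p and keep the best value.
grid = np.linspace(0.0, p_max, 101)[1:]   # skip p = 0 (log of 0)
errs = [np.max(np.abs(np.log(A @ (p * np.ones(m))) - np.log(I_des)))
        for p in grid]
p_uniform = grid[int(np.argmin(errs))]

# 2. least-squares fit, then clip the powers into [0, p_max].
p_ls, *_ = np.linalg.lstsq(A, np.full(n, I_des), rcond=None)
p_ls = np.clip(p_ls, 0.0, p_max)

print(p_uniform, p_ls)
```

Heuristic 3 would wrap the least-squares step in a loop that re-tunes the weights wj until the bounds hold without clipping.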
Figure 1: Original cost function and its approximations (least-squares and linear; plot omitted)
5. use convex optimization: problem is equivalent to
minimize f0(p) = maxk=1,...,n h(Ik/Ides)
subject to 0 ≤ pj ≤ pmax, j = 1, . . . , m
with h(u) = max{u, 1/u}
(plot of h(u) over 0 ≤ u ≤ 4 omitted)
f0 is convex because maximum of convex functions is convex
exact solution obtained with effort ≈ modest factor × least-squares effort
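The convexity of h (and hence of f0) can be spot-checked numerically via the midpoint inequality h((u + v)/2) ≤ (h(u) + h(v))/2; this is a sanity check on a grid, not a proof:

```python
import numpy as np

# Midpoint-convexity check for h(u) = max(u, 1/u) on a positive grid.
h = lambda u: np.maximum(u, 1.0 / u)
u = np.linspace(0.1, 4.0, 80)
U, V = np.meshgrid(u, u)
violations = np.sum(h((U + V) / 2) > (h(U) + h(V)) / 2 + 1e-12)
print(violations)  # 0: no midpoint-convexity violations found
```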
additional constraints: does adding 1 or 2 below complicate the problem?
1. no more than half of total power is in any 10 lamps
2. no more than half of the lamps are on (pj > 0)
• answer: with (1), still easy to solve; with (2), extremely difficult
• moral: (untrained) intuition doesn't always work; without the proper background, very easy problems can appear quite similar to very difficult problems
Nonlinear optimization
traditional techniques for general nonconvex problems involve compromises
local optimization methods (nonlinear programming)
• find a point that minimizes f0 among feasible points near it
• fast, can handle large problems
• require initial guess
• provide no information about distance to (global) optimum
global optimization methods
• find the (global) solution
• worst-case complexity grows exponentially with problem size
these algorithms are often based on solving convex subproblems
Course Goals
• understand fundamental theory of convex optimization
• characterize and formulate linear, quadratic, geometric, second-order cone, and semidefinite programming problems
• use optimization software to solve structured convex problems numerically
• identify the hidden convexity of problems
• apply approximation methods to tackle non-convex problems.
Course Schedule
• 18 Oct, Lecture 1: Introduction
Lecture 2: Convex Sets and Convex Functions
• 25 Oct, Lecture 3: Convex Optimization and Lagrange Duality
• 2 Nov, Lecture 4: Linear and Quadratic Programming
Lecture 5: Disciplined Convex Programming and CVX
• 9 Nov, Lecture 6: Second-Order Cone Programming
• 15 Nov, Lecture 7: Semidefinite Programming
• 22 Nov, Lecture 8: Geometric Programming
• 29 Nov, Lecture 9: Interior Point Methods, Guest lecture by Florin
• 5 Dec, Lecture 10: More Application Examples and Summary
Reading assignments and homework
• Relevant tutorial papers will be posted before class, together with lecture slides
• Four sets of homework
Feedback
• Too fast, too slow
• Presentation, English ...
Reference
[1] Convex Optimization, S. Boyd and L. Vandenberghe, Cambridge University Press, 2004.
[2] Lectures on Modern Convex Optimization: Analysis, Algorithms and Engineering Applications, A. Ben-Tal and A. Nemirovski, Society for Industrial Mathematics, 2001.
[3] EE364a: Convex Optimization I, Stanford University, http://www.stanford.edu/class/ee364a/