Lecture 01 - Intro
Course Information
! Instructors:
– dr. ir. Sundeep Chepuri
– Prof. dr. Geert Leus
! Class schedule:
– Tuesdays between 13:45-15:45 at Lecture hall H
– Fridays between 8:45-10:30 at DW-IZ 140
Course Information
! Books are freely available online:
– Stephen Boyd and Lieven Vandenberghe, "Convex Optimization", Cambridge University Press, 2004.
– Stephen Boyd et al., "Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers", Foundations and Trends in Machine Learning, 3(1):1-122, 2011.
– Slides/lecture notes for first-order methods.
! Assessment
– Open-book written exam
– Compulsory lab assignment worth 1 EC (20%); report and short presentation.
Mathematical optimization
minimize   f_0(x)
subject to f_i(x) ≤ b_i,  i = 1, …, m

• f_0 : R^n → R: objective function
• f_i : R^n → R, i = 1, …, m: constraint functions
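As a concrete illustration (not on the original slide), here is a minimal sketch of posing a small problem in this standard form with CVXPY, a Python modeling tool in the same spirit as the CVX/YALMIP packages used later in the course; the problem data below are made up.

import cvxpy as cp
import numpy as np

# Made-up data: a convex quadratic objective f0 and
# m affine constraint functions fi(x) = a_i^T x.
n, m = 3, 4
np.random.seed(0)
A = np.random.randn(m, n)  # rows a_i
b = np.ones(m)             # right-hand sides b_i

x = cp.Variable(n)
f0 = cp.sum_squares(x - np.array([1.0, 2.0, 3.0]))
prob = cp.Problem(cp.Minimize(f0), [A @ x <= b])
prob.solve()
print("optimal value:", prob.value, "optimal x:", x.value)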
Array processing

[Figure: beam pattern magnitude |y(θ)| versus angle, with the target direction θ_tar = 30° and the angles 10° and 50° marked, and the sidelobe level indicated.]

minimize   Σ_i |y(θ_i)|²
subject to y(θ_tar) = 1
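A minimal sketch of this sidelobe-minimization problem in CVXPY, assuming y(θ) = a(θ)^H w for an N-element half-wavelength uniform linear array with weight vector w, and a sidelobe grid that excludes the 10°-50° mainlobe region; the array setup is illustrative, not from the slides.

import numpy as np
import cvxpy as cp

N = 10
theta_tar = np.deg2rad(30)
# Sidelobe angle grid, excluding the mainlobe region (10 to 50 degrees).
theta_sl = np.deg2rad(np.r_[np.arange(-90, 10), np.arange(50, 91)])

def steering(theta):
    # Array response a(theta) for half-wavelength element spacing.
    return np.exp(1j * np.pi * np.arange(N) * np.sin(theta))

A_sl = np.array([steering(t) for t in theta_sl])  # sidelobe responses
a_tar = steering(theta_tar)

w = cp.Variable(N, complex=True)
# minimize sum_i |y(theta_i)|^2 subject to y(theta_tar) = 1
prob = cp.Problem(cp.Minimize(cp.sum_squares(A_sl @ w)),
                  [a_tar @ w == 1])
prob.solve()
print("residual sidelobe energy:", prob.value)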
Robust linear discrimination

(Euclidean) distance between the hyperplanes
H_1 = {z | a^T z + b = 1}
H_2 = {z | a^T z + b = −1}
is dist(H_1, H_2) = 2/∥a∥_2

to separate the two sets of points {x_1, …, x_N} and {y_1, …, y_M} with maximum margin:

minimize   (1/2)∥a∥_2^2
subject to a^T x_i + b ≥ 1,  i = 1, …, N     (1)
           a^T y_i + b ≤ −1, i = 1, …, M

a QP in a, b
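A minimal sketch of this maximum-margin QP in CVXPY, on made-up linearly separable 2-D point sets:

import numpy as np
import cvxpy as cp

rng = np.random.default_rng(1)
X = rng.normal(loc=[2.0, 2.0], scale=0.5, size=(20, 2))    # points x_i
Y = rng.normal(loc=[-2.0, -2.0], scale=0.5, size=(20, 2))  # points y_i

a = cp.Variable(2)
b = cp.Variable()
# Maximizing the margin 2/||a||_2 is equivalent to minimizing (1/2)||a||^2.
prob = cp.Problem(cp.Minimize(0.5 * cp.sum_squares(a)),
                  [X @ a + b >= 1, Y @ a + b <= -1])
prob.solve()
print("margin:", 2 / np.linalg.norm(a.value))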
Examples
portfolio optimization (see the sketch after this list)
• variables: amounts invested in different assets
• constraints: budget, max./min. investment per asset, minimum return
• objective: overall risk or return variance
data fitting
• variables: model parameters
• constraints: prior information, parameter limits
• objective: measure of misfit or prediction error
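As referenced above, a minimal sketch of the portfolio problem as a QP in CVXPY; the expected returns, covariance, investment cap, and return target are made-up illustrative values.

import numpy as np
import cvxpy as cp

# Made-up statistics for 4 assets.
mu = np.array([0.08, 0.10, 0.12, 0.05])    # expected returns
Sigma = np.diag([0.04, 0.09, 0.16, 0.01])  # return covariance

x = cp.Variable(4)                  # fraction invested per asset
constraints = [cp.sum(x) == 1,      # budget
               x >= 0, x <= 0.6,    # min./max. investment per asset
               mu @ x >= 0.09]      # minimum expected return
# objective: return variance (overall risk)
prob = cp.Problem(cp.Minimize(cp.quad_form(x, Sigma)), constraints)
prob.solve()
print("allocation:", x.value)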
Solving optimization problems
• least-squares problems
• linear programming problems
• convex optimization problems
Least-squares
minimize ∥Ax − b∥_2^2

Linear programming
minimize   c^T x
subject to a_i^T x ≤ b_i,  i = 1, …, m

Convex optimization problems
minimize   f_0(x)
subject to f_i(x) ≤ b_i,  i = 1, …, m
• objective and constraint functions are convex:
f_i(αx + βy) ≤ αf_i(x) + βf_i(y)  if α + β = 1, α ≥ 0, β ≥ 0
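A minimal sketch contrasting the first two classes on made-up data: least-squares solved in closed form with NumPy, and the LP solved with SciPy's linprog (bounds added so the LP is bounded).

import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

# Least-squares: minimize ||Ax - b||_2^2 (analytical solution).
A = rng.normal(size=(10, 3))
b = rng.normal(size=10)
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)

# Linear program: minimize c^T x s.t. a_i^T x <= b_i, -1 <= x <= 1.
c = np.array([1.0, 2.0, 3.0])
A_ub = rng.normal(size=(5, 3))
b_ub = np.ones(5)
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(-1, 1)] * 3)
print("LS solution:", x_ls)
print("LP solution:", res.x)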
Solving convex optimization problems
• no analytical solution
• reliable and efficient algorithms
• computation time (roughly) proportional to max{n³, n²m, F}, where F is the cost of evaluating the f_i's and their first and second derivatives
• almost a technology
Example: illumination problem

[Figure: lamps illuminating flat patches; the geometry enters through the distances r_kj and angles θ_kj, which determine the patch illuminations I_k. A second plot shows a penalty function h(u) for u ∈ [0, 4].]

• choose lamp powers to achieve a desired illumination on every patch, subject to bounds on the powers
• question: does adding (1) no more than half of the total power in any 10 lamps, or (2) no more than half of the lamps on, make the problem harder?
• answer: with (1), still easy to solve; with (2), extremely difficult
• moral: (untrained) intuition doesn't always work; without the proper background, very easy problems can appear quite similar to very difficult problems
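A minimal sketch of the convex formulation of the illumination problem in CVXPY, following the treatment in Boyd and Vandenberghe: illumination is affine in the lamp powers, I = Ap, and the deviation of each I_k from the desired level I_des is penalized by h(u) = max{u, 1/u}. The geometry matrix A below is random placeholder data.

import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n_patches, n_lamps = 20, 10
A = rng.uniform(0.1, 1.0, size=(n_patches, n_lamps))  # placeholder geometry
I_des = 1.0

p = cp.Variable(n_lamps)
I = A @ p  # patch illuminations, affine in the lamp powers p
# Worst-case penalty h(I_k / I_des) with h(u) = max{u, 1/u}.
penalty = cp.max(cp.maximum(I / I_des, I_des * cp.inv_pos(I)))
prob = cp.Problem(cp.Minimize(penalty), [p >= 0, p <= 1])
prob.solve()
print("worst-case penalty:", prob.value)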
Nonlinear optimization
Goals
1. recognize and formulate problems (such as the illumination problem, classification, etc.) as convex optimization problems;
2. use optimization tools (CVX, YALMIP, etc.) as part of the lab assignment;
3. characterize the optimal solution (optimal power distribution), give limits of performance, etc.
Topics
1. Background and optimization basics;
2. Convex sets and functions;
3. Canonical convex optimization problems (LP, QP, SDP);
4. Second-order methods (unconstrained and constrained optimization);
5. First-order methods (gradient, subgradient, conjugate gradient);
6. Alternating Direction Method of Multipliers.
The case of a convex cost function

If the cost function f_0 is convex, any local minimizer x⋆ is also global:
f_0(x⋆) ≤ f_0(x), ∀x ∈ R^n.
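The standard argument behind this claim, sketched in LaTeX: if a local minimizer were not global, points on the segment toward a better point would contradict local optimality.

\begin{proof}[Sketch]
Suppose $x^\star$ is locally optimal but $f_0(y) < f_0(x^\star)$ for some $y$.
For $\theta \in (0,1]$, convexity of $f_0$ gives
\[
  f_0\bigl(\theta y + (1-\theta)x^\star\bigr)
  \le \theta f_0(y) + (1-\theta) f_0(x^\star) < f_0(x^\star).
\]
As $\theta \to 0$, this puts points with strictly smaller objective value in
every neighborhood of $x^\star$, contradicting local optimality. Hence
$f_0(x^\star) \le f_0(x)$ for all $x \in \mathbf{R}^n$.
\end{proof}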