Lecture 01 - Intro

This document provides information about the course ET4350 Applied Convex Optimization. It lists the instructors, teaching assistant, class schedule, required textbooks, and assessment details. The class will cover topics including mathematical optimization problems, least squares problems, linear programming, and convex optimization problems. Standard solution methods will be discussed for each problem class. Examples of applications that can be formulated as optimization problems are also provided.


ET4350

Applied Convex Optimization
Course Information

• Instructors:
  – dr. ir. Sundeep Chepuri
  – Prof. dr. Geert Leus

• Teaching assistant (exercise sessions):
  – Seyran Khademi

• Class schedule:
  – Tuesdays, 13:45-15:45, Lecture hall H
  – Fridays, 8:45-10:30, DW-IZ 140
Course Information
• Books (freely available online):
  – Stephen Boyd and Lieven Vandenberghe, "Convex Optimization",
    Cambridge University Press, 2004.
  – Stephen Boyd et al., "Distributed optimization and statistical
    learning via the alternating direction method of multipliers",
    Foundations and Trends in Machine Learning 3.1 (2011): 1-122.
  – Slides/lecture notes for first-order methods.

• Assessment:
  – Open-book written exam
  – Compulsory lab assignment worth 1 EC (20%); report and short
    presentation

• Class schedule:
  – Tuesdays, 13:45-15:45, Lecture hall H
  – Fridays, 10:45-12:30, DW-IZ 140
Mathematical optimization

(mathematical) optimization problem

    minimize    f_0(x)
    subject to  f_i(x) ≤ b_i,  i = 1, . . . , m

• x = (x_1, . . . , x_n): optimization variables
• f_0 : R^n → R: objective function
• f_i : R^n → R, i = 1, . . . , m: constraint functions

optimal solution x^⋆ has the smallest value of f_0 among all vectors that
satisfy the constraints

Array processing
[Figure: beam pattern |y(θ)| versus θ, target direction θ_tar = 30°, with angle labels 10° and 50°; sidelobe level indicated]

Sidelobe level minimization
make |y(θ)| small for |θ − θ_tar| > α
(θ_tar: target direction; 2α: beamwidth)

via least-squares (discretize angles):

    minimize    Σ_i |y(θ_i)|^2
    subject to  y(θ_tar) = 1

(sum is over angles outside the beam)
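A minimal numerical sketch of this constrained least-squares design. The slide does not specify the array model, so everything below assumes a hypothetical n-element uniform linear array with half-wavelength spacing and beam pattern y(θ) = a(θ)^H w, where the weight vector w is the design variable; the equality-constrained least-squares problem is then solved in closed form:

```python
import numpy as np

# Assumptions (not from the slide): half-wavelength-spaced uniform linear
# array with n elements; y(theta) = a(theta)^H w with weights w.
n = 10
theta_tar = np.deg2rad(30.0)
alpha = np.deg2rad(10.0)                      # assumed half-beamwidth

def a(theta):
    # array response of the assumed uniform linear array
    return np.exp(1j * np.pi * np.arange(n) * np.sin(theta))

grid = np.deg2rad(np.linspace(-90.0, 90.0, 361))
A = np.conj([a(t) for t in grid[np.abs(grid - theta_tar) > alpha]])
c = np.conj(a(theta_tar))                     # row vector: c @ w = y(theta_tar)

# minimize ||A w||^2 subject to c @ w = 1 has the closed-form solution
# w = R^{-1} c^H / (c R^{-1} c^H), with R = A^H A (slightly regularized)
R = A.conj().T @ A + 1e-9 * np.eye(n)
z = np.linalg.solve(R, c.conj())
w = z / (c @ z)

print(abs(c @ w))                             # 1.0 at the target direction
print(np.abs(A @ w).max())                    # worst sidelobe amplitude
```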


Machine learning

Robust linear discrimination: to separate two sets of points {x_1, . . . , x_N} and {y_1, . . . , y_M} by maximum margin

(Euclidean) distance between the hyperplanes

    H_1 = {z | a^T z + b = 1}
    H_2 = {z | a^T z + b = −1}

is dist(H_1, H_2) = 2/∥a∥_2

maximizing the margin between the hyperplanes gives a QP in a, b:

    minimize    (1/2)∥a∥_2
    subject to  a^T x_i + b ≥ 1,   i = 1, . . . , N        (1)
                a^T y_i + b ≤ −1,  i = 1, . . . , M
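A minimal CVXPY sketch of QP (1) on made-up 2-D data (CVXPY is a Python counterpart of the CVX/YALMIP tools used in the lab); minimizing ∥a∥_2 or ∥a∥_2^2 yields the same separating hyperplane:

```python
import numpy as np
import cvxpy as cp

# made-up, linearly separable 2-D data for illustration
rng = np.random.default_rng(0)
X = rng.normal(loc=[+2.0, +2.0], scale=0.5, size=(50, 2))  # want a^T x_i + b >= 1
Y = rng.normal(loc=[-2.0, -2.0], scale=0.5, size=(40, 2))  # want a^T y_i + b <= -1

a = cp.Variable(2)
b = cp.Variable()
prob = cp.Problem(cp.Minimize(0.5 * cp.sum_squares(a)),   # squared norm: same a, b
                  [X @ a + b >= 1, Y @ a + b <= -1])
prob.solve()
print(a.value, b.value, 2.0 / np.linalg.norm(a.value))    # margin = 2/||a||_2
```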
Examples
portfolio optimization
• variables: amounts invested in different assets
• constraints: budget, max./min. investment per asset, minimum return
• objective: overall risk or return variance (a CVXPY sketch of this example follows the list)

device sizing in electronic circuits


• variables: device widths and lengths
• constraints: manufacturing limits, timing requirements, maximum area
• objective: power consumption

data fitting
• variables: model parameters
• constraints: prior information, parameter limits
• objective: measure of misfit or prediction error
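A minimal CVXPY sketch of the portfolio example above; the returns, covariance, and minimum-return level are all made up for illustration:

```python
import numpy as np
import cvxpy as cp

# made-up data: n assets with assumed mean returns mu and covariance Sigma
rng = np.random.default_rng(0)
n = 8
mu = rng.uniform(0.02, 0.12, size=n)
F = rng.normal(size=(n, n))
Sigma = F @ F.T / n                 # positive semidefinite by construction
r_min = 0.8 * mu.max()              # assumed minimum return (chosen feasible)

x = cp.Variable(n)                  # fractions invested per asset
prob = cp.Problem(cp.Minimize(cp.quad_form(x, Sigma)),   # return variance
                  [cp.sum(x) == 1,                        # budget
                   x >= 0,                                # no short positions
                   mu @ x >= r_min])                      # minimum return
prob.solve()
print(x.value)
```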
Solving optimization problems

general optimization problem

• very difficult to solve


• methods involve some compromise, e.g., very long computation time, or
not always finding the solution

exceptions: certain problem classes can be solved efficiently and reliably

• least-squares problems
• linear programming problems
• convex optimization problems

Least-squares

    minimize ∥Ax − b∥_2^2

solving least-squares problems

• analytical solution: x^⋆ = (A^T A)^{−1} A^T b
• reliable and efficient algorithms and software
• computation time proportional to n^2 k (A ∈ R^{k×n}); less if structured
• a mature technology
using least-squares

• least-squares problems are easy to recognize


• a few standard techniques increase flexibility (e.g., including weights,
adding regularization terms)
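A minimal numpy sketch on random data; np.linalg.lstsq is the standard routine and is preferred in practice to forming (A^T A)^{−1} A^T b explicitly, which can be numerically ill-conditioned:

```python
import numpy as np

# random instance of: minimize ||Ax - b||_2^2, with A in R^{k x n}
rng = np.random.default_rng(0)
k, n = 100, 10
A = rng.normal(size=(k, n))
b = rng.normal(size=k)

x_star, *_ = np.linalg.lstsq(A, b, rcond=None)   # numerically stable route
x_normal = np.linalg.solve(A.T @ A, A.T @ b)     # the analytical formula
print(np.allclose(x_star, x_normal))             # True on well-conditioned data
```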
Linear programming

    minimize    c^T x
    subject to  a_i^T x ≤ b_i,  i = 1, . . . , m

solving linear programs

• no analytical formula for the solution
• reliable and efficient algorithms and software
• computation time proportional to n^2 m if m ≥ n; less with structure
• a mature technology

using linear programming


• not as easy to recognize as least-squares problems
• a few standard tricks used to convert problems into linear programs
(e.g., problems involving ℓ1- or ℓ∞-norms, piecewise-linear functions)
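A minimal scipy sketch of the ℓ∞ trick mentioned above: minimize ∥Ax − b∥_∞ becomes the LP "minimize t subject to −t ≤ a_i^T x − b_i ≤ t" over the stacked variable z = [x, t] (the data are random, purely for illustration):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n = 30, 5
A = rng.normal(size=(m, n))
b = rng.normal(size=m)

# variables z = [x, t]; objective is t
c = np.r_[np.zeros(n), 1.0]
A_ub = np.block([[ A, -np.ones((m, 1))],    #  a_i^T x - t <= b_i
                 [-A, -np.ones((m, 1))]])   # -a_i^T x - t <= -b_i
b_ub = np.r_[b, -b]
res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None)] * n + [(0, None)])  # x free, t >= 0
x_star, t_star = res.x[:n], res.x[n]
print(t_star, np.abs(A @ x_star - b).max())  # both equal the optimal l_inf error
```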
Convex optimization problem

    minimize    f_0(x)
    subject to  f_i(x) ≤ b_i,  i = 1, . . . , m

• objective and constraint functions are convex:

    f_i(αx + βy) ≤ αf_i(x) + βf_i(y)   if α + β = 1, α ≥ 0, β ≥ 0

• includes least-squares problems and linear programs as special cases
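As a quick illustration, a numeric spot-check of the convexity inequality for the least-squares objective f(x) = ∥Ax − b∥_2^2, one of the special cases just mentioned (random data):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(8, 3))
b = rng.normal(size=8)
f = lambda x: np.sum((A @ x - b) ** 2)   # least-squares objective

x, y = rng.normal(size=3), rng.normal(size=3)
alpha = rng.uniform()
beta = 1.0 - alpha
# convexity: f(alpha*x + beta*y) <= alpha*f(x) + beta*f(y)
assert f(alpha * x + beta * y) <= alpha * f(x) + beta * f(y) + 1e-9
```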

solving convex optimization problems

• no analytical solution
• reliable and efficient algorithms
• computation time (roughly) proportional to max{n^3, n^2 m, F}, where F
  is the cost of evaluating the f_i's and their first and second derivatives
• almost a technology

using convex optimization

• often difficult to recognize


• many tricks for transforming problems into convex form
• surprisingly many problems can be solved via convex optimization

Example

m lamps illuminating n (small, flat) patches


[Figure: lamp j with power p_j at distance r_kj and angle θ_kj from patch k, which receives illumination I_k]

intensity I_k at patch k depends linearly on the lamp powers p_j:

    I_k = Σ_{j=1}^{m} a_kj p_j,   a_kj = r_kj^{−2} max{cos θ_kj, 0}

problem: achieve desired illumination I_des with bounded lamp powers

    minimize    max_{k=1,...,n} |log I_k − log I_des|
    subject to  0 ≤ p_j ≤ p_max,  j = 1, . . . , m
how to solve?
1. use uniform power: p_j = p, vary p

2. use least-squares:

       minimize Σ_{k=1}^{n} (I_k − I_des)^2

   round p_j if p_j > p_max or p_j < 0

3. use weighted least-squares:

       minimize Σ_{k=1}^{n} (I_k − I_des)^2 + Σ_{j=1}^{m} w_j (p_j − p_max/2)^2

   iteratively adjust the weights w_j until 0 ≤ p_j ≤ p_max

4. use linear programming:

       minimize    max_{k=1,...,n} |I_k − I_des|
       subject to  0 ≤ p_j ≤ p_max,  j = 1, . . . , m

   which can be solved via linear programming

of course these are approximate (suboptimal) ‘solutions’


5. use convex optimization: problem is equivalent to

       minimize    f_0(p) = max_{k=1,...,n} h(I_k/I_des)
       subject to  0 ≤ p_j ≤ p_max,  j = 1, . . . , m

   with h(u) = max{u, 1/u}

   [Figure: plot of h(u) = max{u, 1/u}, equal to 1 at u = 1 and increasing in both directions]

f_0 is convex because the maximum of convex functions is convex

exact solution obtained with effort ≈ modest factor × least-squares effort
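A minimal CVXPY sketch of formulation 5, with a made-up random gain matrix standing in for the a_kj; the DCP-compliant way to express 1/I_k is the convex atom inv_pos:

```python
import numpy as np
import cvxpy as cp

# made-up data: n patches, m lamps; random nonnegative gains stand in
# for a_kj = r_kj^{-2} max{cos(theta_kj), 0}
rng = np.random.default_rng(0)
n, m = 20, 10
A = rng.uniform(0.0, 1.0, size=(n, m))
I_des, p_max = 1.0, 1.0

p = cp.Variable(m)
I = A @ p
# h(I_k/I_des) = max{I_k/I_des, I_des/I_k}; inv_pos(I) is the convex 1/I_k
h = cp.maximum(I / I_des, I_des * cp.inv_pos(I))
prob = cp.Problem(cp.Minimize(cp.max(h)), [p >= 0, p <= p_max])
prob.solve()
print(prob.value, p.value)
```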


additional constraints: does adding 1 or 2 below complicate the problem?

1. no more than half of the total power is in any 10 lamps

2. no more than half of the lamps are on (p_j > 0)

• answer: with (1), still easy to solve; with (2), extremely difficult
• moral: (untrained) intuition doesn’t always work; without the proper
background very easy problems can appear quite similar to very difficult
problems

Nonlinear optimization

traditional techniques for general nonconvex problems involve compromises

local optimization methods (nonlinear programming)


• find a point that minimizes f0 among feasible points near it
• fast, can handle large problems
• require initial guess
• provide no information about distance to (global) optimum

global optimization methods


• find the (global) solution
• worst-case complexity grows exponentially with problem size

these algorithms are often based on solving convex subproblems


Course goals and topics

Goals
1. recognize and formulate problems (such as the illumination problem,
   classification, etc.) as convex optimization problems;
2. use optimization tools (CVX, YALMIP, etc.) as part of the lab
   assignment;
3. characterize the optimal solution (optimal power distribution), give
   limits of performance, etc.
Topics
1. Background and optimization basics;
2. Convex sets and functions;
3. Canonical convex optimization problems (LP, QP, SDP);
4. Second-order methods (unconstrained and constrained optimization);
5. First-order methods (gradient, subgradient, conjugate gradient);
6. Alternating Direction Method of Multipliers.
The case of a convex cost function

Local minima: x^⋆ is an unconstrained local minimum of f_0 : R^n → R if
it is no worse than its neighbors, i.e., for some ϵ > 0,

    f_0(x^⋆) ≤ f_0(x),  ∀x ∈ R^n with ∥x − x^⋆∥ < ϵ.

Global minima: x^⋆ is an unconstrained global minimum of f_0 : R^n → R if
it is no worse than all other vectors:

    f_0(x^⋆) ≤ f_0(x),  ∀x ∈ R^n.

When the function is convex, every local minimum is also a global minimum.
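Why (a standard argument, not spelled out on the slide): suppose f_0 is convex, x^⋆ is a local minimum, and f_0(y) < f_0(x^⋆) for some y. Then for every t ∈ (0, 1],

    f_0(x^⋆ + t(y − x^⋆)) ≤ (1 − t) f_0(x^⋆) + t f_0(y) < f_0(x^⋆),

so points arbitrarily close to x^⋆ (small t) would beat x^⋆, contradicting local optimality. Hence no such y exists and x^⋆ is global.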

