
MSML604 Homework 4

Peeyush Dyavarashetty
March 2025

1 Question 1
1.1 Question 1.a
The domain R^n is convex, since any line segment between x₁, x₂ ∈ R^n, i.e. θx₁ + (1 − θ)x₂ with θ ∈ [0, 1], stays in R^n. So,

• g_{1i}(x) = A^{(i)}x − b^{(i)} is convex because it is an affine function.

• g_{2i}(x) = ||A^{(i)}x − b^{(i)}||_p is convex for p ≥ 1, since a norm composed with an affine function is convex.

• f(x) = max_{i∈[m]} ||A^{(i)}x − b^{(i)}||_p = max_{i∈[m]} g_{2i}(x), where g_{2i}(x) is convex ∀i ∈ [m]; since the pointwise maximum (supremum) of convex functions is convex, f(x) is convex.

Therefore, f(x) is convex.
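The convexity argument above can be sanity-checked numerically by verifying Jensen's inequality along a segment. The matrices A^{(i)}, vectors b^{(i)}, and dimensions below are hypothetical, chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: m affine pieces A^(i) x - b^(i), each mapping R^n -> R^k.
m, n, k, p = 4, 3, 5, 2
A = rng.standard_normal((m, k, n))
b = rng.standard_normal((m, k))

def f(x):
    # f(x) = max_i || A^(i) x - b^(i) ||_p
    return max(np.linalg.norm(A[i] @ x - b[i], ord=p) for i in range(m))

# Jensen's inequality f(θx1 + (1-θ)x2) <= θ f(x1) + (1-θ) f(x2) along a segment.
x1, x2 = rng.standard_normal(n), rng.standard_normal(n)
for theta in np.linspace(0.0, 1.0, 11):
    lhs = f(theta * x1 + (1 - theta) * x2)
    rhs = theta * f(x1) + (1 - theta) * f(x2)
    assert lhs <= rhs + 1e-9
```

A numerical check on one segment does not prove convexity, but any violation would immediately disprove it.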

1.2 Question 1.b


We know that the sum of the eigenvalues of a square matrix is its trace. Therefore,

f(X) = Σ_{i=1}^n λ_i(X) = trace(X)    (1)

Given f : S^n → R, where S^n is the set of n × n symmetric matrices, to show that f is a convex function:

• We should prove that S^n is a convex set.

• ∀x, y ∈ dom f, f(θx + (1 − θ)y) ≤ θf(x) + (1 − θ)f(y) ∀θ ∈ [0, 1].

Checking whether S^n is a convex set: the line segment between any two matrices A, B ∈ S^n should also lie in S^n. Here, we know that A = Aᵀ and B = Bᵀ, and for θ ∈ [0, 1],

(θA + (1 − θ)B)ᵀ = θAᵀ + (1 − θ)Bᵀ = θA + (1 − θ)B

This proves that S^n is a convex set. Now, to prove f is a convex function, take A, B ∈ S^n and θ ∈ [0, 1]:

f(A) = trace(A)
f(B) = trace(B)
f(θA + (1 − θ)B) = trace(θA + (1 − θ)B)
                 = θ trace(A) + (1 − θ) trace(B)
                 = θf(A) + (1 − θ)f(B)

Here f(θx + (1 − θ)y) = θf(x) + (1 − θ)f(y) ∀θ ∈ [0, 1] holds with equality (trace is linear, hence affine), proving that f is a convex function.
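Because trace is linear, Jensen's inequality holds with equality; a brief numerical illustration on arbitrary random symmetric matrices:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Two arbitrary symmetric matrices A, B in S^n.
M1, M2 = rng.standard_normal((n, n)), rng.standard_normal((n, n))
A, B = (M1 + M1.T) / 2, (M2 + M2.T) / 2

f = np.trace  # f(X) = trace(X) = sum of eigenvalues of X

theta = 0.3
lhs = f(theta * A + (1 - theta) * B)
rhs = theta * f(A) + (1 - theta) * f(B)
assert np.isclose(lhs, rhs)  # equality: f is affine, hence both convex and concave
assert np.isclose(f(A), np.linalg.eigvalsh(A).sum())  # trace = sum of eigenvalues
```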

1.3 Question 1.c


Here,

f(X) = λ_min(X) = −λ_max(−X)    (2)

Since X is symmetric, −X is symmetric. We know that λ_max(M) is a convex function for M ∈ S^n, and the negation of a convex function is concave. Therefore, f is a concave function.
Reason for λ_max(M) being a convex function:

λ_max(M) = sup_{||v||=1} vᵀMv

Here, for each fixed v with ||v|| = 1, vᵀMv is linear in M. λ_max(M) is therefore the pointwise supremum of a family of linear (hence convex) functions of M, which is convex. (Note that λ_max itself is not linear; only each vᵀMv is.)
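Both the identity λ_min(X) = −λ_max(−X) and the concavity of λ_min can be spot-checked numerically on random symmetric matrices (a sanity check, not a proof):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5

def sym(M):
    # Symmetrize an arbitrary square matrix.
    return (M + M.T) / 2

lam_min = lambda X: np.linalg.eigvalsh(X)[0]   # eigvalsh returns ascending eigenvalues
lam_max = lambda X: np.linalg.eigvalsh(X)[-1]

A = sym(rng.standard_normal((n, n)))
B = sym(rng.standard_normal((n, n)))

# Identity used in equation (2).
assert np.isclose(lam_min(A), -lam_max(-A))

# Concavity: lam_min(θA + (1-θ)B) >= θ lam_min(A) + (1-θ) lam_min(B).
for theta in np.linspace(0.0, 1.0, 11):
    lhs = lam_min(theta * A + (1 - theta) * B)
    rhs = theta * lam_min(A) + (1 - theta) * lam_min(B)
    assert lhs >= rhs - 1e-9
```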

2 Question 2
2.1 Question 2.a
Given total budget B = 1, i.e. Σ_{i=1}^n I_i = 1, we need to

max E[R_T]
subject to Var(R_T) ≤ σ_T²
           Σ_{i=1}^n I_i = 1

Since it is a maximization problem, we can convert it to

min −E[R_T]
subject to Var(R_T) ≤ σ_T²
           Σ_{i=1}^n I_i = 1

The problem is that we do not yet know whether this minimization objective or its constraints are convex. Now, we know R_T = Σ_{i=1}^n R_i · I_i, so

E[R_T] = E[Σ_{i=1}^n R_i · I_i] = Σ_{i=1}^n E[R_i] I_i = Σ_{i=1}^n r_i I_i

Here, E[R_T] is a convex function of I since it is affine.


Var(R_T) = Var(Σ_{i=1}^n R_i · I_i)
         = E[(Σ_{i=1}^n R_i · I_i − E[Σ_{i=1}^n R_i · I_i])²]
         = E[(Σ_{i=1}^n (R_i − r_i) · I_i)²]
         = E[Σ_{i=1}^n (R_i − r_i) I_i · Σ_{j=1}^n (R_j − r_j) I_j]
         = E[Σ_{i=1}^n Σ_{j=1}^n (R_i − r_i) I_i (R_j − r_j) I_j]
         = Σ_{i=1}^n Σ_{j=1}^n E[(R_i − r_i)(R_j − r_j)] I_i I_j
         = Σ_{i=1}^n Σ_{j=1}^n Σ_{ij} I_i I_j = Iᵀ Σ I

where Σ is the covariance matrix and I = [I₁, I₂, …, I_n]ᵀ.
Since the Hessian ∇² Var(R_T) = 2Σ, and the covariance matrix Σ is symmetric and positive semidefinite, Var(R_T) is convex; hence Var(R_T) − σ_T² is convex.
Since investments cannot be negative, I_i ≥ 0 ∀i ∈ [n]. Therefore, the convex optimization problem, where I = [I₁ I₂ … I_n]ᵀ and Σ is the covariance matrix, is

min_{I₁,I₂,…,I_n}  −Σ_{i=1}^n r_i I_i
subject to  Iᵀ Σ I − σ_T² ≤ 0
            Σ_{i=1}^n I_i − 1 = 0
            −I_i ≤ 0 ∀i ∈ [n]
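The identities E[R_T] = rᵀI and Var(R_T) = IᵀΣI derived above can be checked by Monte Carlo simulation on a small hypothetical returns model (the covariance, means, and allocation below are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
n, samples = 3, 200_000

# Hypothetical model: returns R ~ N(r, Sigma) with a known covariance.
L = rng.standard_normal((n, n))
Sigma = L @ L.T                       # symmetric positive semidefinite
r = rng.standard_normal(n)            # mean returns r_i = E[R_i]
R = rng.multivariate_normal(r, Sigma, size=samples)

I = np.array([0.5, 0.3, 0.2])         # a feasible allocation: I_i >= 0, sum = 1
RT = R @ I                            # total return R_T for each sample

# Monte Carlo estimates should match the closed forms.
assert abs(RT.mean() - r @ I) < 0.05          # E[R_T] = r^T I
assert abs(RT.var() - I @ Sigma @ I) < 0.2    # Var(R_T) = I^T Sigma I
```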

2.2 Question 2.b

min Var(R_T)
subject to E[R_T] ≥ R_min
           Σ_{i=1}^n I_i − 1 = 0
           −I_i ≤ 0

Now, we know that Var(R_T) = Iᵀ Σ I and E[R_T] = Σ_{i=1}^n r_i I_i, where the objective is convex (Σ is positive semidefinite) and every constraint function is affine, hence convex. Substituting, we obtain the convex optimization problem

min_I  Iᵀ Σ I
subject to  R_min − Σ_{i=1}^n r_i I_i ≤ 0
            Σ_{i=1}^n I_i − 1 = 0
            −I_i ≤ 0 ∀i ∈ [n]
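This formulation can be handed directly to a generic solver; a minimal sketch using scipy.optimize.minimize on a small made-up instance (the Σ, r, and R_min values below are hypothetical):

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical instance of problem 2.b.
Sigma = np.array([[0.10, 0.02, 0.00],
                  [0.02, 0.08, 0.01],
                  [0.00, 0.01, 0.20]])
r = np.array([0.05, 0.07, 0.12])
R_min = 0.08

res = minimize(
    fun=lambda I: I @ Sigma @ I,                           # objective I^T Sigma I
    x0=np.full(3, 1 / 3),
    constraints=[
        {"type": "ineq", "fun": lambda I: r @ I - R_min},  # R_min - r^T I <= 0
        {"type": "eq",   "fun": lambda I: I.sum() - 1},    # sum_i I_i - 1 = 0
    ],
    bounds=[(0, None)] * 3,                                # -I_i <= 0
    method="SLSQP",
)
I_opt = res.x
assert res.success
assert abs(I_opt.sum() - 1) < 1e-6 and (I_opt >= -1e-9).all()
assert r @ I_opt >= R_min - 1e-6
```

Since the objective is a convex quadratic and the constraints are affine, any local minimum the solver finds is the global minimum.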

3 Question 3
Given samples s₁, s₂, …, s_m ∈ R^n, we need to find non-negative weights w = (w₁, w₂, …, w_m) ∈ R₊^m such that the weights sum to 1. Therefore, the conditions are

w_i ≥ 0 ∀i ∈ [m]  ⟹  −w_i ≤ 0 ∀i ∈ [m]    (3)
Σ_{i=1}^m w_i = 1    (4)

Here, if any w_i > 1, then equation (4) cannot be satisfied, since the remaining weights are non-negative. Therefore,

w_i ≤ 1 ∀i ∈ [m]    (5)

Therefore, the convex optimization problem, where s̄(w) = Σ_{i=1}^m w_i · s_i, is

min_{w₁,w₂,…,w_m}  Σ_{i=1}^m ||s_i − s̄(w)||₂
subject to  −w_i ≤ 0 ∀i ∈ [m]
            w_i − 1 ≤ 0 ∀i ∈ [m]
            Σ_{i=1}^m w_i − 1 = 0

To prove this optimization problem is convex:

• The objective can be rewritten, using Σ_{j=1}^m w_j = 1, as

Σ_{i=1}^m ||s_i − Σ_{j=1}^m w_j · s_j||₂ = Σ_{i=1}^m ||Σ_{j=1}^m w_j · s_i − Σ_{j=1}^m w_j · s_j||₂ = Σ_{i=1}^m ||Σ_{j=1}^m w_j · (s_i − s_j)||₂

– Here, for each pair (i, j), s_i − s_j is a given constant vector; it does not depend on the weights.
– Therefore, for each i, the map w ↦ Σ_{j=1}^m w_j · (s_i − s_j) is linear, hence affine, in w.
– The norm of an affine function is convex. Therefore, for each i, f_i(w₁, w₂, …, w_m) = ||Σ_{j=1}^m w_j · (s_i − s_j)||₂ is a convex function.
– The sum over i, Σ_{i=1}^m f_i(w₁, w₂, …, w_m), is convex, being a non-negative weighted sum of convex functions.

• The inequality constraint functions are convex as they are affine.

• The equality constraint function is affine: 1ᵀw − 1, where 1 = [1 1 … 1]ᵀ.
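The convexity of the objective can also be spot-checked numerically via Jensen's inequality at random points of the simplex (the samples s_i below are arbitrary illustrative data):

```python
import numpy as np

rng = np.random.default_rng(4)
m, n = 6, 3
s = rng.standard_normal((m, n))       # hypothetical samples s_1, ..., s_m in R^n

def g(w):
    # Objective: sum_i || s_i - s_bar(w) ||_2 with s_bar(w) = sum_j w_j s_j.
    s_bar = w @ s
    return np.linalg.norm(s - s_bar, axis=1).sum()

def random_simplex_point():
    # Feasible weights: w_i >= 0, sum_i w_i = 1.
    w = rng.random(m)
    return w / w.sum()

w1, w2 = random_simplex_point(), random_simplex_point()
for theta in np.linspace(0.0, 1.0, 11):
    lhs = g(theta * w1 + (1 - theta) * w2)
    rhs = theta * g(w1) + (1 - theta) * g(w2)
    assert lhs <= rhs + 1e-9
```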
