hw3 Pset
1. Ex.

2. Find the necessary and sufficient condition for equality. Hint: Find an appropriate $Q$ such that $\mathrm{RHS} - \mathrm{LHS} = D(P_X \,\|\, Q)$.
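A general identity behind hints of this shape (a sketch, not part of the problem statement; here $Q$ is any pmf whose support contains that of $P_X$):
\[
0 \le D(P_X \,\|\, Q) = \sum_k P_X(k) \log\frac{P_X(k)}{Q(k)} = -H(X) + \mathbb{E}\Bigl[\log\frac{1}{Q(X)}\Bigr],
\]
so $H(X) \le \mathbb{E}\bigl[\log\frac{1}{Q(X)}\bigr]$, with equality if and only if $P_X = Q$.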
3. (Finiteness of entropy) A consequence of Exercise 2 is the following: for any $\mathbb{N}$-valued random variable $X$, $\mathbb{E}[X] < \infty$ ensures that $H(X) < \infty$. Next let us improve this result.

(a) Show that $\mathbb{E}[\log X] < \infty \Longrightarrow H(X) < \infty$ (see the sketch after this problem). Moreover, show that the condition of $X$ being integer-valued is not superfluous by giving a counterexample.

(b) Show that if $k \mapsto P_X(k)$ is a decreasing sequence, then $H(X) < \infty \Longrightarrow \mathbb{E}[\log X] < \infty$. Moreover, show that the monotonicity of the pmf is not superfluous by giving a counterexample.
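For (a), one standard route is the same divergence trick (a sketch; the reference pmf $Q(k) = k^{-2}/\zeta(2)$ on $k \in \mathbb{N}$ is an illustrative choice, not prescribed by the problem):
\[
0 \le D(P_X \,\|\, Q) = -H(X) + 2\,\mathbb{E}[\log X] + \log \zeta(2),
\]
hence $H(X) \le 2\,\mathbb{E}[\log X] + \log\zeta(2) < \infty$ whenever $\mathbb{E}[\log X] < \infty$.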
4. (Information radius vs. diameter) Let $\{P_{Y|X=x} : x \in \mathcal{X}\}$ be a set of distributions. Prove that
\[
C \triangleq \sup_{P_X} I(X; Y) \le \sup_{x, x' \in \mathcal{X}} D\bigl(P_{Y|X=x} \,\|\, P_{Y|X=x'}\bigr).
\]
Comment: This is the information-theoretic version of "radius $\le$ diameter."
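One possible route (a sketch; it uses the variational characterization of mutual information, from the same circle of results as Theorem 4.5): for any fixed $x' \in \mathcal{X}$,
\[
I(X; Y) = \min_{Q_Y} \mathbb{E}_{x \sim P_X}\bigl[D(P_{Y|X=x} \,\|\, Q_Y)\bigr] \le \mathbb{E}_{x \sim P_X}\bigl[D(P_{Y|X=x} \,\|\, P_{Y|X=x'})\bigr] \le \sup_{x, x' \in \mathcal{X}} D(P_{Y|X=x} \,\|\, P_{Y|X=x'}),
\]
and taking the supremum over $P_X$ yields the claim.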
5. Let $S \sim \mathrm{Binom}(n, p)$. Prove that
\[
H(S) = \tfrac{1}{2}\log n + O(1),
\]
where the $O(1)$ term is uniform in $0 \le p \le 1$.

Bonus: Prove that
\[
H(S) = \tfrac{1}{2}\log\bigl(2\pi e\, n p(1-p)\bigr) + O\bigl(\tfrac{1}{n}\bigr),
\]
where the $O(\tfrac{1}{n})$ term is uniform in $0 \le p \le 1$.
Hint: Stirling's approximation:
\[
e^{\frac{1}{12n+1}} \le \frac{n!}{\sqrt{2\pi n}\,(n/e)^n} \le e^{\frac{1}{12n}}, \qquad n \ge 1.
\]
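A consequence of the hint that may help with both the main part and the bonus (a sketch; logs are taken natural here so that the constants below are correct): for $1 \le k \le n-1$, with $\lambda = k/n$ and $h(\lambda) = \lambda\log\frac{1}{\lambda} + (1-\lambda)\log\frac{1}{1-\lambda}$,
\[
\log\binom{n}{k} = n\,h(\lambda) - \tfrac{1}{2}\log\bigl(2\pi n\,\lambda(1-\lambda)\bigr) + \theta_{n,k}, \qquad |\theta_{n,k}| \le \tfrac{1}{6},
\]
where $\theta_{n,k}$ collects the $e^{1/(12n)}$-type correction factors from the hint.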
6. Consider the conditional distribution $P_{Y^m|X} : [0,1] \to \{0,1\}^m$, where given $x \in [0,1]$, $Y^m$ is i.i.d. $\mathrm{Bern}(x)$. Define the capacity
\[
C(m) \triangleq \max_{P_X} I(X; Y^m).
\]
Our goal is to show that
\[
C(m) = \tfrac{1}{2}\log m + O(1).
\]
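The following standard decomposition may help orient parts (a)-(d); it is a sketch, not part of the problem statement. Given $X = x$, the number of ones in $Y^m$ is $\mathrm{Binom}(m, x)$, so with $S$ as defined in part (a) below,
\[
I(X; S) = H(S) - H(S \mid X) = H(S) - \mathbb{E}_{x \sim P_X}\bigl[H(\mathrm{Binom}(m, x))\bigr].
\]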
(a) Let $S = \sum_{i=1}^{m} Y_i$. Prove that
\[
C(m) = \max_{P_X} I(X; S).
\]
(b) Show that
\[
C(m) = \min_{Q_S} \sup_{0 \le x \le 1} D\bigl(\mathrm{Binom}(m, x) \,\|\, Q_S\bigr),
\]
where $Q_S$ is a distribution on $\{0, \ldots, m\}$. Hint: Use the minimax representation (Theorem 4.5). Why is the condition of the theorem satisfied?
(c) Choose a uniform $Q_S$ and show that
\[
C(m) \le \tfrac{1}{2}\log m + O(1), \qquad m \to \infty.
\]
(d) Choose a uniform $P_X$ and show that
\[
C(m) \ge \tfrac{1}{2}\log m + O(1), \qquad m \to \infty.
\]
Hint: For the last two parts, Ex. 5 might be useful; see also the identities sketched below.
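Two identities that may be useful for (c) and (d) (sketches under the stated uniform choices; not part of the problem statement). If $Q_S$ is uniform on $\{0, \ldots, m\}$, then directly from the definition of divergence,
\[
D\bigl(\mathrm{Binom}(m, x) \,\|\, Q_S\bigr) = \log(m+1) - H\bigl(\mathrm{Binom}(m, x)\bigr).
\]
If instead $P_X$ is uniform on $[0,1]$, then $S$ is uniform on $\{0, \ldots, m\}$, because $\int_0^1 \binom{m}{k} x^k (1-x)^{m-k}\,dx = \frac{1}{m+1}$ for every $k$; hence $H(S) = \log(m+1)$ in the decomposition of $I(X; S)$ above.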
References
[1] T. Cover and J. Thomas, Elements of Information Theory, Second Edition, Wiley, 2006.