
E2 201: Information Theory (2019)

Homework 4
Instructor: Himanshu Tyagi

Reading Assignment

• Read Chapter 5 of the Cover and Thomas book (first 10 sections) and Theorem 4.1 of the
Csiszár and Körner book. (I will update this list and add more resources)

Homework Questions

Questions marked ∗ are more difficult.

1. Consider a source X with pmf

P(0) = 1/2, P(1) = P(2) = 1/8, P(3) = P(4) = P(5) = P(6) = 1/16.

a) Find L(X) and L^p(X), and give codes which achieve each of these quantities.
b) Find L_{0.25}(X) and a code which achieves it.
You can assume that the empty sequence ∅ is a valid codeword.
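As a quick numeric sanity check (not part of the assignment), the sketch below computes H(X) for this pmf. Since the pmf is dyadic, the lengths l(x) = −log2 P(x) are integers, satisfy the Kraft inequality with equality, and give a prefix-free code of average length exactly H(X):

```python
from math import log2

# pmf from Question 1 (dyadic: every probability is a power of 1/2)
pmf = {0: 1/2, 1: 1/8, 2: 1/8, 3: 1/16, 4: 1/16, 5: 1/16, 6: 1/16}

def entropy(p):
    """Shannon entropy in bits."""
    return -sum(q * log2(q) for q in p.values() if q > 0)

# For a dyadic pmf, the lengths l(x) = -log2 P(x) are integers and
# satisfy the Kraft inequality with equality.
lengths = {x: round(-log2(q)) for x, q in pmf.items()}
kraft = sum(2.0 ** -l for l in lengths.values())
avg_len = sum(pmf[x] * lengths[x] for x in pmf)
print(entropy(pmf), avg_len, kraft)  # 2.25 2.25 1.0
```

Here H(X) = 2.25 bits lower-bounds the average length of any prefix-free code; the one-to-one-code quantity L(X), which allows the empty codeword, can be smaller.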
2. Exhibit an example where L^p(X) > H(X). Compute the exact value of L^p(X) for your example.

3. Consider the source X with pmf

P(1) = 1/3, P(2) = 1/6, P(3) = 1/6, P(4) = 1/3.

Find a Huffman code for this source.
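A minimal Huffman construction can be sketched in Python (illustrative, not the required hand derivation); it uses exact rationals to avoid floating-point tie-breaking artifacts and tracks only codeword lengths, which is enough to compute the average length:

```python
import heapq
from fractions import Fraction as F

def huffman_lengths(pmf):
    """Codeword lengths of a binary Huffman code for pmf (dict: symbol -> prob)."""
    heap = [(p, idx, [sym]) for idx, (sym, p) in enumerate(pmf.items())]
    heapq.heapify(heap)
    lengths = {sym: 0 for sym in pmf}
    counter = len(heap)
    while len(heap) > 1:
        p1, _, syms1 = heapq.heappop(heap)
        p2, _, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:
            lengths[s] += 1          # each merge adds one bit to all merged symbols
        heapq.heappush(heap, (p1 + p2, counter, syms1 + syms2))
        counter += 1
    return lengths

pmf = {1: F(1, 3), 2: F(1, 6), 3: F(1, 6), 4: F(1, 3)}
lengths = huffman_lengths(pmf)
avg = sum(pmf[s] * lengths[s] for s in pmf)
print(avg)  # 2 (the optimal average length; the tree shape may vary with tie-breaking)
```

Different tie-breaking rules can produce different trees, but every Huffman code for this pmf has the same average length of 2 bits.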

4. For the pmf P of the previous question, consider the sequence of symbols 3142 produced
by a DMS with a common distribution P. Determine the output of the (infinite precision)
arithmetic code applied to this sequence. Apply the corresponding decoding algorithm to the
compressed sequence and verify the FIFO property.
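The interval-narrowing step of the encoder can be sketched with exact rationals; the code below computes the interval that the infinite-precision arithmetic encoder assigns to 3142 (assuming the symbol order 1 < 2 < 3 < 4 for the cumulative distribution):

```python
from fractions import Fraction as F

# pmf from Question 3
pmf = {1: F(1, 3), 2: F(1, 6), 3: F(1, 6), 4: F(1, 3)}
symbols = sorted(pmf)

def narrow(seq):
    """Interval [low, low + width) assigned to seq by arithmetic coding."""
    low, width = F(0), F(1)
    for s in seq:
        cum = sum(pmf[t] for t in symbols if t < s)  # cumulative probability below s
        low += width * cum
        width *= pmf[s]
    return low, width

low, width = narrow([3, 1, 4, 2])
print(low, width)  # 44/81 1/324  (width = P(3)P(1)P(4)P(2))
```

The codeword is essentially the shortest binary expansion of a point inside this interval (about −log2(width) ≈ 8.3 bits here), and the decoder recovers the symbols by replaying the same narrowing, which is what gives the FIFO property.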

5. Given random variables X and Y taking values in finite sets X and Y, respectively, and with
a joint pmf PXY , consider the following extension of variable-length source codes:
The encoder observes X and Y and compresses X as Z = e(X, Y ) ∈ {0, 1}∗ . The decoder
observes Y and the output Z of the encoder and forms an estimate X̂ = d(Y, Z). For every
(x, y) ∈ X × Y, denote by l(x, y) the length of the bit string e(x, y). The average length of
this code is given by
L_e(X|Y) = Σ_{x,y} P_{XY}(x, y) · l(x, y).

A code is prefix-free if for every y ∈ Y the codeword set {e(x, y) : x ∈ X } is prefix-free.


Denote by L^p(X|Y) the minimum average length of a prefix-free code.
a) Show that

L^p(X|Y) ≤ L^p(X).

b) Is the following statement true or false? If you say true, prove it. If you say false, provide
a counter example.
L^p(X|Y) = L^p(X) if and only if X and Y are independent.

6. Let X_1, ..., X_n be i.i.d. with a common (unknown) pmf P over a finite alphabet X. Show that the maximum likelihood estimate of P upon observing X^n = x is the type P_x of x.
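The type P_x is just the empirical pmf of x; an illustrative helper is below. The claim then follows from the identity log P^n(x) = −n[H(P_x) + D(P_x‖P)], which is maximized over P at P = P_x:

```python
from collections import Counter
from fractions import Fraction as F

def type_of(x):
    """Empirical pmf (type) of a sequence x."""
    n = len(x)
    return {a: F(c, n) for a, c in Counter(x).items()}

print(type_of("aabac"))  # {'a': Fraction(3, 5), 'b': Fraction(1, 5), 'c': Fraction(1, 5)}
```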

7. Let X_1, ..., X_n be i.i.d. with a common (unknown) pmf P over a finite alphabet X. What is the conditional distribution of X^n given the event X^n ∈ T_Q, for a type Q ∈ T?

8. Let X_1, ..., X_n be i.i.d. with a common (unknown) pmf P over a finite alphabet X. Let Q denote the type of the random sequence X^n. Show that for every ε > 0

P( D(Q‖P) ≥ ε ) ≤ (n + 1)^{|X| − 1} · 2^{−nε}.

Conclude that for X_1, ..., X_n generated i.i.d. Ber(p),

P( |(1/n) Σ_{i=1}^{n} X_i − p| ≥ t ) ≤ (n + 1) · e^{−2nt²}.
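Not required for the proof, but the Bernoulli bound can be checked numerically against the exact binomial tail, computed by brute force:

```python
from math import comb, exp

def exact_tail(n, p, t):
    """Exact P(|(1/n) * sum X_i - p| >= t) for X_1, ..., X_n i.i.d. Ber(p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n + 1) if abs(k / n - p) >= t)

n, p, t = 200, 0.3, 0.15
bound = (n + 1) * exp(-2 * n * t * t)
print(exact_tail(n, p, t) <= bound)  # True
```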

9. Show that

(1/(n + 1)) · 2^{nh(k/n)} ≤ (n choose k) ≤ 2^{nh(k/n)},

where h denotes the binary entropy function.
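The two-sided bound can be verified by brute force for moderate n (an illustrative check, with h(k/n) in bits):

```python
from math import comb, log2

def h(q):
    """Binary entropy in bits."""
    return 0.0 if q in (0, 1) else -q * log2(q) - (1 - q) * log2(1 - q)

# check 2^{nh(k/n)}/(n+1) <= C(n,k) <= 2^{nh(k/n)} for all interior k
for n in (10, 50, 200):
    for k in range(1, n):
        c = comb(n, k)
        upper = 2.0 ** (n * h(k / n))
        assert upper / (n + 1) <= c <= upper
print("bounds hold for n = 10, 50, 200")
```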

10. For δ ∈ [0, 1/2], denote by V_{n,δ} the “volume of the n-dimensional Hamming sphere of radius nδ,” i.e., the number of sequences in {0, 1}^n with at most nδ ones. Determine the limit

lim_{n→∞} (1/n) · log V_{n,δ}.
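A numeric sketch suggests the answer: sandwiching V_{n,δ} between its largest binomial term and (n + 1) times that term pushes the rate toward the binary entropy h(δ):

```python
from math import comb, log2

def h(q):
    """Binary entropy in bits."""
    return 0.0 if q in (0, 1) else -q * log2(q) - (1 - q) * log2(1 - q)

def rate(n, delta):
    """(1/n) log2 V_{n,delta}: V = # binary strings of length n with at most n*delta ones."""
    V = sum(comb(n, i) for i in range(int(n * delta) + 1))
    return log2(V) / n  # math.log2 accepts arbitrarily large Python ints

for n in (100, 500, 2000):
    print(n, round(rate(n, 0.3), 4))  # approaches h(0.3) ≈ 0.8813
```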

11. Let X be a finite alphabet. For a sequence x ∈ X^n, the profile N_x of the sequence x is given by the vector N_x = (N_1, ..., N_n), where N_i denotes the number of distinct symbols that appear exactly i times in x. For instance, the profile of the sequence abcadabca is (1, 2, 0, 1, 0, ..., 0): one symbol (d) appears once, two symbols (b and c) appear twice, and one symbol (a) appears four times. Denote by S_N the set of sequences in X^n with a fixed profile N.
Show that the following fixed-length source code is universally rate optimal:
Given a fixed rate R > 0, let A denote the set of sequences x whose profile N_x = (N_1, ..., N_n) satisfies

R > Σ_{i=1}^{n} N_i · (i/n) · log(n/i).

The source code is given by simply mapping the sequences in A to binary sequences of a fixed length nR (check that this can be done). All the sequences outside A are mapped to the all-0 sequence of the same length. The decoder simply returns the sequence in A corresponding to the stored binary sequence.
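The profile is the "histogram of the histogram," which a two-level Counter computes directly (in abcadabca, the symbol d appears once, b and c twice, and a four times):

```python
from collections import Counter

def profile(x):
    """Profile of x: entry i-1 is the number of distinct symbols appearing exactly i times."""
    counts = Counter(Counter(x).values())
    return [counts.get(i, 0) for i in range(1, len(x) + 1)]

print(profile("abcadabca"))  # [1, 2, 0, 1, 0, 0, 0, 0, 0]
```

Note that the acceptance test R > Σ_i N_i (i/n) log(n/i) is exactly R > H(P_x): each of the N_i symbols appearing i times contributes (i/n) log(n/i) to the empirical entropy.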

12. For a pmf P on X, consider the probability assignment Q^{(n)} for X^n with

Q(x_{i+1} | x^i) = (N(x_{i+1} | x^i) + 1) / (i + |X|),

where N(a | x^i) denotes the number of times a appears in x^i. Show that for every P ∈ P(X)

D(P^n ‖ Q^{(n)}) ≤ |X| · log(n + 1).
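The bound can be checked by brute force for a binary alphabet and small n (an illustrative sketch; D is computed in bits by summing over all 2^n sequences):

```python
from itertools import product
from math import log2

def q_laplace(seq, alphabet):
    """Sequential add-one assignment: Q(x^n) = prod_i (N(x_{i+1}|x^i) + 1) / (i + |X|)."""
    counts = dict.fromkeys(alphabet, 0)
    q = 1.0
    for i, s in enumerate(seq):
        q *= (counts[s] + 1) / (i + len(alphabet))
        counts[s] += 1
    return q

alphabet, p, n = (0, 1), 0.3, 8
D = 0.0
for seq in product(alphabet, repeat=n):
    pn = 1.0
    for s in seq:
        pn *= p if s == 1 else 1 - p          # P^n(seq) for i.i.d. Ber(p)
    D += pn * log2(pn / q_laplace(seq, alphabet))
print(D <= 2 * log2(n + 1))  # True: D(P^n || Q^(n)) <= |X| log(n+1)
```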

13. ∗ For a universal hash family F of mappings from X to {0, 1}^k, let F be distributed uniformly over F. Show that

E_F[ d(P_{F(X)}, unif({0, 1}^k)) ] ≤ (1/2) · √(2^{k − H_2(P)}),

where H_2(P) = − log Σ_x P(x)² and unif({0, 1}^k) denotes the uniform distribution over k bits.
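Not part of the required proof, but the right-hand side is easy to tabulate for a toy pmf; H_2 is the collision (Rényi order-2) entropy, and the bound becomes vacuous once k exceeds H_2(P) by more than about two bits:

```python
from math import log2, sqrt

def h2(pmf):
    """Collision (Renyi order-2) entropy in bits: H2(P) = -log2 sum_x P(x)^2."""
    return -log2(sum(p * p for p in pmf))

pmf = [1/2, 1/4, 1/8, 1/8]   # toy pmf, chosen only for illustration
H2 = h2(pmf)
for k in (1, 2, 3):
    print(k, 0.5 * sqrt(2.0 ** (k - H2)))  # claimed bound on E_F d(P_F(X), unif)
```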
