ITC Question Bank-B Section


QUESTION BANK

Information Theory and Coding (22EC45)

1. Define i) Information content or self-information ii) Entropy iii) Information rate iv) Source efficiency.
2. Derive an expression for average information content of symbols in long independent
sequences.
3. Find the relationship between Hartleys, nats and bits.
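The conversions follow from the change-of-base rule log_b(x) = ln(x)/ln(b), so 1 nat = log2(e) ≈ 1.443 bits and 1 Hartley = log2(10) ≈ 3.322 bits. A minimal Python check (the probability used is an arbitrary example):

```python
import math

# Self-information of an event with probability p, in three units:
#   bits -> log base 2, nats -> natural log, Hartleys -> log base 10
p = 0.5
bits = -math.log2(p)
nats = -math.log(p)
hartleys = -math.log10(p)

# Change of base: multiply by log2 of the source base to get bits
print(bits)                      # 1.0
print(nats * math.log2(math.e))  # 1.0  (1 nat ≈ 1.4427 bits)
print(hartleys * math.log2(10))  # 1.0  (1 Hartley ≈ 3.3219 bits)
```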
4. Explain the properties of Entropy (prove).
5. A code is composed of dots and dashes. Assume that a dash is 3 times as long as a dot and has one-third the probability of occurrence. Calculate i) the information in a dot and in a dash, ii) the entropy of the dot-dash code, and iii) the average rate of information if a dot lasts for 10 ms and this time is allowed between symbols.
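A worked sketch for this question (minimal Python, assuming the usual reading: p(dash) = p(dot)/3 with the two probabilities summing to 1, and a 10 ms pause following every symbol):

```python
import math

# p_dash = p_dot / 3 and p_dot + p_dash = 1  =>  p_dot = 3/4, p_dash = 1/4
p_dot, p_dash = 3/4, 1/4

i_dot  = -math.log2(p_dot)    # information in a dot  ≈ 0.415 bit
i_dash = -math.log2(p_dash)   # information in a dash = 2 bits
H = p_dot * i_dot + p_dash * i_dash   # entropy ≈ 0.811 bit/symbol

# Dot = 10 ms, dash = 3 x 10 ms, plus a 10 ms gap after each symbol
t_avg = p_dot * (0.010 + 0.010) + p_dash * (0.030 + 0.010)  # 25 ms
print(H / t_avg)              # average information rate ≈ 32.4 bits/s
```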
6. A card is drawn from a deck.
i) You are told it is a spade. How much information did you receive?
ii) How much information did you receive if you are told that the card is an ace?
iii) If you are told that the card drawn is the ace of spades, how much information did you receive?
iv) Is the information obtained in (iii) the sum of the information obtained in (i) and (ii)?
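Part (iv) can be checked numerically: suit and rank are independent, so the self-informations add. A minimal Python sketch:

```python
import math

i_spade = -math.log2(13/52)  # P(spade) = 13/52, gives 2 bits
i_ace   = -math.log2(4/52)   # P(ace) = 4/52, gives ≈ 3.700 bits
i_both  = -math.log2(1/52)   # P(ace of spades) = 1/52, gives ≈ 5.700 bits

print(math.isclose(i_both, i_spade + i_ace))  # True: informations add
```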
7. A pair of dice is tossed simultaneously. The outcome of the first die is recorded as x1 and that of the second die as x2. Two events are defined as follows:
A = {(x1, x2) such that x1 + x2 ≤ 7}
B = {(x1, x2) such that x1 > x2}
Which event conveys more information? Support your answer by numerical computation.
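Both probabilities can be found by enumerating the 36 equally likely outcomes; the less probable event carries more self-information. A minimal Python sketch:

```python
import math
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))  # 36 equally likely pairs
p_A = sum(1 for x1, x2 in outcomes if x1 + x2 <= 7) / 36  # 21/36
p_B = sum(1 for x1, x2 in outcomes if x1 > x2) / 36       # 15/36

print(-math.log2(p_A))  # ≈ 0.778 bit
print(-math.log2(p_B))  # ≈ 1.263 bits, so B conveys more information
```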
8. A black-and-white TV picture consists of 525 lines of picture information. Assume that each line consists of 525 picture elements (pixels) and that each element can have 256 brightness levels. Pictures are repeated at the rate of 30 frames/sec. Calculate the average rate of information conveyed by a TV set to a viewer.
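A sketch of the arithmetic (minimal Python, assuming the 256 levels are equally likely and pixels are independent, as the question implies):

```python
import math

pixels_per_frame = 525 * 525      # 525 lines x 525 pixels/line
bits_per_pixel = math.log2(256)   # 8 bits for equally likely levels
frames_per_sec = 30

R = pixels_per_frame * bits_per_pixel * frames_per_sec
print(R)  # 66,150,000 ≈ 66.15 Mbits/s
```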
9. A source emits one of four possible messages M1, M2, M3 and M4 with probabilities 7/16, 5/16, 1/8 and 1/8 respectively. Find the entropy of the source. List all the elements of the second extension of this source. Hence show that H(S²) = 2H(S).
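A numerical check of the last part (minimal Python sketch): the 16 symbols of the second extension have probabilities given by products, and independence makes H(S²) exactly twice H(S).

```python
import math
from itertools import product

P = [7/16, 5/16, 1/8, 1/8]
H = -sum(p * math.log2(p) for p in P)    # H(S) ≈ 1.796 bits/symbol

# 16 second-extension symbols; probabilities multiply for independent emissions
P2 = [p * q for p, q in product(P, repeat=2)]
H2 = -sum(p * math.log2(p) for p in P2)  # H(S²) ≈ 3.593 bits/symbol

print(H, H2, math.isclose(H2, 2 * H))    # True
```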
10. In a facsimile transmission of a picture there are about 2.25×10⁶ pixels per frame. For good reproduction, 12 brightness levels are necessary. Assume all these levels are equally likely to occur. Find the rate of information if one picture is to be transmitted every 3 minutes. What is the source efficiency of this facsimile transmitter?
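A sketch of the computation (minimal Python). The rate follows directly; for the efficiency, one common reading, assumed here and not stated in the question, is that each pixel is carried by 4 binary digits (the smallest whole number covering 12 levels), giving log2(12)/4:

```python
import math

pixels = 2.25e6
H = math.log2(12)        # ≈ 3.585 bits/pixel, levels equally likely
T = 3 * 60               # one picture every 3 minutes, in seconds

print(pixels * H / T)    # information rate ≈ 44,812 bits/s
print(H / 4)             # source efficiency ≈ 0.896 under the 4-bit assumption
```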
11. For the first-order Markov source shown in the figure, i) find the stationary distribution ii) find the entropy of each state and hence the entropy of the source.

12. For the Markov source shown below, find i) the stationary distribution ii) the source entropy iii) G1 and G2. Show that G1 > G2 > H(S).

13. For the Markov source shown in the figure, i) compute the state probabilities ii) find the entropy of each state iii) find the entropy of the source.

14. Define i) Block code ii) Non-singular code iii) Uniquely decodable code
iv) Instantaneous code and v) Optimal code.
15. Consider the codes listed below. Identify the instantaneous codes and construct their individual decision trees.

Source symbol   Code-K   Code-L   Code-M   Code-N
S1              0        0        0        00
S2              10       01       01       01
S3              110      001      011      10
S4              1110     0010     110      110
S5              1111     0011     111      111
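A code is instantaneous exactly when it satisfies the prefix condition: no codeword is a prefix of another. A minimal Python check of the four codes in the table:

```python
def is_instantaneous(codewords):
    """True iff no codeword is a prefix of another (prefix condition)."""
    return not any(a != b and b.startswith(a)
                   for a in codewords for b in codewords)

codes = {
    "Code-K": ["0", "10", "110", "1110", "1111"],
    "Code-L": ["0", "01", "001", "0010", "0011"],
    "Code-M": ["0", "01", "011", "110", "111"],
    "Code-N": ["00", "01", "10", "110", "111"],
}
for name, cw in codes.items():
    print(name, is_instantaneous(cw))  # K and N pass; L and M do not
```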

16. Apply Shannon’s encoding algorithm to the following set of messages and obtain
code efficiency and redundancy.

Message:      m1    m2    m3    m4    m5
Probability:  1/8   1/16  3/16  1/4   3/8

17. Construct a binary code for the following source using Shannon's binary encoding procedure. S = {s1, s2, s3, s4, s5} and P = {0.4, 0.25, 0.15, 0.12, 0.08}.
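One common formulation of Shannon's binary encoding procedure, sketched in Python (the helper name is illustrative): sort the symbols by decreasing probability, give each a length of ⌈log2(1/p)⌉ bits, and read the codeword off the binary expansion of the cumulative probability.

```python
import math

def shannon_code(probs):
    """Codeword = first ceil(log2(1/p)) bits of the binary expansion
    of the cumulative probability, symbols sorted by decreasing p."""
    items = sorted(probs.items(), key=lambda kv: -kv[1])
    code, cum = {}, 0.0
    for sym, p in items:
        length = math.ceil(-math.log2(p))
        word, frac = "", cum
        for _ in range(length):          # binary expansion of cum
            frac *= 2
            word += str(int(frac))
            frac -= int(frac)
        code[sym] = word
        cum += p
    return code

P = {"s1": 0.4, "s2": 0.25, "s3": 0.15, "s4": 0.12, "s5": 0.08}
print(shannon_code(P))  # {'s1': '00', 's2': '01', 's3': '101', ...}
```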
18. Given the messages x1, x2, x3, x4, x5 and x6 with respective probabilities 0.4, 0.2, 0.2, 0.1, 0.07 and 0.03, construct a binary code by applying the Shannon-Fano encoding procedure. Determine the code efficiency and redundancy.
19. Given the messages x1, x2, x3, x4, x5 and x6 with respective probabilities 0.4, 0.2, 0.2, 0.1, 0.07 and 0.03, construct a binary code by applying the Huffman encoding procedure. Determine the efficiency and redundancy of the code so formed.
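A minimal Huffman sketch in Python (the function name is illustrative; tie-breaking between equal probabilities can change individual codewords, but not the average length):

```python
import heapq
import math

def huffman(probs):
    """Binary Huffman code; returns {symbol: codeword}."""
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)                 # unique tie-breaker for merged nodes
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        count += 1
        heapq.heappush(heap, (p1 + p2, count, merged))
    return heap[0][2]

P = {"x1": 0.4, "x2": 0.2, "x3": 0.2, "x4": 0.1, "x5": 0.07, "x6": 0.03}
code = huffman(P)
L = sum(P[s] * len(w) for s, w in code.items())   # average length = 2.3
H = -sum(p * math.log2(p) for p in P.values())    # entropy ≈ 2.21
print(code)
print(H / L, 1 - H / L)   # efficiency ≈ 0.961, redundancy ≈ 0.039
```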
20. Explain the representation of a channel.
21. Mention the properties of Mutual Information.
22. A transmitter has an alphabet consisting of five letters {a1, a2, a3, a4, a5} and the receiver has an alphabet of four letters {b1, b2, b3, b4}. The joint probabilities of the system are given below. Compute the different entropies of this channel.

         | 0.25   0      0      0    |
         | 0.10   0.30   0      0    |
P(A,B) = | 0      0.05   0.10   0    |
         | 0      0      0.05   0.10 |
         | 0      0      0.05   0    |
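A sketch computing the required entropies from the joint matrix (minimal Python; rows are a1..a5 and columns b1..b4, in the order printed above):

```python
import math

P = [  # joint probabilities P(a_i, b_j)
    [0.25, 0.00, 0.00, 0.00],
    [0.10, 0.30, 0.00, 0.00],
    [0.00, 0.05, 0.10, 0.00],
    [0.00, 0.00, 0.05, 0.10],
    [0.00, 0.00, 0.05, 0.00],
]

def H(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

pA = [sum(row) for row in P]            # marginal P(a_i)
pB = [sum(col) for col in zip(*P)]      # marginal P(b_j)
HA, HB = H(pA), H(pB)
HAB = H([p for row in P for p in row])  # joint entropy H(A,B)

print(HA, HB, HAB)
print(HAB - HA)        # H(B|A)
print(HAB - HB)        # H(A|B)
print(HA + HB - HAB)   # mutual information I(A;B)
```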
