ITC Question Bank-B Section
12. For the Markov source shown below, find: i) the stationary distribution, ii) the source
entropy, iii) G1 and G2, and show that G1 > G2 > H(S).
13. For the Markov source shown in the figure: i) compute the state probabilities, ii) find the
entropy of each state, iii) find the entropy of the source.
14. Define i) Block code ii) Non-singular code iii) Uniquely decodable code
iv) Instantaneous code and v) Optimal code.
15. Consider the codes listed below. Identify the instantaneous codes and construct
their individual decision trees.
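For checking answers to this question, the prefix condition can be tested mechanically: a code is instantaneous exactly when no codeword is a prefix of another. A minimal Python sketch (the codewords shown are illustrative, not the ones from the question's table):

```python
def is_instantaneous(codewords):
    """Return True if the code is prefix-free (instantaneous):
    no codeword is a prefix of any other codeword."""
    for i, a in enumerate(codewords):
        for j, b in enumerate(codewords):
            if i != j and b.startswith(a):
                return False
    return True

# Illustrative examples (not the codes from the question):
print(is_instantaneous(["0", "10", "110", "111"]))  # prefix-free
print(is_instantaneous(["0", "01", "011"]))         # "0" is a prefix of "01"
```

A decision tree for a prefix-free code is then just the binary tree in which every codeword labels a leaf; the check above guarantees no codeword sits on the path to another.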
16. Apply Shannon’s encoding algorithm to the following set of messages and obtain the
code efficiency and redundancy.
m1 m2 m3 m4 m5
17. Construct binary code for the following source using Shannon’s binary encoding
procedure. S= {s1, s2, s3, s4, s5} and P= {0.4, 0.25, 0.15, 0.12, 0.08}.
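A minimal Python sketch of Shannon's binary encoding procedure, using the probabilities given in this question, can be used to verify hand-computed codewords, efficiency, and redundancy (the procedure here is the cumulative-probability variant: lengths l_i = ceil(-log2 p_i), codeword = first l_i bits of the binary expansion of the running cumulative probability):

```python
import math

def shannon_code(probs):
    """Shannon's binary encoding: sort probabilities in decreasing order,
    set l_i = ceil(-log2 p_i), and take the first l_i bits of the binary
    expansion of the cumulative probability F_i as the i-th codeword."""
    p = sorted(probs, reverse=True)
    codes, F = [], 0.0
    for pi in p:
        l = math.ceil(-math.log2(pi))
        bits, frac = "", F
        for _ in range(l):              # binary expansion of F_i to l bits
            frac *= 2
            bits += str(int(frac))
            frac -= int(frac)
        codes.append(bits)
        F += pi
    return list(zip(p, codes))

pairs = shannon_code([0.4, 0.25, 0.15, 0.12, 0.08])
H = -sum(p * math.log2(p) for p, _ in pairs)   # source entropy (bits/symbol)
L = sum(p * len(c) for p, c in pairs)          # average codeword length
for p, c in pairs:
    print(p, c)
print(f"efficiency = {H / L:.3f}, redundancy = {1 - H / L:.3f}")
```

For this source the average length works out to 2.55 bits/symbol against an entropy of about 2.10 bits/symbol, so the efficiency is roughly 82%.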
18. Given the messages x1, x2, x3, x4, x5 and x6 with respective probabilities 0.4, 0.2, 0.2,
0.1, 0.07 and 0.03, construct a binary code by applying the Shannon-Fano encoding
procedure. Determine the code efficiency and redundancy.
19. Given the messages x1, x2, x3, x4, x5 and x6 with respective probabilities 0.4, 0.2, 0.2,
0.1, 0.07 and 0.03, construct a binary code by applying the Huffman encoding
procedure. Determine the efficiency and redundancy of the code so formed.
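Huffman coding for the same six probabilities can be cross-checked with a small heap-based sketch. Tie-breaking among equal probabilities can change individual codeword lengths, but not the average length, so it is the average length (and hence the efficiency) that should match a hand solution:

```python
import heapq
import itertools

def huffman(probs):
    """Binary Huffman sketch: repeatedly merge the two least-probable
    nodes, prefixing '0'/'1' to the codewords inside each merged node.
    probs: {symbol: probability}. Returns {symbol: codeword}."""
    tie = itertools.count()  # unique tie-breaker so dicts are never compared
    heap = [(p, next(tie), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (p1 + p2, next(tie), merged))
    return heap[0][2]

probs = {"x1": 0.4, "x2": 0.2, "x3": 0.2, "x4": 0.1, "x5": 0.07, "x6": 0.03}
codes = huffman(probs)
avg_len = sum(probs[s] * len(c) for s, c in codes.items())
print(codes)
print(f"average length = {avg_len:.2f} bits/symbol")
```

The average length comes out to 2.3 bits/symbol, the same as Shannon-Fano on this source, but Huffman guarantees this is the minimum possible for any prefix code.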
20. Explain the representation of a channel.
21. Mention the properties of mutual information.
22. A transmitter has an alphabet consisting of five letters {a1, a2, a3, a4, a5} and
the receiver has an alphabet of four letters {b1, b2, b3, b4}. The joint
probabilities of the system are given below. Compute the different entropies of this
channel.
P(A, B) =
[ 0.25  0     0     0
  0.10  0.30  0     0
  0     0.05  0.10  0
  0     0     0.05  0.10
  0     0     0.05  0    ]
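The entropies asked for here all follow from the row and column sums of P(A, B). A short Python sketch of the computation (entropies in bits; rows index a1..a5, columns b1..b4):

```python
import math

# Joint probability matrix P(A, B) from the question.
P = [
    [0.25, 0.00, 0.00, 0.00],
    [0.10, 0.30, 0.00, 0.00],
    [0.00, 0.05, 0.10, 0.00],
    [0.00, 0.00, 0.05, 0.10],
    [0.00, 0.00, 0.05, 0.00],
]

def H(dist):
    """Entropy in bits of a probability distribution (0 log 0 := 0)."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

pA = [sum(row) for row in P]            # marginal of A (row sums)
pB = [sum(col) for col in zip(*P)]      # marginal of B (column sums)
joint = [p for row in P for p in row]

HA, HB, HAB = H(pA), H(pB), H(joint)
print(f"H(A)   = {HA:.4f} bits")
print(f"H(B)   = {HB:.4f} bits")
print(f"H(A,B) = {HAB:.4f} bits")
print(f"H(B|A) = {HAB - HA:.4f} bits")  # chain rule: H(A,B) = H(A) + H(B|A)
print(f"H(A|B) = {HAB - HB:.4f} bits")
print(f"I(A;B) = {HA + HB - HAB:.4f} bits")
```

Both marginals sum to 1 (a quick sanity check on the matrix), and the conditional entropies and mutual information are obtained from the three basic entropies via the chain rule rather than computed separately.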