Digital Communication
Chapter 3
Information Theory and Coding
Information Source
•Information refers to any new knowledge about something.
•The role of a communication-system designer is to make sure that the information is
transmitted to the receiver correctly.
•An information source conducts random experiments that produce events.
•The outcome of each experiment is drawn from a sample space (the set of possible
outcomes) of the experiment.
•The sample space is also called the source alphabet.
•Each element of the source alphabet is called a symbol, and each symbol has some
probability of occurrence.
Discrete Memoryless Source (DMS)
• A source is discrete if the number of symbols present in the alphabet is finite.
• A source is memoryless if the occurrence of one event is independent of the
occurrence of previous events.
Ex 1: Binary symmetric source:
• It is described by the alphabet S = {s₀, s₁} = {0, 1}. In this source, there are two symbols,
s₀ = 0 and s₁ = 1.
• Corresponding probabilities: P(s₀) = p₀ and P(s₁) = p₁ = 1 − p₀
• For a symmetric source, p₀ = p₁ = 1/2.
Measure of Information
•Information is a measure of the uncertainty of an outcome.
•The information content of symbol sₖ with probability pₖ is I(sₖ) = log₂(1/pₖ) bits.
Ex: A DMS emits four symbols s₀, s₁, s₂, s₃ with probabilities 1/2, 1/4, 1/8, 1/8.
Find the information content of each symbol.
Ans:
• p₀ = 1/2, so I(s₀) = log₂(2) = 1 bit
• p₁ = 1/4, so I(s₁) = log₂(4) = 2 bits
• p₂ = 1/8, so I(s₂) = log₂(8) = 3 bits
• p₃ = 1/8, so I(s₃) = log₂(8) = 3 bits
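As a quick check on these numbers, here is a minimal Python sketch that computes I(sₖ) = log₂(1/pₖ) for the four probabilities assumed in the example above:

```python
from math import log2

# Symbol probabilities from the example: 1/2, 1/4, 1/8, 1/8
probs = [1/2, 1/4, 1/8, 1/8]

for k, p in enumerate(probs):
    # Self-information: I(s_k) = log2(1/p_k) bits
    print(f"I(s{k}) = {log2(1/p):.0f} bits")
# Prints 1, 2, 3, 3 bits
```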
Average Information (Entropy)
•When an information source is transmitting a long sequence of symbols, the average
information transmitted by the source is more interesting than the information
contained in individual symbols.
H(S) = Σₖ pₖ log₂(1/pₖ) bits/symbol
Ex: Find the entropy of the DMS of the previous problem.
Ans: H(S) = (1/2)(1) + (1/4)(2) + (1/8)(3) + (1/8)(3)
= 1.75 bits/symbol
•The average information contained in one symbol transmitted by the source is 1.75 bits.
•Range of H(S): 0 ≤ H(S) ≤ log₂(K), where K is the number of symbols in the alphabet.
For equiprobable symbols (pₖ = 1/K for all k), H(S) = log₂(K).
Information Rate (R)
•If the source S is emitting r symbols per second, then the information rate of the source is
given by
R = r · H(S) (Unit: bits/sec)
Ex: Find the information rate of the previous problem if the source emits
1000 symbols/sec.
Ans: R = r · H(S) = 1000 × 1.75 = 1750 bits/sec
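A short Python sketch tying the last three results together (entropy, its upper bound, and the information rate), using the same assumed source probabilities:

```python
from math import log2

probs = [1/2, 1/4, 1/8, 1/8]   # assumed DMS from the example
r = 1000                       # symbols per second

# Entropy: H(S) = sum_k p_k * log2(1/p_k)
H = sum(p * log2(1/p) for p in probs)
print(f"H(S) = {H} bits/symbol")      # 1.75

# Range check: 0 <= H(S) <= log2(K), here log2(4) = 2
K = len(probs)
assert 0 <= H <= log2(K)

# Information rate: R = r * H(S)
print(f"R = {r * H} bits/sec")        # 1750.0
```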
Discrete Memoryless Channel (DMC)
• In each signaling interval, the channel accepts an input symbol from the source
alphabet (X).
• In response, it generates a symbol from the destination alphabet (Y).
• If the channel output depends only on the present input of the channel, then the
channel is memoryless.
Channel Transition Probability
• Each input-output path is represented by its channel transition probability.
• The transition probability between the jth source symbol and the ith destination
symbol is the conditional probability p(yᵢ | xⱼ) = P(Y = yᵢ | X = xⱼ).
Conditional Entropy
•Let the symbol yᵢ be observed at the destination point.
•The symbol yᵢ may have been caused by the transmission of any symbol from the input
source alphabet.
•H(X|Y) = Σᵢ Σⱼ p(xⱼ, yᵢ) log₂(1/p(xⱼ|yᵢ)) is the uncertainty about the channel input
after the channel output is observed.
Mutual Information
•It is the difference between the uncertainty about the transmitted symbol before and
after observing the channel output:
I(X; Y) = H(X) − H(X|Y) bits/symbol
•H(X): Uncertainty about the channel input before the channel output is observed
•H(X|Y): Uncertainty about the channel input after the channel output is observed
•I(X; Y) is the amount of uncertainty about the transmitted symbol X that is resolved by
observing the received symbol Y.
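To make these quantities concrete, here is a small Python sketch that computes H(X), H(X|Y), and I(X; Y). The channel is an assumed binary symmetric channel with crossover probability 0.1 and equiprobable inputs; these numbers are illustrative, not taken from the slides:

```python
from math import log2

p_x = [0.5, 0.5]                  # assumed input distribution P(X)
eps = 0.1                         # assumed crossover probability
# Transition matrix: p_y_given_x[j][i] = p(y_i | x_j)
p_y_given_x = [[1 - eps, eps],
               [eps, 1 - eps]]

# Joint distribution p(x_j, y_i) = P(x_j) * p(y_i | x_j)
p_xy = [[p_x[j] * p_y_given_x[j][i] for i in range(2)] for j in range(2)]
# Output distribution p(y_i) = sum_j p(x_j, y_i)
p_y = [sum(p_xy[j][i] for j in range(2)) for i in range(2)]

H_X = sum(p * log2(1 / p) for p in p_x)
# Conditional entropy: H(X|Y) = sum_ij p(x_j, y_i) * log2(1 / p(x_j | y_i))
H_X_given_Y = sum(
    p_xy[j][i] * log2(p_y[i] / p_xy[j][i])
    for j in range(2) for i in range(2)
)
print(f"H(X)   = {H_X:.4f} bits")                 # 1.0000
print(f"H(X|Y) = {H_X_given_Y:.4f} bits")         # ~0.4690
print(f"I(X;Y) = {H_X - H_X_given_Y:.4f} bits")   # ~0.5310
```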
Channel Capacity
•It is defined as the maximum value of the mutual information, maximized over all
possible input distributions:
C = max I(X; Y) bits/symbol
•If the channel accepts r symbols per second, the capacity is r · C bits/sec.
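A minimal sketch of the maximization, again for the assumed BSC above: sweep the input distribution and keep the largest I(X; Y). For a BSC with crossover probability ε the known closed form is C = 1 − H_b(ε), which the sweep should reproduce:

```python
from math import log2

def mutual_information(p0, eps):
    """I(X;Y) for a BSC with crossover eps and input P(X=0) = p0."""
    p_x = [p0, 1 - p0]
    t = [[1 - eps, eps], [eps, 1 - eps]]
    p_xy = [[p_x[j] * t[j][i] for i in range(2)] for j in range(2)]
    p_y = [p_xy[0][i] + p_xy[1][i] for i in range(2)]
    return sum(
        p_xy[j][i] * log2(p_xy[j][i] / (p_x[j] * p_y[i]))
        for j in range(2) for i in range(2) if p_xy[j][i] > 0
    )

eps = 0.1
# Brute-force search over input distributions P(X=0) = 0.001 ... 0.999
C = max(mutual_information(p0 / 1000, eps) for p0 in range(1, 1000))
Hb = -(eps * log2(eps) + (1 - eps) * log2(1 - eps))   # binary entropy H_b(eps)
print(f"C       = {C:.4f} bits/symbol")   # ~0.5310
print(f"1 - H_b = {1 - Hb:.4f}")          # closed form, should match
```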
•Channel-coding theorem: if the information rate R ≤ C, there exists a coding scheme that
allows transmission with an arbitrarily small probability of error; if R > C, reliable
transmission is not possible.
•Channel coding is used at the receiver to detect and correct errors in the received
information.
•The receiver need not ask for any additional information from the transmitter.
•Code rate: r = k/n, where k is the number of message bits and n is the number of bits
in the codeword.
Linear Block Codes
• A codeword is formed as C = D · G (modulo-2 arithmetic), where D is the k-bit data word.
• G: Generator Matrix (k × n)
In calculations, use the modulo-2 adder:
1+1=0
1+0=1
0+1=1
0+0=0
Ex: Find all codewords for all possible data words. How many bits of error can the code detect?

Data word   Code word
111         111000
110         110110
101         101011
100         100101
011         011101
010         010010
001         001110
000         000000

Minimum Hamming distance: d_min = 2 (the smallest weight of any nonzero codeword)
So the number of detectable errors = d_min − 1 = 1
Hence, the code can detect a single-bit error.
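The sketch below reproduces this procedure in Python: modulo-2 encoding C = D · G, followed by the minimum-distance check. The slide's original generator matrix did not survive extraction, so G here is inferred from the table rows for data words 100, 010, and 001; treat it as an assumption rather than the original matrix.

```python
from itertools import product

# Assumed 3x6 generator matrix, read off the table rows for
# data words 100, 010, 001 (the slide's original G was not recoverable)
G = [[1, 0, 0, 1, 0, 1],
     [0, 1, 0, 0, 1, 0],
     [0, 0, 1, 1, 1, 0]]

def encode(d):
    # Codeword c = d . G using modulo-2 arithmetic
    return [sum(d[j] * G[j][i] for j in range(3)) % 2 for i in range(6)]

data_words = list(product([0, 1], repeat=3))
codewords = [encode(list(d)) for d in data_words]
for d, c in zip(data_words, codewords):
    print("".join(map(str, d)), "->", "".join(map(str, c)))

# For a linear code, d_min = minimum weight of a nonzero codeword
d_min = min(sum(c) for c in codewords if any(c))
print("d_min =", d_min)                  # 2
print("detectable errors =", d_min - 1)  # 1
```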
Cyclic Codes:
•It is a subclass of linear block codes.
• Data polynomial: d(x) = d₍ₖ₋₁₎x^(k−1) + ⋯ + d₁x + d₀
Ex: Find the data polynomial for the data word 1010.
Ans: Data: 1010. So d(x) = x³ + x
Code polynomial: c(x) = d(x) · g(x), where g(x) is the generator polynomial.
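To illustrate the encoding step, here is a sketch that multiplies the data polynomial by a generator polynomial over GF(2). The slide's generator polynomial did not survive extraction, so g(x) = x³ + x + 1 (a common (7,4) cyclic-code generator) is purely an assumption:

```python
def poly_mul_gf2(a, b):
    """Multiply two GF(2) polynomials given as coefficient lists,
    where index k holds the coefficient of x^k."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= ai & bj   # modulo-2 addition of partial products
    return out

d = [0, 1, 0, 1]   # d(x) = x^3 + x        (data word 1010)
g = [1, 1, 0, 1]   # assumed g(x) = x^3 + x + 1 (not from the slides)

c = poly_mul_gf2(d, g)
terms = [f"x^{k}" for k in range(len(c) - 1, -1, -1) if c[k]]
print("c(x) =", " + ".join(terms))   # c(x) = x^6 + x^3 + x^2 + x^1
```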