Amount of Information: I = log(1/P)


INFORMATION THEORY

Information theory may be understood as a mathematical theory of communication, or a theory of information transmission.

A communication system can be described by the diagram known as Shannon's scheme. Information theory can be viewed as a branch of applied probability theory with extensive applications to communication systems; it can also be viewed as a branch of ergodic theory, the theory of invariant transformations and transformations related to them. It was initiated by communication scientists studying the statistical structure of electrical communication equipment and was principally founded by Claude E. Shannon. The chief concern of information theory is to discover mathematical laws governing systems designed to communicate and manipulate information.

Information theory has two primary goals. The first is the development of fundamental theoretical limits on the achievable performance when communicating a given information source over a given communication channel using coding schemes. The second is the development of coding schemes whose performance is reasonably close to the optimal performance given by the theory.

WHAT IS INFORMATION

Consider a communication system which transmits messages m1, m2, m3, ... with probabilities P1, P2, P3, ... The amount of information transmitted through the message mk with probability Pk is given by:

Amount of information = Ik = log2[1/Pk]
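As a minimal sketch (not from the original text), the following Python snippet evaluates Ik = log2(1/Pk) for a few illustrative probabilities:

    # Computing the amount of information Ik = log2(1/Pk).
    import math

    def information(p: float) -> float:
        """Amount of information, in bits, of a message with probability p."""
        return math.log2(1.0 / p)

    # The probabilities below are illustrative values, not from the text.
    for p in (0.5, 0.25, 0.125):
        print(f"P = {p:<6} -> I = {information(p):.2f} bits")
    # P = 0.5 -> 1.00 bit; P = 0.25 -> 2.00 bits; P = 0.125 -> 3.00 bits.
    # The less likely the message, the more information it carries.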


PROPERTIES OF INFORMATION

1. If there is more uncertainty about a message, the information it carries is also more.
2. If the receiver already knows the message being transmitted, the amount of information carried is zero.
3. If I1 is the information carried by message m1 and I2 is the information carried by message m2, then the amount of information carried jointly by m1 and m2 is I1 + I2.
4. If there are M = 2^N equally likely messages, then the amount of information carried by each message will be N bits (properties 3 and 4 are demonstrated in the sketch after this list).

Information is:
- knowledge derived from study, experience, or instruction;
- knowledge of a specific event or situation; intelligence;
- a collection of facts or data, e.g. statistical information;
- (in computer science) a nonaccidental signal or character used as input to a computer or communication system;
- a numerical measure of the uncertainty of an experimental outcome.
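A short sketch, with illustrative values chosen here, verifying the additivity property for independent messages and the M = 2^N case:

    import math

    def information(p: float) -> float:
        return math.log2(1.0 / p)

    # Additivity: two independent messages with probabilities p1 and p2 occur
    # together with probability p1 * p2, so their information contents add.
    p1, p2 = 0.5, 0.25                  # illustrative probabilities
    assert math.isclose(information(p1 * p2), information(p1) + information(p2))

    # Equally likely messages: with M = 2**N messages, each has P = 1/M
    # and carries exactly N bits.
    N = 4
    M = 2 ** N
    print(information(1.0 / M))         # 4.0 bits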

COMMUNICATION CHANNEL

The communication system consists of a transmitter, a receiver, and a channel. Binary input is given to the channel encoder. The channel encoder and modulator form the transmitter; similarly, the demodulator and decoder form the receiver. The signals through the modulation channel (communication channel) are analog. These signals are corrupted by noise, so errors are introduced in the data. Thus the nature of the channel and the noise limit the maximum rate of data transfer. The transmitter, receiver, and communication channel together form a data communication channel; since the transmitted data is discrete, it is also called a discrete channel.
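The following is a toy simulation, not part of the original text, of how noise corrupts data in the channel; it models the channel as a binary symmetric channel with an assumed error probability:

    import random

    def binary_symmetric_channel(bits, error_prob):
        """Flip each transmitted bit independently with probability error_prob.
        A toy noise model; error_prob is an assumed, illustrative value."""
        return [b ^ (random.random() < error_prob) for b in bits]

    random.seed(0)
    data = [1, 0, 1, 1, 0, 0, 1, 0] * 4     # arbitrary sample bit stream
    received = binary_symmetric_channel(data, error_prob=0.05)
    errors = sum(t != r for t, r in zip(data, received))
    print(f"{errors} of {len(data)} bits corrupted")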

INFORMATION RATE

The information rate is represented by R and given as:

R = rH
Here H is the entropy or average information per message and r is the rate at which messages are generated.

CODE REDUNDANCY

Redundancy is a measure of the redundant bits in the encoded message sequence. It is given as:

Redundancy = 1 - code efficiency


Redundancy should be as low as possible.
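A minimal sketch tying these definitions together. The probabilities, codeword lengths, and message rate below are illustrative, and the sketch assumes the usual definition of code efficiency as entropy divided by average codeword length:

    import math

    # Illustrative source: probabilities and an assumed code with given lengths.
    probs   = [0.5, 0.25, 0.125, 0.125]   # message probabilities (assumed)
    lengths = [1, 2, 3, 4]                # bits per codeword (assumed)
    r = 1000                              # messages per second (assumed)

    H = sum(p * math.log2(1 / p) for p in probs)          # entropy, bits/message
    R = r * H                                             # information rate R = rH
    avg_len = sum(p * n for p, n in zip(probs, lengths))  # average codeword length
    efficiency = H / avg_len                              # assumed definition
    redundancy = 1 - efficiency                           # redundancy = 1 - efficiency

    print(f"H = {H} bits/message, R = {R} bits/s")
    print(f"efficiency = {efficiency:.4f}, redundancy = {redundancy:.4f}")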

CODING THEORY

Coding is the conversion of information to another form for some purpose. Coding theory is one of the most important and direct applications of information theory. It can be subdivided into source coding theory and channel coding theory. Using a statistical description for data, information theory quantifies the number of bits needed to describe the data, which is the information entropy of the source.

Data compression (source coding): There are two formulations for the compression problem. In lossless data compression the data must be reconstructed exactly; in lossy data compression bits are allocated to reconstruct the data within a specified fidelity level measured by a distortion function. This subset of information theory is called rate-distortion theory.

Error-correcting codes (channel coding): While data compression removes as much redundancy as possible, an error-correcting code adds just the right kind of redundancy (i.e., error correction) needed to transmit the data efficiently and faithfully across a noisy channel.

AVERAGE INFORMATION CONTENT (ENTROPY)

Consider M different messages. Let these messages be m1, m2, m3, ..., mM and let them have probabilities of occurrence P1, P2, P3, ..., PM. Suppose that a sequence of L messages has been transmitted. When L is very large, we may say that:

P1 L messages of m1 are transmitted,
P2 L messages of m2 are transmitted,
P3 L messages of m3 are transmitted,
...
PM L messages of mM are transmitted.

Hence the information due to message m1 will be:

I1 = log2{1/P1}


Since there are P1 L messages of m1, the total information due to all messages of m1 will be:

I1(total) = P1 L log2 {1/P1}


Similarly the information due to all messages of m2 will be:

I2(total) = P2 L log2{1/P2}
Thus the total information carried due to the sequence of L messages will be:

Itotal = I1(total) + I2(total) + ... + IM(total)


The average information per message will be:

Average information = Itotal / L
Average information is known as entropy and is represented by H.

Entropy = H = P1 log2{1/P1} + P2 log2{1/P2} + ... + PM log2{1/PM}


Or

H = Σ Pk log2{1/Pk}   (summed over k = 1, 2, ..., M)
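A small sketch of the entropy formula above, using an illustrative distribution that is not from the text:

    import math

    def entropy(probs):
        """H = sum over k of Pk * log2(1/Pk); terms with Pk == 0 contribute nothing."""
        return sum(p * math.log2(1 / p) for p in probs if p > 0)

    # Illustrative distribution:
    print(entropy([0.5, 0.25, 0.125, 0.125]))   # 1.75 bits/message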
PROPERTIES OF ENTROPY

Entropy is zero if an event is certain or impossible, i.e.:

H=0

if

Pk = 0 or 1

When Pk = 1/M for all M symbols, the symbols are equally likely. For such a source, the entropy is given as:

H = log2 M
The upper bound on entropy is given as:

Hmax = log2 M
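A quick numerical check, using illustrative distributions chosen here, that the entropy of M equally likely symbols equals log2 M and that a skewed distribution falls below this bound:

    import math

    def entropy(probs):
        return sum(p * math.log2(1 / p) for p in probs if p > 0)

    M = 8
    uniform = [1 / M] * M
    skewed  = [0.6, 0.2, 0.1, 0.04, 0.03, 0.02, 0.005, 0.005]  # illustrative

    print(entropy(uniform), math.log2(M))   # both 3.0: H = log2 M when equally likely
    print(entropy(skewed))                  # strictly less than log2 M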
CODE VARIANCE

The variance is a measure of the variability in codeword length. Variance should be as small as possible. The variance of a code is given as:

σ² = Σ Pk (nk − n̄)²   (summed over k = 1, 2, ..., L)

where L is the total number of symbols, Pk is the probability of the kth symbol, nk is the number of bits assigned to the kth symbol, and n̄ is the average codeword length.
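A minimal sketch of the variance computation, using an illustrative set of probabilities and codeword lengths (not from the original text):

    # Illustrative code: probabilities Pk and codeword lengths nk.
    probs   = [0.4, 0.3, 0.2, 0.1]
    lengths = [1, 2, 3, 3]

    # Average codeword length n̄, then variance around it.
    avg_len  = sum(p * n for p, n in zip(probs, lengths))
    variance = sum(p * (n - avg_len) ** 2 for p, n in zip(probs, lengths))

    print(f"average codeword length = {avg_len}, variance = {variance}")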
