Experiment No. 6: Channel Capacity
AIM: Experimental verification of the following using MATLAB.
➢ Shannon capacity theorem
➢ Channel capacity calculations for the binary symmetric channel (BSC) and for
cascaded BSCs
C = B log2(1 + S/N)
Here C is the maximum capacity of the channel in bits/second, also called Shannon's
capacity limit for the given channel, B is the bandwidth of the channel in Hertz, S is
the signal power in Watts, and N is the noise power, also in Watts. The ratio S/N is
the signal-to-noise ratio (SNR). The theorem states that the maximum rate at which
information can be transmitted without error is limited by the bandwidth, the signal
power, and the noise power: it gives the number of bits per second that can be sent
error-free over a channel of bandwidth B Hz when the signal power is limited to S
Watts and the signal is corrupted by additive white Gaussian noise of power N Watts.
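As a quick numerical illustration (a sketch added here; the bandwidth and SNR values
are assumptions, not taken from the lab), the formula can be evaluated for a
telephone-grade channel with B = 3100 Hz and an SNR of 30 dB:
B   = 3100;                 % bandwidth in Hz (assumed)
snr = 10^(30/10);           % 30 dB expressed as a power ratio (= 1000)
C   = B*log2(1 + snr);      % Shannon capacity
fprintf('C = %.0f bits/s\n', C);   % about 30.9 kbits/s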
Source Code: -
clc;
clear all;
close all;
% Capacity vs bandwidth: signal power S is fixed, while the noise power
% N = N0*B grows with bandwidth, so C saturates for large B.
B = 1:0.001:100;           % bandwidth range in Hz
S = 1;                     % signal power in Watts
N0 = 1;                    % noise power spectral density in W/Hz
N = N0*B;                  % total noise power grows with bandwidth
C = B.*log2(1 + S./N);     % Shannon capacity
subplot(2,1,1);
plot(B,C);
title('Capacity vs bandwidth (fixed signal power)');
xlabel('Bandwidth (Hz)');
ylabel('Capacity (bits/s)');
% Capacity vs SNR: bandwidth B is fixed while the signal power is swept.
B = 10;                    % bandwidth in Hz
S = 1:0.01:500;            % signal power range in Watts
N0 = 1;                    % noise power spectral density in W/Hz
N = N0*B;                  % total noise power (scalar)
snr = S/N;                 % signal-to-noise ratio
C = B.*log2(1 + snr);      % Shannon capacity
subplot(2,1,2);
plot(snr,C);
title('Capacity vs SNR (fixed bandwidth)');
xlabel('SNR');
ylabel('Capacity (bits/s)');
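A follow-up check on the first plot (a sketch added here, not part of the original
listing): because the noise power N = N0*B grows with bandwidth, the capacity does
not grow without bound; as B tends to infinity with fixed S and N0, C approaches
S/(N0*ln 2), about 1.44*S/N0 bits/s. With S = N0 = 1 as above:
S = 1; N0 = 1;
Climit = S/(N0*log(2));              % wideband limit, ~1.4427 bits/s
C100   = 100*log2(1 + S/(N0*100));   % capacity at B = 100 Hz, ~1.4355 bits/s
fprintf('limit = %.4f, C(B=100) = %.4f bits/s\n', Climit, C100);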
Results: -
The second program builds a Shannon binary code for a discrete memoryless source:
the symbol probabilities are sorted in descending order, a(i) accumulates the
probabilities of the symbols before symbol i, the code length is
n(i) = ceil(-log2(p(i))), and codeword i consists of the first n(i) bits of the
binary expansion of a(i).
Source Code: -
clc;
clear all;
close all;
m = 5;                                  % number of source symbols
for i = 1:m
    p(i) = input(sprintf('probability %d-', i));
end
p1 = sort(p, 'descend')                 % descending order (echoed as p1)
p = p1;
z = [];                                 % bit buffer for the current codeword
a(1) = 0;
for j = 2:length(p)
    a(j) = a(j-1) + p(j-1);             % cumulative probability of earlier symbols
end
fprintf(' A Matrix');
display(a);
for i = 1:length(p)
    n(i) = ceil(-1*(log2(p(i))));       % code length n(i) = ceil(-log2 p(i))
end
fprintf(' Code length matrix');
display(n);
for i = 1:length(p)
    b = a(i);
    for j = 1:n(i)                      % first n(i) bits of binary expansion of a(i)
        f = b*2;
        c = floor(f);
        f = f - c;
        z = [z c];
        b = f;
    end
    fprintf('Codeword %d', i);
    display(z);
    z = [];
end
Output:
probability 1-0.15
probability 2-0.05
probability 3-0.25
probability 4-0.15
probability 5-0.4
p1 = 0.4000 0.2500 0.1500 0.1500 0.0500
A Matrix
a = 0 0.4000 0.6500 0.8000 0.9500
Code length matrix
n = 2 2 3 3 5
Codeword 1 - 0 0
Codeword 2 - 0 1
Codeword 3 - 1 0 1
Codeword 4 - 1 1 0
Codeword 5 - 1 1 1 1 0
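As a sanity check on this output (a sketch added here, not part of the original
report), Shannon's code lengths should satisfy H(X) <= Lbar < H(X) + 1, where Lbar
is the average codeword length:
p = [0.4 0.25 0.15 0.15 0.05];   % sorted probabilities from the run above
n = ceil(-log2(p));              % code lengths: 2 2 3 3 5
H = -sum(p.*log2(p));            % source entropy, ~2.0660 bits/symbol
Lbar = sum(p.*n);                % average code length, 2.45 bits/symbol
fprintf('H = %.4f, Lbar = %.4f\n', H, Lbar);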
clc;
clear all;
close all;
% Entropy of a binary source (coin) and capacity of a BSC
p = 0.01:0.01:0.99;        % avoid p = 0 and p = 1, where 0*log2(0) gives NaN
h = -p.*log2(p) - (1-p).*log2(1-p);   % binary entropy function H(p)
subplot(221);
plot(p,h);
title('entropy coin');
xlabel('probability');
ylabel('H(x)');
m = 1 - h;                 % BSC capacity C = 1 - H(p)
subplot(223);
plot(p,m);
title('BSC');
xlabel('probability');
ylabel('C(p)');
Result:
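The AIM also lists cascading of BSCs. A minimal sketch (assuming two BSCs in series
with illustrative crossover probabilities p1 and p2): the cascade is itself a BSC
with effective crossover probability p = p1*(1-p2) + p2*(1-p1), so its capacity is
C = 1 - H(p):
p1 = 0.1; p2 = 0.2;                        % illustrative crossover probabilities
pc = p1*(1-p2) + p2*(1-p1);                % effective crossover, = 0.26
Hc = -pc*log2(pc) - (1-pc)*log2(1-pc);     % binary entropy of pc
Cc = 1 - Hc;                               % cascade capacity, ~0.1733 bits/use
fprintf('pc = %.2f, C = %.4f bits/channel use\n', pc, Cc);
Cascading degrades the channel: each BSC alone has capacity 1 - H(0.1) = 0.5310 and
1 - H(0.2) = 0.2781 bits/use, both larger than the cascade's 0.1733.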