ADC - MCQ-Unit-4-ASSIGNMENT - Answers

This document contains a unit on source and error control coding that is divided into two parts. Part A contains 10 multiple choice questions testing basic understanding of topics like parity check coding, Hamming distance, channel capacity, information rate, entropy, and the Shannon-Hartley theorem. Part B contains 10 analytical multiple choice questions involving calculations of entropy, average code length, encoding with generator matrices and generator polynomials, error detection capability, and Hamming distance. The document provides an overview of key concepts in source and error control coding through these practice test questions.


Unit – 4 – Source & Error Control Coding

Part – A (Remember/Understand Level) – 1 Mark Questions

1. Parity check bit coding is used for


a) Error correction
b) Error detection
c) Error correction and detection
d) None of the above

2. For a code with minimum Hamming distance dmin and t errors in the received word, the condition required to correct the errors is
a) 2t + 1 ≤ dmin
b) 2t + 2 ≤ dmin
c) 2t + 1 ≤ 2dmin
d) Both a and b

3. The capacity of a Gaussian channel is


a) C = 2B log2(1 + S/N) bits/s
b) C = B log2(1 + S/N) bits/s
c) C = B(1 + S/N) bits/s
d) C = B(1 + S/N)² bits/s
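
For reference, the Shannon-Hartley capacity of a band-limited Gaussian channel is C = B log2(1 + S/N) bits/s. A minimal Python sketch of the formula (the 3 kHz bandwidth and 30 dB SNR below are arbitrary example values):

```python
import math

def gaussian_channel_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity C = B * log2(1 + S/N) in bits/s."""
    return bandwidth_hz * math.log2(1 + snr_linear)

snr = 10 ** (30 / 10)                        # 30 dB -> S/N = 1000
print(gaussian_channel_capacity(3000, snr))  # ~29,900 bits/s
```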

4. For M equally likely messages, the average amount of information H is


a) H = log10M
b) H = log2M
c) H = log10M²
d) H = 2log10M
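
For M equally likely messages each probability is 1/M, so the entropy reduces to H = log2 M. A one-line check (the value M = 8 is just an example):

```python
import math

print(math.log2(8))  # 3.0 bits per message for M = 8
```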

5. Information rate is defined as


a) Information per unit time
b) Average number of bits of information per second
c) rH
d) All of the above

6. The expected information contained in a message is called


a) Entropy
b) Efficiency
c) Coded signal
d) None of the above

7. For a (7, 4) block code, 7 is the total number of bits and 4 is the number of
a) Information bits
b) Redundant bits
c) Total bits- information bits
d) None of the above

8. The technique that may be used to increase average information per bit is
a) Digital modulation techniques
b) ASK
c) FSK
d) Shannon-Fano algorithm
9. Entropy is
a) Amplitude of signal
b) Information in a signal
c) All of the above
d) Average information per message

10. According to the Shannon-Hartley theorem,


a) The channel capacity becomes infinite with infinite bandwidth
b) The channel capacity does not become infinite with infinite bandwidth
c) There is a tradeoff between bandwidth and signal-to-noise ratio
d) Both b and c are correct
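
A small numeric sketch of the bandwidth limit behind options (b) and (c): with white noise the noise power grows as N = N0·B, so capacity saturates at (S/N0)·log2 e instead of growing without bound. The signal power and noise density below are assumed values for illustration only:

```python
import math

S = 1e-3   # signal power in watts (assumed)
N0 = 1e-9  # noise power spectral density in W/Hz (assumed)

for B in (1e3, 1e5, 1e7, 1e9):           # bandwidth in Hz
    C = B * math.log2(1 + S / (N0 * B))  # noise power N = N0 * B grows with B
    print(f"B = {B:.0e} Hz -> C = {C:.3e} bits/s")

print("limit:", (S / N0) * math.log2(math.e))  # ~1.44e6 bits/s
```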

Part – B (Analytical Level) – 2 Mark Questions

1. A source emits the symbols (S1, S2, S3, S4, S5, S6) with probabilities (0.4, 0.3, 0.02, 0.15, 0.1, 0.03). Calculate the entropy.
a) 3.056 bits/symbol
b) 1.056 bits/symbol
c) 4.056 bits/symbol
d) 2.057 bits/symbol
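
A quick numerical check of Q1 (H = -Σ p·log2 p), as a minimal Python sketch:

```python
import math

probs = [0.4, 0.3, 0.02, 0.15, 0.1, 0.03]
entropy = -sum(p * math.log2(p) for p in probs)
print(round(entropy, 3))  # 2.057 bits/symbol
```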

2. Find the average code length for a source with symbol probabilities (0.4, 0.2, 0.1, 0.2, 0.1) and code words (0, 01, 1110, 101, 1111).
a) 2.72 bits/symbol
b) 2.32 bits/symbol
c) 2.2 bits/symbol
d) 3.7 bits/symbol
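
The average code length in Q2 is L = Σ p_i · l_i, where l_i is the length of each code word. A quick check:

```python
probs = [0.4, 0.2, 0.1, 0.2, 0.1]
codewords = ["0", "01", "1110", "101", "1111"]

avg_len = sum(p * len(cw) for p, cw in zip(probs, codewords))
print(avg_len)  # 2.2 bits/symbol
```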

3. Assume a (7, 4) linear code has the following matrix as its generator matrix. If u = (1 1 0 1) is the message to be encoded, calculate the code word.

a) 1100100
b) 0001101
c) 1001100
d) 0001100
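
The generator matrix referred to in Q3 is not reproduced in this copy, so the options above cannot be verified here. Purely to illustrate the encoding step, the sketch below computes c = u·G over GF(2) for an assumed systematic (7, 4) generator; with a different G the code word will differ:

```python
import numpy as np

# Assumed systematic (7, 4) generator G = [I | P]; illustrative only,
# not the matrix from the original question.
G = np.array([
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 0, 1, 1],
    [0, 0, 1, 0, 1, 1, 1],
    [0, 0, 0, 1, 1, 0, 1],
])

u = np.array([1, 1, 0, 1])  # message to encode
codeword = u @ G % 2        # c = u * G over GF(2)
print(codeword)             # [1 1 0 1 0 0 0] for this assumed G
```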

4. Consider a (6, 3) linear block code defined by the given generator matrix. Determine whether the code is a Hamming code or not.

a) The given code is a Hamming code
b) The given code is not a Hamming code
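
The generator matrix for Q4 is likewise not reproduced here, but the block parameters alone are often decisive: a binary Hamming code must satisfy n = 2^m − 1 and k = n − m for some integer m. A sketch of that parameter check:

```python
def has_hamming_parameters(n, k):
    """True if (n, k) matches the Hamming-code form (2**m - 1, 2**m - 1 - m)."""
    m = n - k
    return n == 2 ** m - 1

print(has_hamming_parameters(6, 3))  # False -> (6, 3) cannot be a Hamming code
print(has_hamming_parameters(7, 4))  # True
```
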
5. Assume a (7, 4) linear code has the following generator matrix, with minimum Hamming distance 3. How many errors can the code detect?

a) 2
b) 3
c) 1
d) 0
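
The limits in Q5 follow directly from the minimum distance: a code detects up to dmin − 1 errors and corrects up to ⌊(dmin − 1)/2⌋. For dmin = 3:

```python
d_min = 3
print("detectable:", d_min - 1)          # 2
print("correctable:", (d_min - 1) // 2)  # 1
```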

6. A (7, 4) cyclic code has the generator polynomial g(x) = x³ + x + 1. Find the code word for the message m = 1001.
a) 0111001
b) 0111010
c) 0110010
d) 0101100
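
Systematic cyclic encoding for Q6 appends to the message the remainder of x^(n−k)·m(x) divided by g(x). A minimal sketch, with polynomials stored as coefficient lists indexed by degree (lowest degree first, the ordering the answer options appear to use):

```python
def gf2_poly_remainder(dividend, divisor):
    """Remainder of GF(2) polynomial division; coefficient index = degree."""
    rem = dividend[:]
    for i in range(len(rem) - 1, len(divisor) - 2, -1):
        if rem[i]:  # cancel the x^i term by XOR-ing a shifted copy of the divisor
            shift = i - (len(divisor) - 1)
            for j, c in enumerate(divisor):
                rem[shift + j] ^= c
    return rem[:len(divisor) - 1]

n, k = 7, 4
g = [1, 1, 0, 1]             # g(x) = 1 + x + x^3
m = [1, 0, 0, 1]             # m(x) = 1 + x^3  (message 1001)

shifted = [0] * (n - k) + m  # x^(n-k) * m(x)
parity = gf2_poly_remainder(shifted, g)
codeword = parity + m        # parity bits occupy the low-degree positions
print("".join(map(str, codeword)))  # 0111001
```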

7. In Go-Back-N ARQ, if frames 4, 5 and 6 are received successfully, the receiver may send an ACK ___ to the sender.
a) 5
b) 6
c) 7
d) Any of the above

8. The Hamming distance between the two code vectors X = (101) and Y = (110) is
a) 2
b) 1
c) 0
d) None of the above
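
The Hamming distance in Q8 is the number of bit positions in which the two vectors differ (the weight of their XOR):

```python
x, y = "101", "110"
print(sum(a != b for a, b in zip(x, y)))  # 2
```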

9. Adding 1 and 1 in modulo-2 arithmetic results in


a) 2
b) 1
c) 0
d) None of the above
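
Modulo-2 addition is the XOR operation, so 1 + 1 wraps back to 0:

```python
print((1 + 1) % 2, 1 ^ 1)  # 0 0
```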

10. If the Hamming distance between a data word and the corresponding codeword is three, there
are __________ bits in error.
a) 3
b) 4
c) 5
d) None of the above

******************************************************************************
