## Coding Theory MCQ


### Multiple-Choice Questions

Q.1. A parity check code can

- detect a single bit error
- correct a single bit error
- detect two-bit error
- correct two-bit error

**Answer:** detect a single bit error
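The answer above can be checked with a short sketch: a single even-parity bit flags any odd number of bit errors, so it detects (but cannot correct) a single-bit error, and it misses a two-bit error entirely. The function names are illustrative, not standard.

```python
def add_parity(bits):
    """Append an even-parity bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def parity_ok(word):
    """True if the received word still has even parity."""
    return sum(word) % 2 == 0

data = [1, 0, 1, 1]
word = add_parity(data)                                   # [1, 0, 1, 1, 1]

flipped_one = word[:]; flipped_one[2] ^= 1                # single-bit error
flipped_two = word[:]; flipped_two[0] ^= 1; flipped_two[3] ^= 1  # two-bit error

print(parity_ok(word))         # True  (no error)
print(parity_ok(flipped_one))  # False (single error detected)
print(parity_ok(flipped_two))  # True  (double error goes undetected)
```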

Q.2. The redundancy of an (n, k) code is defined as

- $\frac{n}{k}$
- $\frac{k}{n}$
- $\frac{n-k}{n}$
- $\frac{n-k}{k}$

**Answer:** $\frac{n-k}{n}$

Q.3. The coding efficiency is given by

- 1 - Redundancy
- 1 + Redundancy
- 1/Redundancy
- none

**Answer:** 1 - Redundancy
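A minimal sketch of the definitions assumed in Q.2 and Q.3: efficiency (the code rate) is $k/n$ and redundancy is $1 - k/n = (n-k)/n$, so the two always sum to 1. The (7, 4) Hamming code is used purely as an illustration.

```python
from fractions import Fraction

def efficiency(n, k):
    """Code rate k/n of an (n, k) block code, as an exact fraction."""
    return Fraction(k, n)

def redundancy(n, k):
    """Redundancy = 1 - efficiency = (n - k)/n."""
    return 1 - efficiency(n, k)

print(efficiency(7, 4), redundancy(7, 4))  # 4/7 3/7
```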

Q.4. The efficiency of the Huffman code is linearly proportional to

- average length of code
- maximum length of code
- average entropy
- none

**Answer:** average entropy

Q.5. The Hamming distance between the words 100101101 and 0110110010 is

- 4
- 5
- 6
- 7

**Answer:** 7
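As printed, the two words in Q.5 have different lengths (9 and 10 bits), so the stated distance cannot be checked directly; Hamming distance is defined only for equal-length words. A minimal sketch of the computation, on an illustrative equal-length pair rather than the words in the question:

```python
def hamming_distance(a, b):
    """Number of positions at which two equal-length bit strings differ."""
    if len(a) != len(b):
        raise ValueError("Hamming distance requires equal-length words")
    return sum(x != y for x, y in zip(a, b))

# Illustrative pair (not the words in the question, which differ in length):
print(hamming_distance("1001101", "0111001"))  # 4
```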

Q.6. The minimum distance of a code dictionary is 10. The code is capable of

- four-error correction
- four-error correction plus five-error detection
- three-error correction plus four-error detection
- five-error correction plus six-error detection

**Answer:** four-error correction plus five-error detection
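The answer follows from the standard bounds: a code with minimum distance $d$ corrects $t$ errors whenever $2t + 1 \le d$, and simultaneously corrects $t$ and detects $e \ge t$ errors whenever $t + e + 1 \le d$. A sketch:

```python
def max_correctable(d):
    """Largest t with 2t + 1 <= d."""
    return (d - 1) // 2

def max_detectable_while_correcting(d, t):
    """Largest e with t + e + 1 <= d."""
    return d - t - 1

d = 10
t = max_correctable(d)                     # 4
e = max_detectable_while_correcting(d, t)  # 5
print(t, e)  # 4 5
```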

Q.7. The total number of words in a single error-correction code dictionary of word length 15 is

- 1023
- 1024
- 2047
- 2048

**Answer:** 2048
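A sketch of the count: for a single-error-correcting Hamming code of length $n = 15$, the number of check bits $r$ satisfies $2^r \ge n + 1$, giving $r = 4$, hence $k = 11$ message bits and $2^{11} = 2048$ code words.

```python
def hamming_message_bits(n):
    """Message bits k = n - r for the smallest r with 2**r >= n + 1."""
    r = 1
    while 2 ** r < n + 1:
        r += 1
    return n - r

k = hamming_message_bits(15)
print(k, 2 ** k)  # 11 2048
```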

Q.8. For a single-error-correcting Hamming code in systematic form, which of the following cannot be a row of the parity submatrix P?

- 011
- 101
- 010
- 111

**Answer:** 010

Q.9. The generator polynomial of a (7, 4) cyclic code is $g(x) = 1 + x + x^{3}$. The code word for the message 1010 will be

- 1110010
- 1100010
- 1011001
- 0111001

**Answer:** 1110010
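The printed answer matches non-systematic encoding, $c(x) = m(x)\,g(x)$ over GF(2), reading the leftmost bit of each word as the $x^0$ coefficient. A sketch of that computation (the helper name is illustrative):

```python
def gf2_multiply(a, b):
    """Multiply two GF(2) polynomials given as coefficient lists (x^0 first)."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= ai & bj
    return out

g = [1, 1, 0, 1]   # g(x) = 1 + x + x^3
m = [1, 0, 1, 0]   # message 1010 -> m(x) = 1 + x^2
c = gf2_multiply(m, g)
print("".join(map(str, c)))  # 1110010
```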

Q.10. In a convolutional code encoder with a six-stage shift register, the number of modulo-2 adders is 4. For an input data stream of 5 bits, the code-word size will be

- 120
- 54
- 50
- 44

**Answer:** 44
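A sketch of the usual count: with K shift-register stages, v modulo-2 adders, and L input bits, the encoder emits v output bits per shift and must be flushed through all K stages, giving $(L + K) \times v$ code bits.

```python
def conv_codeword_length(L, K, v):
    """Output bits: v per shift, over L input bits plus K flushing shifts."""
    return (L + K) * v

print(conv_codeword_length(5, 6, 4))  # 44
```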

Q.11. The coding efficiency of a source code is

- $\frac{H(X)}{\bar{L}}$
- $\frac{H(X)}{\log M}$
- $\frac{H(X)}{\bar{L}\log M}$
- $\frac{H(X)\bar{L}}{\log M}$

**Answer:** $\frac{H(X)}{\bar{L}\log M}$

Q.12. Let there be 11 messages to be transmitted. Huffman coding procedure is applied for M = 5. The number of terms to be combined in the first reduction is

- 2
- 3
- 4
- 5

**Answer:** 3
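A sketch of the rule behind the answer: in M-ary Huffman coding on N symbols, every reduction after the first combines M symbols, so the first reduction combines $2 + (N - 2) \bmod (M - 1)$ symbols to make the final reduction end with exactly M symbols.

```python
def first_reduction_size(N, M):
    """Symbols combined in the first reduction of M-ary Huffman coding."""
    return 2 + (N - 2) % (M - 1)

print(first_reduction_size(11, 5))  # 3
```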

Q.13. In a single error-correcting Hamming code, the number of message bits in a block is 26. The number of check bits in the block would be

- 3
- 4
- 5
- 7

**Answer:** 5
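A sketch of the Hamming bound used here: r check bits protect m message bits in a single-error-correcting Hamming code when $2^r \ge m + r + 1$.

```python
def check_bits(m):
    """Smallest r with 2**r >= m + r + 1."""
    r = 1
    while 2 ** r < m + r + 1:
        r += 1
    return r

print(check_bits(26))  # 5  (since 2**5 = 32 >= 26 + 5 + 1)
```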

Q.14. Consider a binary digital communication system with equally likely 0s and 1s. When a binary 0 is transmitted, the detector input can lie between the levels -0.25 V and +0.25 V with equal probability. When a binary 1 is transmitted, the voltage at the detector can have any value between 0 and 1 V with equal probability. If the detector has a threshold of 0.2 V (i.e., if the received signal is greater than 0.2 V, the bit is taken as 1), the average bit error probability is

- 0.15
- 0.2
- 0.05
- 0.5

**Answer:** 0.15
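A sketch of the computation: with equally likely bits, $P(\text{error}) = 0.5\,P(\text{error}\mid 0) + 0.5\,P(\text{error}\mid 1)$. Given a 0, the input is uniform on [-0.25, 0.25], so the probability of exceeding 0.2 V is (0.25 - 0.2)/0.5 = 0.1; given a 1, the input is uniform on [0, 1], so the probability of falling below 0.2 V is 0.2.

```python
p_err_given_0 = (0.25 - 0.2) / 0.5   # uniform on [-0.25, 0.25] above 0.2 V
p_err_given_1 = 0.2 / 1.0            # uniform on [0, 1] below 0.2 V
p_avg = 0.5 * p_err_given_0 + 0.5 * p_err_given_1
print(round(p_avg, 4))  # 0.15
```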

Q.15. Which one of the following statements is correct? Shannon’s channel capacity formula indicates that in theory

- by using proper channel codes, we can get an error-free transmission on a noisy channel
- it is not possible to get an error-free transmission on a noisy channel, since there will always be some error in the detected signal for finite noise on any channel
- it is true only for some wired channels and not wireless channels
- it works only for analog signals and not for digital signals on any channel

**Answer:** by using proper channel codes, we can get an error-free transmission on a noisy channel

Q.16. Which one of the following is correct?

- Coding reduces the noise in the signal
- Coding deliberately introduces redundancy into messages
- Coding increases the information rate
- Coding increases the channel bandwidth

**Answer:** Coding deliberately introduces redundancy into messages

Q.17. During transmission over a certain binary communication channel, bit errors occur independently with probability p. The probability of at most one bit in error in a block of n bits is given by

- $p^{n}$
- $1-p^{n}$
- $np(1-p)^{n-1}+(1-p)^{n}$
- $1-(1-p)^{n}$

**Answer:** $np(1-p)^{n-1}+(1-p)^{n}$
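The closed form can be checked against brute-force enumeration of all $2^n$ error patterns for a small n and an arbitrary p:

```python
from itertools import product

def p_at_most_one_error(n, p):
    """Closed form: P(0 errors) + P(exactly 1 error)."""
    return n * p * (1 - p) ** (n - 1) + (1 - p) ** n

def brute_force(n, p):
    """Sum the probabilities of every error pattern with <= 1 error."""
    total = 0.0
    for pattern in product([0, 1], repeat=n):   # 1 marks a bit in error
        k = sum(pattern)
        if k <= 1:
            total += p ** k * (1 - p) ** (n - k)
    return total

n, p = 6, 0.1
print(abs(p_at_most_one_error(n, p) - brute_force(n, p)) < 1e-12)  # True
```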

Q.18. When a code is irreducible, it is also separable.

- True
- False

**Answer:** True

Q.19. Huffman code is also known as maximum redundancy code.

- True
- False

**Answer:** False

Q.20. The English language is not uniquely decipherable.

- True
- False

**Answer:** True

Q.21. In a binary system, the coding efficiency increases as P(0) approaches 0.5.

- True
- False

**Answer:** True

Q.22. A code dictionary with a minimum distance of 4 is capable of double error correction.

- True
- False

**Answer:** False

Q.23. A code dictionary with a minimum distance of 2 is not capable of error correction.

- True
- False

**Answer:** True

Q.24. Cyclic code is a subclass of convolutional code.

- True
- False

**Answer:** False

Q.25. The exhaustive search method of decoding a convolutional code is preferred over the sequential decoding method.

- True
- False

**Answer:** False