Communication System
Information Theory and Error Control coding

Practice questions from Information Theory and Error Control coding.

Q#1 Information Theory and Error Control coding GATE EC 2025 (Set 1) MCQ +1 mark -0.33 marks

Consider an additive white Gaussian noise (AWGN) channel with bandwidth W and noise power spectral density N0/2. Let P denote the average transmit power constraint.

Which one of the following plots illustrates the dependence of the channel capacity C on the bandwidth W (keeping P and N0 fixed)?

Explanation:

By Shannon's channel capacity theorem,

C = W log2(1 + P/N)   ... (i)

Noise power N = (N0/2)(2W) = N0·W (double-sided PSD)   ... (ii)

And given P and N0 are constant,

P/N0 = constant   ... (iii)

Putting (ii) & (iii) in (i):

C = W log2(1 + P/(N0·W))

Taking the limit W → ∞ (note the limit), by L'Hôpital's rule:

lim(W→∞) C = P/(N0 ln 2) ≈ 1.44·P/N0 = constant

So C increases monotonically with W and saturates at P/(N0 ln 2). Since only (A) matches these limits, (A) is the correct graph.
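The limiting behaviour above can be checked numerically. A minimal sketch, using placeholder values P = N0 = 1 (the question leaves both as unspecified constants):

```python
import math

P, N0 = 1.0, 1.0  # placeholder power and noise PSD (assumptions, not from the question)

def capacity(W):
    """Shannon capacity of the AWGN channel: C = W * log2(1 + P/(N0*W))."""
    return W * math.log2(1 + P / (N0 * W))

limit = P / (N0 * math.log(2))  # saturation value P/(N0 ln 2) ~ 1.44 P/N0

# Capacity grows monotonically with bandwidth ...
assert capacity(1) < capacity(10) < capacity(100)
# ... but saturates at P/(N0 ln 2) as W -> infinity, as in plot (A)
assert abs(capacity(1e9) - limit) < 1e-6
print(round(limit, 4))  # -> 1.4427
```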

Q#2 Information Theory and Error Control coding GATE EC 2025 (Set 1) NAT +1 mark -0 marks

The generator matrix of a (6, 3) binary linear block code is given by

 

The minimum Hamming distance dmin between codewords equals _________ (answer in integer).

Explanation:

Given code (n, k) = (6, 3), where n = total codeword bits and k = message bits.

Each codeword is obtained as c = d·G (mod 2), where the message d ranges over all 2^3 = 8 values from 000 to 111.

Compute all eight codewords by matrix multiplication.

For a linear block code, the minimum Hamming distance equals the smallest Hamming weight (number of non-zero bits) among the non-zero codewords.
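The brute-force check can be scripted. The matrix below is a hypothetical systematic (6, 3) generator (the question's actual matrix is not reproduced above); the method, enumerating all non-zero messages and taking the smallest codeword weight, is the same:

```python
import itertools

# Hypothetical (6,3) generator matrix G = [I3 | P] in systematic form
# (a placeholder, not the matrix from the question)
G = [[1, 0, 0, 1, 1, 0],
     [0, 1, 0, 0, 1, 1],
     [0, 0, 1, 1, 0, 1]]

def dmin(G):
    n, k = len(G[0]), len(G)
    best = n
    for d in itertools.product([0, 1], repeat=k):  # all 2^k messages
        if not any(d):
            continue                               # skip the all-zero message
        # codeword c = d.G (mod 2)
        c = [sum(d[i] * G[i][j] for i in range(k)) % 2 for j in range(n)]
        best = min(best, sum(c))                   # Hamming weight of c
    return best

print(dmin(G))  # -> 3 for this example matrix
```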

Q#3 Information Theory and Error Control coding GATE EC 2025 (Set 1) NAT +2 marks -0 marks

X and Y are Bernoulli random variables taking values in {0, 1}. The joint probability mass function of the random variables is given by:

 

 

 

 

The mutual information I(X; Y) is _________ (rounded off to two decimal places).

Explanation:

The mutual information I(X; Y) between two random variables X and Y is defined as:

I(X; Y) = Σ_x Σ_y p(x, y) log2[ p(x, y) / (p(x)·p(y)) ]

where p(x, y) is the joint probability mass function of X and Y,

p(x) = Σ_y p(x, y) is the marginal probability of X, and

p(y) = Σ_x p(x, y) is the marginal probability of Y.

Computing the marginals from the given joint pmf and substituting, every one of the four terms p(x, y) log2[ p(x, y) / (p(x)·p(y)) ] evaluates to 0, because p(x, y) = p(x)·p(y) for each pair (x, y), so each logarithm is log2(1) = 0. Hence

I(X; Y) = 0 + 0 + 0 + 0

= 0

Alternate Solution:

Since the given joint pmf factorizes as p(x, y) = p(x)·p(y) for all (x, y), the Bernoulli random variables X and Y are independent.

Thus, I(X; Y) = 0.
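The definition can be evaluated mechanically. The joint pmf below is a placeholder that factorizes (the question's actual values are not reproduced above); any such factorizing pmf gives I(X; Y) = 0:

```python
import math

# Placeholder joint pmf of two independent Bernoulli RVs, X ~ Bern(0.5), Y ~ Bern(0.75)
# (assumed values, not the ones from the question)
pxy = {(0, 0): 0.5 * 0.25, (0, 1): 0.5 * 0.75,
       (1, 0): 0.5 * 0.25, (1, 1): 0.5 * 0.75}

# Marginals: p(x) = sum_y p(x, y) and p(y) = sum_x p(x, y)
px = {x: sum(p for (xx, _), p in pxy.items() if xx == x) for x in (0, 1)}
py = {y: sum(p for (_, yy), p in pxy.items() if yy == y) for y in (0, 1)}

# I(X;Y) = sum p(x,y) * log2( p(x,y) / (p(x) p(y)) ), skipping zero-probability terms
I = sum(p * math.log2(p / (px[x] * py[y])) for (x, y), p in pxy.items() if p > 0)
print(round(I, 2))  # -> 0.0, since p(x,y) = p(x) p(y) everywhere
```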

Q#4 Information Theory and Error Control coding GATE EC 2024 (Set 1) NAT +1 mark -0 marks

A source transmits symbols from an alphabet of size 16. The value of maximum achievable entropy (in bits) is _________.

Explanation:

For an alphabet of size M, entropy is maximized when all symbols are equiprobable, giving H_max = log2 M. Here M = 16, so H_max = log2 16 = 4 bits.
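A quick sanity check of the arithmetic, computing the entropy of a uniform distribution over 16 symbols:

```python
import math

M = 16                 # alphabet size
p = [1 / M] * M        # entropy is maximized when all symbols are equiprobable

# H = -sum p_i log2 p_i
H = -sum(pi * math.log2(pi) for pi in p)
print(H)  # -> 4.0
```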

Q#5 Information Theory and Error Control coding GATE EC 2024 (Set 1) MCQ +2 marks -0.66 marks

The information bit sequence 111010101 is to be transmitted by encoding with a Cyclic Redundancy Check 4 (CRC-4) code, for which the generator polynomial is g(x) = x^4 + x + 1. The encoded sequence of bits is ___________.

Explanation:

Given information bit sequence: d = 111010101

Generator polynomial: g(x) = x^4 + x + 1, i.e. divisor bits 10011.

The maximum power of the generator polynomial is 4, so append 4 zeros to d:

1110101010000

Dividing this by 10011 (modulo-2 division) leaves the remainder 1100. Replace the appended 0000 with this remainder:

Encoded sequence = 1110101011100
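The modulo-2 division above can be verified in a few lines, using the generator polynomial stated in the explanation:

```python
def crc_remainder(bits, divisor):
    """Modulo-2 (XOR) polynomial long division; returns the CRC remainder bits."""
    bits = list(bits)
    n = len(divisor)
    for i in range(len(bits) - n + 1):
        if bits[i] == 1:                 # leading bit set: subtract (XOR) the divisor
            for j in range(n):
                bits[i + j] ^= divisor[j]
    return bits[-(n - 1):]               # last deg(g) bits are the remainder

msg = [1, 1, 1, 0, 1, 0, 1, 0, 1]        # information sequence 111010101
gen = [1, 0, 0, 1, 1]                    # g(x) = x^4 + x + 1
padded = msg + [0] * 4                   # append deg(g) = 4 zeros
rem = crc_remainder(padded, gen)         # -> [1, 1, 0, 0]
encoded = msg + rem                      # codeword = message followed by CRC bits
print(''.join(map(str, encoded)))        # -> 1110101011100
```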

Q#6 Information Theory and Error Control coding GATE EC 2023 (Set 1) NAT +2 marks -0 marks

The frequency of occurrence of 8 symbols (a-h) is shown in the table below. A symbol is chosen and it is determined by asking a series of "yes/no" questions which are assumed to be truthfully answered. The average number of questions when asked in the most efficient sequence, to determine the chosen symbol, is _______ (rounded off to two decimal places).

Explanation:

Each yes/no question reveals at most one bit of information, so the most efficient questioning strategy follows an optimal (Huffman) prefix code for the symbol probabilities. The average number of questions, when asked in the most efficient sequence, therefore equals the average Huffman codeword length, L̄ = Σ p_i·l_i questions per symbol, computed from the given frequency table.
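The computation can be sketched as follows. The probabilities below are placeholders, since the question's frequency table is not reproduced above; with the actual table, the same function gives the required answer:

```python
import heapq
from itertools import count

def huffman_avg_length(probs):
    """Average Huffman codeword length = average number of yes/no questions."""
    tie = count()  # tie-breaker so the heap never compares dicts
    # Each heap entry: (probability, tie, {symbol: code length so far})
    heap = [(p, next(tie), {s: 0}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, d1 = heapq.heappop(heap)   # merge the two least likely nodes,
        p2, _, d2 = heapq.heappop(heap)   # lengthening every code under them by 1
        merged = {s: l + 1 for s, l in {**d1, **d2}.items()}
        heapq.heappush(heap, (p1 + p2, next(tie), merged))
    lengths = heap[0][2]                  # final code length of each symbol
    return sum(probs[s] * l for s, l in lengths.items())

# Placeholder probabilities for the 8 symbols a-h (assumed, not from the question)
probs = {'a': 1/2, 'b': 1/4, 'c': 1/8, 'd': 1/16,
         'e': 1/32, 'f': 1/64, 'g': 1/128, 'h': 1/128}
print(round(huffman_avg_length(probs), 2))  # -> 1.98 for these probabilities
```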