Quiz I
Instructor: B. Sainath, 2210−A, Electrical and Electronics Eng. Dept., BITS Pilani.
Course: Coding theory and practice (EEE G612)
DATE: Aug. 27th, 2019. TIME: 40 Mins. Max. points: 10
Note: There is no negative marking. No partial credit is given for merely writing down a correct formula.
Answer all the subparts of a question in one place. Provide the critical steps in each solution. Overwritten answers
will not be rechecked. Highlight your final answer in a rectangular box. Use the notation given in the data.
Q. 1. Determine the probability density function pY (y), supported on y ≥ 0, that maximizes the differential entropy
for a given mean µ. Hint: use the method of Lagrange multipliers. Note that there are two constraints (normalization
and the mean). [2.5 points]
Determine the differential entropy of a random variable with the probability distribution derived above. [1 point]
            Y = 0    Y = 1
   X = 0     1/3      1/3
   X = 1     1/3       0

Fig. 1. Joint pmf p(x, y), pertaining to Q. 2.
Compute the following: i). H(Y ). ii). H(Y |X). iii). I(X; Y ). [2.5 points]
I. SOLUTIONS
Q. 1. After solving the constrained optimization problem using the method of Lagrange multipliers, we get the
required probability density function (pdf). The pdf is given by

    pY (y) = (1/µ) exp(−y/µ),  y ≥ 0.
Note that Y is an exponential random variable. [2.5 points]
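The critical steps can be sketched as follows (a sketch of the standard Lagrange-multiplier argument; λ0 and λ1 denote the multipliers attached to the normalization and mean constraints):

```latex
% Maximize h(p) = -\int_0^\infty p(y)\ln p(y)\,dy subject to
% \int_0^\infty p(y)\,dy = 1 and \int_0^\infty y\,p(y)\,dy = \mu.
\mathcal{L}[p] = -\int_0^\infty p\ln p\,dy
  + \lambda_0\left(\int_0^\infty p\,dy - 1\right)
  + \lambda_1\left(\int_0^\infty y\,p\,dy - \mu\right)
% Setting the pointwise derivative with respect to p(y) to zero:
%   -\ln p(y) - 1 + \lambda_0 + \lambda_1 y = 0
%   \Rightarrow p(y) = e^{\lambda_0 - 1}\, e^{\lambda_1 y}.
% The two constraints force \lambda_1 = -1/\mu and e^{\lambda_0 - 1} = 1/\mu,
% giving p_Y(y) = (1/\mu)\, e^{-y/\mu}, \quad y \ge 0.
```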
The differential entropy of the exponential random variable Y is equal to log (µe). [1 point]
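As a quick numerical sanity check of this closed form, one can approximate the differential entropy of the exponential pdf by a Riemann sum and compare it against log2(µe); the choice µ = 2 below is an arbitrary example value:

```python
import math

mu = 2.0  # arbitrary example mean

def p(y):
    # exponential pdf with mean mu
    return math.exp(-y / mu) / mu

# Riemann-sum approximation of h(Y) = -∫ p(y) log2 p(y) dy over [0, 60µ]
dy = 1e-4
h = -sum(p(i * dy) * math.log2(p(i * dy)) * dy
         for i in range(1, int(60 * mu / dy)))

print(round(h, 2), round(math.log2(mu * math.e), 2))  # both ≈ 2.44 bits
```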
Q. 2. i). H(Y ) = 0.92 bits. ii). H(Y |X) = 0.67 bits. iii). I(X; Y ) = H(Y ) − H(Y |X) = 0.25 bits. [2.5 points]
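These values follow directly from the joint pmf in Fig. 1; a short computation (marginalize, then apply the entropy formula to each conditional distribution) reproduces them:

```python
import math

# joint pmf p(x, y) from Fig. 1 (rows indexed by X, columns by Y)
p = [[1/3, 1/3],
     [1/3, 0.0]]

def H(dist):
    # entropy in bits; zero-probability terms contribute 0
    return -sum(q * math.log2(q) for q in dist if q > 0)

py = [sum(p[x][y] for x in range(2)) for y in range(2)]  # marginal of Y
px = [sum(p[x][y] for y in range(2)) for x in range(2)]  # marginal of X

HY = H(py)
HYgX = sum(px[x] * H([p[x][y] / px[x] for y in range(2)]) for x in range(2))
I = HY - HYgX

print(round(HY, 2), round(HYgX, 2), round(I, 2))  # 0.92 0.67 0.25
```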
Q. 3. i). No, the code is not instantaneous, because the first codeword is a prefix of the second codeword.
ii). Yes, the code is uniquely decodable: given a sequence of codewords, first isolate the occurrences of 01, and
then parse the remainder into 0’s. [1 + 1 points]
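Assuming the code in Q. 3 is {0, 01} (the question text is not reproduced here, but this is consistent with the prefix relation described above), the decoding rule can be sketched as a left-to-right scan: a 1 can only belong to the codeword 01, so whenever 01 appears it must be parsed as one codeword, and every other 0 stands alone.

```python
def decode(bits):
    """Parse a binary string into codewords of the assumed code {0, 01}."""
    codewords = []
    i = 0
    while i < len(bits):
        if bits[i:i + 2] == "01":   # a 1 can only attach to the preceding 0
            codewords.append("01")
            i += 2
        else:
            codewords.append("0")
            i += 1
    return codewords

print(decode("001010"))  # ['0', '01', '01', '0']
```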
Q. 4. i). True. This unique probability distribution is called the stationary distribution of the ergodic Markov
information source/process. It does not depend on the initial distribution with which the states are chosen; it can
be determined from the conditional symbol (transition) probabilities alone.
ii). False. The joint entropy is defined as

    H(Y, X) = − Σ_{y ∈ Y} Σ_{x ∈ X} p(y, x) log p(y, x)
            = −E [log p(Y, X)].
[1 + 1 points]
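As a numerical illustration of this definition, the joint pmf of Fig. 1 gives H(Y, X) = log2 3 ≈ 1.585 bits, which also agrees with the chain rule H(X) + H(Y |X) = 0.92 + 0.67:

```python
import math

# joint pmf from Fig. 1 (rows X, columns Y); zero-probability terms contribute 0
p = [[1/3, 1/3],
     [1/3, 0.0]]

HXY = -sum(q * math.log2(q) for row in p for q in row if q > 0)
print(round(HXY, 3))  # 1.585, i.e. log2(3)
```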