
III Tutorial Sheet (2018 – 19)

EL-342

1. Two information sources have alphabets X and Y. The joint probabilities of symbols from
these sources are given in the following table:

        y1      y2      y3
x1      0.1     0.08    0.13
x2      0.05    0.03    0.09
x3      0.05    0.12    0.14
x4      0.11    0.04    0.06

Find H(X), H(Y) and H(X|Y).
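
For Problem 1, a minimal computational sketch (Python with NumPy assumed; the array P simply transcribes the table above): the marginals come from summing the joint table along each axis, and H(X|Y) follows from the chain rule H(X|Y) = H(X,Y) - H(Y).

```python
import numpy as np

# Joint probabilities P(x_i, y_j) as given in the table above
P = np.array([
    [0.10, 0.08, 0.13],
    [0.05, 0.03, 0.09],
    [0.05, 0.12, 0.14],
    [0.11, 0.04, 0.06],
])

px = P.sum(axis=1)                     # marginal P(x)
py = P.sum(axis=0)                     # marginal P(y)

H_X = -np.sum(px * np.log2(px))        # H(X)
H_Y = -np.sum(py * np.log2(py))        # H(Y)
H_XY = -np.sum(P * np.log2(P))         # joint entropy H(X,Y)
H_X_given_Y = H_XY - H_Y               # chain rule: H(X|Y) = H(X,Y) - H(Y)

print(H_X, H_Y, H_X_given_Y)
```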


2. A DMS has a symbol alphabet with |X| = 10. Find the upper and lower bounds on its entropy.
3. A source X has a countably infinite set of outputs with probabilities of occurrence 2^-i, i = 1, 2, 3, ….
What is the entropy of the source?
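
Since p_i = 2^-i gives -log2(p_i) = i, the entropy of Problem 3 is the series sum of i·2^-i. A quick numerical check of that series (plain Python):

```python
# Entropy of the source in Problem 3: H = sum_i p_i * (-log2 p_i) = sum_i (2**-i) * i
H = sum((2.0 ** -i) * i for i in range(1, 60))   # series truncated at i = 59
print(H)                                          # converges to 2 bits/symbol
```
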
4. A source is transmitting two symbols a and b with p(a) = 1/16 and p(b) = 15/16. Design a code that
would provide an efficiency of around (i) 50% (ii) 70%.
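
One standard route for Problem 4 is to Huffman-code blocks of source symbols (the n-th extension) and watch the efficiency H/L grow with n, where L is the average number of code bits per source symbol. The sketch below is illustrative only; huffman_avg_length is a helper written here, not a library routine.

```python
import heapq
import itertools
from math import log2, prod

p = {'a': 1/16, 'b': 15/16}
H = -sum(q * log2(q) for q in p.values())         # source entropy, bits/symbol

def huffman_avg_length(probs):
    """Average codeword length of a binary Huffman code for the given probabilities."""
    heap = [(q, i) for i, q in enumerate(probs)]
    heapq.heapify(heap)
    total = 0.0
    nid = len(probs)
    while len(heap) > 1:
        q1, _ = heapq.heappop(heap)
        q2, _ = heapq.heappop(heap)
        total += q1 + q2                          # each merge adds one bit to all leaves below it
        heapq.heappush(heap, (q1 + q2, nid))
        nid += 1
    return total

for n in (1, 2, 3):
    blocks = [prod(p[s] for s in block) for block in itertools.product('ab', repeat=n)]
    L = huffman_avg_length(blocks) / n            # code bits per original source symbol
    print(n, H / L)                               # efficiency of the n-th extension code
```
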
5. Consider the channels A and B and the cascaded channel AB shown in Fig-1.

Fig – 1

(i) Find the channel capacity of channels A, B, and AB.


(ii) Explain the relationship among the capacities obtained in Part (i).
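
Fig-1 is not reproduced in this text. Purely as an illustration of part (ii), assume A and B are binary symmetric channels with crossover probabilities pa and pb (an assumption, not the channels of the figure); the cascade AB is then a BSC with crossover pa(1 - pb) + pb(1 - pa), and its capacity cannot exceed min(C_A, C_B).

```python
from math import log2

def Hb(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    return 1 - Hb(p)

pa, pb = 0.1, 0.2                       # hypothetical crossover probabilities
p_ab = pa * (1 - pb) + pb * (1 - pa)    # effective crossover of the cascade AB
print(bsc_capacity(pa), bsc_capacity(pb), bsc_capacity(p_ab))
# The cascade capacity never exceeds min(C_A, C_B): processing cannot create information.
```
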
6. Calculate the amount of information needed to open a lock whose combination consists of three
integers, each ranging from 00 to 99.
7. A signal is bandlimited to 5 kHz, sampled at a rate of 10000 samples/s, and quantized to 4 levels
such that the samples can take the values 0, 1, 2 and 3 with probabilities p, p, 0.5 - p and 0.5 - p
respectively. Find the capacity of the channel that has the following channel transition probability
matrix:
p(Y|X) = [ 1    0    0      0
           1    0    0      0
           0    0    0.5    0.5
           0    0    0.5    0.5 ]
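
For Problem 7, the input distribution is fixed by the quantizer up to the parameter p, so the capacity is obtained by maximising I(X;Y) over 0 < p < 0.5 and scaling by the 10000 samples/s rate. A sketch (NumPy assumed; T is the transition matrix as reconstructed above):

```python
import numpy as np

T = np.array([[1, 0, 0,   0  ],
              [1, 0, 0,   0  ],
              [0, 0, 0.5, 0.5],
              [0, 0, 0.5, 0.5]], dtype=float)

def mutual_information(p):
    px = np.array([p, p, 0.5 - p, 0.5 - p])        # input distribution fixed by the quantizer
    py = px @ T                                     # output distribution
    H_Y = -sum(q * np.log2(q) for q in py if q > 0)
    H_Y_given_X = -sum(px[i] * T[i, j] * np.log2(T[i, j])
                       for i in range(4) for j in range(4) if T[i, j] > 0 and px[i] > 0)
    return H_Y - H_Y_given_X                        # I(X;Y) = H(Y) - H(Y|X)

ps = np.linspace(1e-6, 0.5 - 1e-6, 1000)
I = [mutual_information(p) for p in ps]
print(max(I), 10000 * max(I))                       # bits/sample and bits/s
```
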
8. Consider a (127, 92) linear block code capable of triple error correction.
(i) What is the probability of message error for an uncoded block of 92 bits if the channel
symbol error probability is 0.001?
(ii) What is the probability of message error when using (127, 92) block code for the same
channel?
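
A sketch of the usual calculation for Problem 8: an uncoded 92-bit block fails on any bit error, while the (127, 92) code fails only when more than t = 3 of the 127 coded bits are in error (plain Python, using math.comb):

```python
from math import comb

p = 0.001                                            # channel symbol error probability

# (i) Uncoded 92-bit message: any single channel error corrupts the block.
P_uncoded = 1 - (1 - p) ** 92

# (ii) (127, 92) code correcting up to t = 3 errors: the block fails when
#      more than 3 of the 127 transmitted bits are in error.
P_coded = sum(comb(127, j) * p ** j * (1 - p) ** (127 - j) for j in range(4, 128))

print(P_uncoded, P_coded)
```
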
9. Consider the rate-1/2 convolutional encoder of Fig-2. Find the encoder output produced by the message
sequence 10111….

Fig – 2
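
Fig-2 is not reproduced in this text, so the actual shift-register connections are not available here. Purely as a stand-in, the sketch below encodes with the common rate-1/2, constraint-length-3 encoder with generators (7, 5) in octal; with the encoder of Fig-2, only g1 and g2 change.

```python
def conv_encode(bits, g1=(1, 1, 1), g2=(1, 0, 1)):
    """Rate-1/2 convolutional encoder with assumed generators g1 = 111, g2 = 101 (octal 7, 5)."""
    state = [0, 0]                      # two memory elements, initially zero
    out = []
    for b in bits:
        window = [b] + state            # current input followed by past inputs
        v1 = sum(w * g for w, g in zip(window, g1)) % 2
        v2 = sum(w * g for w, g in zip(window, g2)) % 2
        out += [v1, v2]
        state = [b, state[0]]           # shift the register
    return out

print(conv_encode([1, 0, 1, 1, 1]))
```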

10. Consider the following generator matrix over GF(2):

    G = [ 1  0  1  0  0
          1  0  0  1  1
          0  1  0  1  0 ]
(i) Generate all possible codewords.
(ii) Find the check matrix H.
(iii) Find the generator matrix of an equivalent systematic code.
(iv) What is the minimum distance of the code?
(v) How many errors can this code detect? Write the set of error patterns this code can detect.
(vi) How many errors can this code correct?
(vii) Is it a linear code?
(viii) What is the probability of symbol error with this encoding? Compare it with an uncoded
system.
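
For parts (i) and (iv) of Problem 10, a minimal sketch (NumPy assumed; G is the matrix as reconstructed above): multiply every 3-bit message by G over GF(2), and read the minimum distance off as the minimum weight of a nonzero codeword.

```python
import itertools
import numpy as np

G = np.array([[1, 0, 1, 0, 0],
              [1, 0, 0, 1, 1],
              [0, 1, 0, 1, 0]])

codewords = []
for m in itertools.product([0, 1], repeat=3):       # all 2^3 messages
    c = np.array(m) @ G % 2                         # encode over GF(2)
    codewords.append(c)
    print(m, c)

# Minimum distance of a linear code = minimum weight of a nonzero codeword.
d_min = min(int(c.sum()) for c in codewords if c.any())
print("d_min =", d_min)
```
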
11. A few codewords of a linear block code are listed as 11000, 01111, 11110, 01010. List all codewords
of this code.
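
For Problem 11, a linear code is closed under GF(2) addition, so the full codeword list is the closure of the given words (plus the all-zero word) under pairwise addition. A small sketch (closure is a helper written here):

```python
def closure(words):
    """Close a set of binary strings under bitwise modulo-2 addition."""
    code = {'0' * len(next(iter(words)))} | set(words)
    changed = True
    while changed:
        changed = False
        for a in list(code):
            for b in list(code):
                s = ''.join(str((int(x) + int(y)) % 2) for x, y in zip(a, b))
                if s not in code:
                    code.add(s)
                    changed = True
    return sorted(code)

print(closure({'11000', '01111', '11110', '01010'}))
```
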
12. Prove that if the sum of two error patterns is a valid codeword, then the two error patterns have the
same syndrome.
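
The key identity for Problem 12, writing the syndrome of an error pattern e as s = eH^T and using cH^T = 0 for every codeword c:

```latex
s_1 + s_2 = e_1 H^{T} + e_2 H^{T} = (e_1 + e_2)\,H^{T} = c\,H^{T} = 0
\quad\Longrightarrow\quad s_1 = s_2 \ \text{over GF(2)}.
```
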
13. Consider the (8, 4) extended Hamming code with parity check matrix

Check the number of errors in the following received words: 10111011, 00101101, 11011010,
10000101.
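
The parity check matrix referred to in Problem 13 did not survive extraction here. As a stand-in, the sketch below uses one common systematic form of an (8, 4) extended Hamming parity check matrix (an assumption; with the matrix from the sheet, only the array H changes). A nonzero syndrome with odd overall parity points to an odd number of errors (treated as a single, correctable error); a nonzero syndrome with even parity indicates a detected double error.

```python
import numpy as np

# Assumed systematic (8, 4) extended Hamming parity check matrix, H = [P^T | I4]
H = np.array([[0, 1, 1, 1, 1, 0, 0, 0],
              [1, 0, 1, 1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0, 0, 1, 0],
              [1, 1, 1, 0, 0, 0, 0, 1]])

for word in ("10111011", "00101101", "11011010", "10000101"):
    r = np.array([int(b) for b in word])
    s = (H @ r) % 2                     # syndrome
    parity = int(r.sum()) % 2           # overall parity of the received word
    if not s.any():
        verdict = "no detectable error"
    elif parity == 1:
        verdict = "odd number of errors (assume a single error, correctable)"
    else:
        verdict = "double error detected (not correctable)"
    print(word, s, verdict)
```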
