
A.S.Rao Dept. of Electronics & Communication Engg.

Balaji Institute of Engineering & Sciences



Tutorial Sheet 5
Subject: Digital Communication Branch: III ECE II sem

1. The probabilities of four messages are 1/2, 1/4, 1/8 and 1/16. Calculate the entropy. If the source
produces 100 symbols/sec, determine the maximum information rate.
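A quick Python check of the arithmetic (the first two probabilities are illegible in the source and are taken here as 1/2 and 1/4, so treat those values as an assumption):

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits/symbol: H = -sum(p * log2(p))."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Assumed message probabilities (first two reconstructed as 1/2 and 1/4).
probs = [1/2, 1/4, 1/8, 1/16]
H = entropy(probs)   # bits/symbol
R = 100 * H          # information rate at 100 symbols/sec
print(H, R)
```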
2. Using the Shannon-Fano algorithm, find the code words for six messages occurring with probabilities 1/3,
1/3, 1/6, 1/12, 1/24 and 1/24.
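A minimal sketch of the Shannon-Fano procedure in Python. The split point that best balances the two group probabilities is chosen greedily, with ties going to the earlier split, so individual codewords may differ from a hand construction even though the lengths come out the same:

```python
from math import log2

def shannon_fano(symbols):
    """Recursive Shannon-Fano coding: sort by probability, split into two
    groups of nearly equal total probability, prefix '0' / '1'."""
    def split(items):
        if len(items) == 1:
            return {items[0][0]: ""}
        total, acc, k, best = sum(p for _, p in items), 0.0, 1, float("inf")
        for i in range(1, len(items)):
            acc += items[i - 1][1]
            diff = abs(total - 2 * acc)   # imbalance if we split before item i
            if diff < best:
                best, k = diff, i
        left, right = split(items[:k]), split(items[k:])
        return {**{s: "0" + c for s, c in left.items()},
                **{s: "1" + c for s, c in right.items()}}
    return split(sorted(symbols.items(), key=lambda kv: -kv[1]))

probs = {"m1": 1/3, "m2": 1/3, "m3": 1/6, "m4": 1/12, "m5": 1/24, "m6": 1/24}
code = shannon_fano(probs)
avg_len = sum(probs[s] * len(c) for s, c in code.items())
print(code, avg_len)
```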
3. Calculate the capacity of a standard 4 KHz telephone channel working in the range of 300 to 3400
Hz with a SNR of 32 dB.
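The Shannon-Hartley formula C = B log2(1 + SNR), with the usable band 300 to 3400 Hz giving B = 3100 Hz, can be checked numerically:

```python
from math import log2

B = 3400 - 300          # usable bandwidth, Hz
snr = 10 ** (32 / 10)   # 32 dB converted to a linear ratio (~1585)
C = B * log2(1 + snr)   # channel capacity, bits/sec
print(round(C))
```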
4. Let m1, m2, m3 and m4 have probabilities 1/2, 1/4, 1/8 and 1/8.
i) Calculate H.
ii) Find R if r = 1 message per second.
iii) What is the rate at which binary digits are transmitted if the signal is sent after encoding
m1 to m4 as 00, 01, 10 and 11?
5. A transmitter has an alphabet consisting of four letters [ F, G, H and I ]. The joint probabilities for
the communication are given below:

F G H I
A 0.25 0.00 0.00 0.00
B 0.10 0.30 0.00 0.00
C 0.00 0.05 0.10 0.00
D 0.00 0.00 0.05 0.10
E 0.00 0.00 0.05 0.00

Determine the Marginal, Conditional and Joint entropies for this channel.
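The entropies can be read straight off the joint table with a short dependency-free Python sketch. Rows A-E are treated as one variable and columns F-I as the other; the conditional entropies follow from the chain rule H(X,Y) = H(X) + H(Y|X):

```python
from math import log2

# Joint probability table from the problem: rows A-E, columns F-I.
P = [
    [0.25, 0.00, 0.00, 0.00],  # A
    [0.10, 0.30, 0.00, 0.00],  # B
    [0.00, 0.05, 0.10, 0.00],  # C
    [0.00, 0.00, 0.05, 0.10],  # D
    [0.00, 0.00, 0.05, 0.00],  # E
]

def H(probs):
    """Entropy in bits of a probability list (zero entries contribute 0)."""
    return -sum(p * log2(p) for p in probs if p > 0)

row_marg = [sum(r) for r in P]                        # marginal over rows
col_marg = [sum(r[j] for r in P) for j in range(4)]  # marginal over columns
H_joint = H([p for r in P for p in r])
H_row, H_col = H(row_marg), H(col_marg)
H_row_given_col = H_joint - H_col   # H(rows | cols) by the chain rule
H_col_given_row = H_joint - H_row   # H(cols | rows)
print(H_row, H_col, H_joint, H_row_given_col, H_col_given_row)
```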
6. Construct the Optimum Source Code for the following symbols.

X      x1     x2     x3    x4    x5    x6
P(x)   0.3    0.25   0.2   0.1   0.1   0.05

Find the coding efficiency.
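"Optimum source code" here means a Huffman code; a sketch using Python's heapq. Tie-breaking between equal probabilities can change individual codewords, but the average length, and hence the coding efficiency, is unaffected:

```python
import heapq
from math import log2

probs = {"x1": 0.3, "x2": 0.25, "x3": 0.2, "x4": 0.1, "x5": 0.1, "x6": 0.05}

# Build a Huffman code: repeatedly merge the two least-probable nodes,
# prepending one more bit to every symbol inside each merged node.
heap = [(p, i, [s]) for i, (s, p) in enumerate(probs.items())]
heapq.heapify(heap)
code = {s: "" for s in probs}
counter = len(heap)                     # unique tiebreaker for the heap
while len(heap) > 1:
    p1, _, syms1 = heapq.heappop(heap)
    p2, _, syms2 = heapq.heappop(heap)
    for s in syms1: code[s] = "0" + code[s]
    for s in syms2: code[s] = "1" + code[s]
    heapq.heappush(heap, (p1 + p2, counter, syms1 + syms2))
    counter += 1

avg_len = sum(probs[s] * len(code[s]) for s in probs)
H = -sum(p * log2(p) for p in probs.values())
efficiency = H / avg_len                # ~0.986 for this source
print(code, avg_len, efficiency)
```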
7. An independent, discrete source transmits letters selected from an alphabet consisting of 3 letters
A, B and C with respective probabilities 0.7, 0.2 and 0.1. If consecutive letters are statistically
independent and two-symbol words are transmitted, find the entropy of the system of such words.
8. The noise characteristic of a channel is given as:

           | 0.6  0.2  0.2 |
P(Y/X) =   | 0.2  0.6  0.2 |
           | 0.2  0.2  0.6 |

and P(x1) = 1/8, P(x2) = 1/8 and P(x3) = 6/8.
Find the mutual information.
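The mutual information I(X;Y) = H(Y) - H(Y|X) can be computed directly from the transition matrix and the input probabilities:

```python
from math import log2

Px = [1/8, 1/8, 6/8]
# P(y|x): rows are x1..x3, columns y1..y3 (the symmetric matrix above).
Pyx = [[0.6, 0.2, 0.2],
       [0.2, 0.6, 0.2],
       [0.2, 0.2, 0.6]]

# Output distribution P(y) = sum_x P(x) P(y|x).
Py = [sum(Px[i] * Pyx[i][j] for i in range(3)) for j in range(3)]
H_Y = -sum(p * log2(p) for p in Py)
H_Y_given_X = -sum(Px[i] * Pyx[i][j] * log2(Pyx[i][j])
                   for i in range(3) for j in range(3))
I_XY = H_Y - H_Y_given_X   # mutual information in bits
print(Py, I_XY)
```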
9. A source is transmitting messages Q1, Q2 and Q3 with probabilities P1, P2 and P3. Find the condition
for the entropy of the source to be maximum.
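The maximization can be sketched with a Lagrange multiplier:

```latex
\max_{P_1,P_2,P_3}\; H = -\sum_{i=1}^{3} P_i \log_2 P_i
\quad \text{subject to} \quad \sum_{i=1}^{3} P_i = 1 .
% Set the derivative of H + \lambda\left(\sum_i P_i - 1\right) to zero:
% -\log_2 P_i - \tfrac{1}{\ln 2} + \lambda = 0
% \;\Rightarrow\; P_i \text{ is the same for every } i .
```

Since the stationarity condition forces all probabilities to be equal, the entropy is maximum when P1 = P2 = P3 = 1/3, giving H_max = log2 3 ≈ 1.585 bits/message.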
10. Consider an AWGN channel with 4 KHz bandwidth and noise PSD η/2 = 10^-12 W/Hz. The
signal power required at the receiver is 0.1 mW. Find the capacity of the channel.
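A numeric check, assuming the garbled PSD figure reads η/2 = 10^-12 W/Hz (two-sided), so the noise power in a bandwidth B is N = 2 × (η/2) × B:

```python
from math import log2

B = 4000                  # bandwidth, Hz
eta_over_2 = 1e-12        # assumed two-sided noise PSD, W/Hz
N = 2 * eta_over_2 * B    # noise power in the band, W
S = 0.1e-3                # received signal power, W
C = B * log2(1 + S / N)   # Shannon capacity, bits/sec
print(round(C))
```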

11. For the communication system whose channel characteristic is:

           | 1/2  1/4  1/4 |
P(Y/X) =   |  0    1    0  |

Given P(x=0) = P(x=1) = 1/2, verify that I[X, Y] = I[Y, X].
12. For a signal x(t) having a uniform density function in the range 0 ≤ x ≤ 4, find H(X). If the
same signal is amplified by a factor of 8, determine the new H(X).
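For a uniform density on [a, b] the differential entropy is H(X) = log2(b - a); amplifying the signal by 8 stretches the range to [0, 32]:

```python
from math import log2

def h_uniform(a, b):
    """Differential entropy (bits) of a uniform density on [a, b]."""
    return log2(b - a)

h1 = h_uniform(0, 4)       # original signal
h2 = h_uniform(0, 4 * 8)   # after amplification by 8
print(h1, h2)
```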
13. Calculate the average information content in English language, assuming that each of the 26
characters in the alphabet occurs with equal probability.
14. A discrete memoryless source X has four symbols x1, x2, x3 and x4 with probabilities 0.4, 0.3, 0.2
and 0.1 respectively. Find the amount of information contained in the messages x1x2x1x3 and
x4x3x3x2.
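For independent symbols the information in a message is the sum of the self-informations -log2 P of its symbols:

```python
from math import log2

P = {"x1": 0.4, "x2": 0.3, "x3": 0.2, "x4": 0.1}

def message_information(msg):
    """Total self-information (bits) of an independent-symbol message."""
    return sum(-log2(P[s]) for s in msg)

I1 = message_information(["x1", "x2", "x1", "x3"])
I2 = message_information(["x4", "x3", "x3", "x2"])
print(I1, I2)
```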
15. Two binary channels are connected in cascade as shown.




a) Find the overall matrix of the resultant channel.
b) Find P(z1) and P(z2) when P(x1) = P(x2) = 0.5.
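The cascade diagram is not reproduced in this copy, so the matrices below are purely illustrative (two binary symmetric channels with assumed crossover probabilities 0.1 and 0.2); the point is the method: the overall channel matrix is the product of the individual matrices, and the output distribution follows by multiplying the input distribution through:

```python
# Illustrative (assumed) channel matrices; replace with the ones
# from the actual diagram.
P1 = [[0.9, 0.1],
      [0.1, 0.9]]
P2 = [[0.8, 0.2],
      [0.2, 0.8]]

# Overall matrix P = P1 @ P2, written out by hand to stay dependency-free.
P = [[sum(P1[i][k] * P2[k][j] for k in range(2)) for j in range(2)]
     for i in range(2)]

Px = [0.5, 0.5]
Pz = [sum(Px[i] * P[i][j] for i in range(2)) for j in range(2)]
print(P, Pz)
```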
16. An analog signal having a 4 KHz bandwidth is sampled at 1.25 times the Nyquist rate and each
sample is quantized into one of 256 equally likely levels. Assume that the successive samples are
statistically independent.
i) What is the information rate of this source?
ii) Find the bandwidth required of the channel for transmission of the source output, if
the S/N ratio is 20 dB and error-free transmission is required.
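A numeric sketch: the Nyquist rate is 2 × 4 kHz, each sample carries log2 256 = 8 bits, and the channel bandwidth follows from requiring C = B log2(1 + SNR) ≥ R:

```python
from math import log2

B_signal = 4000
fs = 1.25 * 2 * B_signal       # 1.25 x Nyquist rate = 10000 samples/sec
bits_per_sample = log2(256)    # 8 bits for 256 equally likely levels
R = fs * bits_per_sample       # information rate, bits/sec

snr = 10 ** (20 / 10)          # 20 dB -> 100 (linear)
B_channel = R / log2(1 + snr)  # minimum bandwidth for error-free transmission
print(R, round(B_channel))
```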
17. A discrete memoryless source has five equally likely symbols. Construct a Shannon-Fano code for
this source.
18. Two dice are thrown and we are told that the sum of the faces is 7. Find the information content of
this message.
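A quick check by enumerating the 36 equally likely outcomes and taking the self-information of the event:

```python
from math import log2

# Count dice pairs whose faces sum to 7.
favorable = sum(1 for d1 in range(1, 7) for d2 in range(1, 7) if d1 + d2 == 7)
p = favorable / 36   # 6/36 = 1/6
I = -log2(p)         # self-information of the message, in bits
print(favorable, I)
```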




