
Name: _______________    Date: _______________

INFORMATION THEORY
Answer the quiz questions

1. The abbreviation of binary digit is represented by (points: 1)
(a) binit
(b) bin
(c) digit
(d) bit

2. Unit of information is _______ (points: 1)
(a) bytes
(b) bytes/message
(c) bits
(d) bit

3. In a discrete memoryless source, the current letter produced by the source is statistically independent of _____ (points: 1)
(a) Past output
(b) Future output
(c) Both a and b
(d) None of the above

4. Calculate the amount of information if p(k) = 1/2 (points: 1)
(a) 2 bits
(b) 3 bits
(c) 4 bits
(d) none
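A minimal worked sketch for question 4 (Python here is an assumed illustration, not part of the original quiz): the self-information of a message with probability p(k) is I = log2(1/p(k)).

    import math

    def self_information(p):
        """Self-information in bits of a message with probability p."""
        return math.log2(1.0 / p)

    print(self_information(0.5))   # 1.0 bit for p(k) = 1/2
    print(self_information(0.25))  # 2.0 bits for p(k) = 1/4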

5. If M messages are equally likely and independent, the probability of occurrence of each message will be (points: 1)
(a) 1+M
(b) 1=M
(c) 1/M
(d) 1>M

6. If there are M = 2^N equally likely and independent messages, then the amount of information carried by each message will be (points: 1)
(a) N bits
(b) N+1 bits
(c) N-1 bits
(d) none

7. If the receiver knows the message being transmitted, the amount of information carried is (points: 1)
(a) 1 bit
(b) 0 bits
(c) infinite bits
(d) more bits

8. For M equally likely messages, the average amount of information H is (points: 1)
(a) H = log₁₀ M
(b) H = log₂₀ M
(c) H = log₃₀ M
(d) H = log₂ M
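For question 8, a small illustrative sketch (Python assumed, not part of the original quiz) checking that log₂ M agrees with the entropy of M equally likely messages:

    import math

    M = 8
    H_formula = math.log2(M)                                    # H = log2 M
    H_direct  = -sum((1/M) * math.log2(1/M) for _ in range(M))  # -sum p*log2 p over M messages
    print(H_formula, H_direct)                                  # both 3.0 bits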

9. Information rate is defined as (points: 1)
(a) Information per unit time
(b) Average number of bits of information per second
(c) rH
(d) All of the above

10. Entropy is (points: 1)
(a) Average information per message
(b) Information in a signal
(c) Amplitude of signal
(d) All of the above

11. Calculate the entropy for a certain message and for a most rare message (points: 1)
(a) 1, 0
(b) 0, 0
(c) 1, 1
(d) none
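For question 11, a brief sketch (illustrative Python, assumed values) of the entropy contribution -p log₂ p of a single message, evaluated for a certain message and for a very rare one:

    import math

    def entropy_term(p):
        """Entropy contribution -p*log2(p) of one message; 0 by convention when p is 0."""
        return -p * math.log2(p) if 0 < p < 1 else 0.0

    print(entropy_term(1.0))    # 0.0 for a certain message (p = 1)
    print(entropy_term(1e-12))  # ~0.0 for a very rare message (p -> 0)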

12. The expected information contained in a message is called (points: 1)
(a) Entropy
(b) Efficiency
(c) Coded signal
(d) None of the above

13. H(X,Y) = (points: 1)
(a) H(X) + H(Y/X)
(b) H(Y) + H(X/Y)
(c) Both a and b
(d) H(X) + H(X,Y)

14. The relation between entropy and mutual information is (points: 1)
(a) I(X;Y) = H(X) - H(Y)
(b) I(X;Y) = H(X/Y) - H(Y/X)
(c) I(X;Y) = H(X) - H(X/Y)
(d) I(X;Y) = H(Y) - H(X)

15. The mutual information of the channel is (points: 1)
(a) Non-symmetric, positive
(b) Symmetric, positive
(c) Both a and b
(d) none

16. The mutual information, expressed in terms of the joint entropy H(X,Y), is (points: 1)
(a) H(X,Y)
(b) H(X) + H(Y) - H(X,Y)
(c) H(X) + H(Y) + H(X,Y)
(d) None
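For question 16, a hedged sketch (illustrative Python) computing I(X;Y) = H(X) + H(Y) - H(X,Y); the joint distribution p_xy below is an assumed example, not taken from the quiz:

    import math

    def H(probs):
        """Entropy in bits of a list of probabilities."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Illustrative joint distribution p(x, y); rows are x values, columns are y values.
    p_xy = [[0.4, 0.1],
            [0.1, 0.4]]

    p_x     = [sum(row) for row in p_xy]        # marginal of X
    p_y     = [sum(col) for col in zip(*p_xy)]  # marginal of Y
    p_joint = [p for row in p_xy for p in row]  # flattened joint distribution

    I_xy = H(p_x) + H(p_y) - H(p_joint)         # I(X;Y) = H(X) + H(Y) - H(X,Y)
    print(I_xy)                                 # about 0.278 bits for this example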

17. Entropy is basically a measure of (points: 1)
(a) Rate of information
(b) Average of information
(c) Probability of information
(d) Disorder of information

18. The entropy of a discrete random variable X is defined by (points: 1)
(a) H(X) = ∑ p(x) log p(x)
(b) H(X) = -∑ log p(x)
(c) H(X) = ∑ log p(x)
(d) H(X) = -∑ p(x) log p(x)
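For question 18, a minimal sketch of the definition H(X) = -∑ p(x) log₂ p(x), applied to two assumed example distributions (Python used for illustration only):

    import math

    def entropy(probs):
        """H(X) = -sum over x of p(x) * log2 p(x), skipping zero-probability symbols."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))         # 1.0 bit
    print(entropy([0.5, 0.25, 0.25]))  # 1.5 bits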

19. Entropy, H = (points: 1)
(a) Total information / Number of messages
(b) Total information > Number of messages
(c) Total information < Number of messages
(d) Total information = Number of messages

20. The expression for code efficiency in terms of entropy is (points: 1)
(a) Code efficiency = Entropy (H) = Average code word length (N)
(b) Code efficiency = Entropy (H) + Average code word length (N)
(c) Code efficiency = Entropy (H) / Average code word length (N)
(d) Code efficiency = Entropy (H) - Average code word length (N)
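For question 20, a short sketch of code efficiency as the ratio of source entropy H to average code word length N; the symbol probabilities and code word lengths below are assumed, illustrative values (Python used for illustration only):

    import math

    probs   = [0.5, 0.25, 0.125, 0.125]  # assumed source symbol probabilities
    lengths = [1, 2, 3, 3]               # assumed binary code word lengths

    H = -sum(p * math.log2(p) for p in probs)       # source entropy, bits/symbol
    N = sum(p * L for p, L in zip(probs, lengths))  # average code word length
    print(H / N)                                    # code efficiency = H / N, 1.0 here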
