
PARISUTHAM INSTITUTE OF TECHNOLOGY AND SCIENCE

DEPARTMENT OF INFORMATION TECHNOLOGY

IT2302 Information Theory and Coding Technical Questions and Answers

1. The abbreviation of binary digit is ______ a) binit b) bin c) digi d) bit
2. The unit of information is ______ a) bytes b) bytes/message c) bits d) bit
3. When there is more uncertainty about the message, the information carried is ______ a) less b) more c) very less d) both a & b
4. If the receiver already knows the message being transmitted, the amount of information carried is ______ a) 1 b) 0 c) -1 d) 2
5. The amount of information is represented by ______ a) Ik b) pk c) 1/pk d) H
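As a quick illustration of questions 3-5, here is a minimal Python sketch of the amount of information Ik = log2(1/pk); the probabilities used are illustrative assumptions, not values from the question bank.

```python
import math

def information_content(p_k: float) -> float:
    """I_k = log2(1/p_k): information in bits for a message of probability p_k."""
    return math.log2(1 / p_k)

# A message the receiver already knows (p_k = 1) carries zero information
# (question 4), and more uncertain messages carry more (question 3).
for p in (1.0, 0.5, 0.125):
    print(f"p_k = {p}: I_k = {information_content(p):.2f} bits")
```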

6. Average information is represented by ______ a) entropy b) code redundancy c) code efficiency d) code word
7. Average information = ______ a) Total information / No. of messages b) Entropy / No. of messages c) Entropy d) Message rate / No. of messages
8. Information rate is represented by ______ a) r b) rH c) R d) rR
9. The source coding theorem is also known as ______ a) Huffman's 1st theorem b) Shannon's 1st theorem c) Shannon's 2nd theorem d) both a & b
10. The codeword generated by the encoder should be ______ a) digits in nature b) codes in nature c) binary in nature d) values in nature
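A short sketch of the entropy and information-rate formulas behind questions 6-8, where R = rH; the source statistics and the message rate r below are assumed values.

```python
import math

def entropy(probs):
    """Average information H = sum over k of p_k * log2(1/p_k), in bits/message."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

probs = [0.5, 0.25, 0.125, 0.125]   # hypothetical source statistics
H = entropy(probs)                  # entropy, bits/message (question 6)
r = 1000                            # hypothetical message rate, messages/second
R = r * H                           # information rate R = rH (question 8)
print(f"H = {H} bits/message, R = {R} bits/second")
```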

11. The coding efficiency of the source encoder is defined as ______ a) η = Nmin/N b) η = H/N c) η = N·H d) η = H(X²)/N
12. Code redundancy is represented by ______ a) η b) 1 − η c) η² d) σ
13. Code variance is represented by ______ a) η − 1 b) pk c) σ² d) η²
14. Variable-length coding is done by the source encoder to get ______ a) lower efficiencies b) higher efficiencies c) moderate efficiencies d) both a & b
15. A prefix code satisfies the ______ a) McMillan inequality b) Shannon's 1st theorem c) Huffman coding d) Shannon's 2nd theorem
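The following sketch ties questions 11-13 and 15 together for a hypothetical prefix code: it checks the Kraft-McMillan inequality, then computes the efficiency η = H/N, the redundancy 1 − η, and the variance σ². The probabilities and codeword lengths are assumptions chosen for illustration.

```python
import math

probs   = [0.5, 0.25, 0.125, 0.125]   # hypothetical source statistics
lengths = [1, 2, 3, 3]                # codeword lengths of a prefix code

# Kraft-McMillan inequality: sum(2^-l) <= 1 for a uniquely decodable code (question 15).
assert sum(2 ** -l for l in lengths) <= 1

H   = sum(p * math.log2(1 / p) for p in probs)
N   = sum(p * l for p, l in zip(probs, lengths))             # average codeword length
eta = H / N                                                  # coding efficiency (question 11)
red = 1 - eta                                                # code redundancy (question 12)
var = sum(p * (l - N) ** 2 for p, l in zip(probs, lengths))  # code variance (question 13)
print(f"efficiency = {eta}, redundancy = {red}, variance = {var}")
```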

16. The channel is discrete when both X and Y are ______ a) analog b) discrete c) discrete analog d) both a & b
17. The conditional entropy H(Y/X) is called ______ a) uncertainty b) information c) equivocation d) certainty
18. The marginal probability Σ (i = 1 to m) p(xi, yj) = ______ a) p(xi) b) p(yj) c) p(xi, yj) d) p(yj, xi)
19. H(X,Y) = H(X/Y) + ______ a) H(X) b) H(Y) c) H(Y/X) d) H(X,Y)
20. H(X,Y) = H(Y/X) + ______ a) H(X) b) H(Y) c) H(Y/X) d) H(X,Y)
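To make the marginalization and chain rules of questions 18-20 concrete, here is a sketch that verifies H(X,Y) = H(X/Y) + H(Y) and H(X,Y) = H(Y/X) + H(X) on an assumed joint distribution.

```python
import math

def H(probs):
    """Entropy in bits of a list of probabilities."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) over two binary alphabets.
p_xy = [[0.4, 0.1],
        [0.2, 0.3]]
p_x = [sum(row) for row in p_xy]        # sum over j of p(xi, yj)
p_y = [sum(col) for col in zip(*p_xy)]  # sum over i of p(xi, yj) = p(yj)  (question 18)

H_XY = H([p for row in p_xy for p in row])
# Chain rules: H(X,Y) = H(X/Y) + H(Y) (question 19), H(X,Y) = H(Y/X) + H(X) (question 20).
H_X_given_Y = H_XY - H(p_y)
H_Y_given_X = H_XY - H(p_x)
print(H_XY, H_X_given_Y, H_Y_given_X)
```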

21. H(X) = Σ (i = 1 to m) pi log2(______) a) pi b) pk c) 1/pi d) 1/pk
22. The average rate of information going into the channel is given as ______ a) Din = H(X) b) Din = rH(X) c) Din = H(Y) d) Din = rH(Y)
23. The average rate of information transmission Dt across the channel is ______ a) Dt = [H(X) − H(X/Y)] b) Dt = [H(Y) − H(X/Y)] c) Dt = [H(X) − H(X/Y)]r d) Dt = [H(X) + H(X/Y)]
24. In the case of errorless transmission H(X/Y) = 0, hence Din = ______ a) H(X) b) Dt c) H(Y) d) rH(X)
25. Mutual information is represented as ______ a) I(X/Y) b) I(X;Y) c) I(X,Y) d) I(X:Y)
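A sketch of questions 22-24 on a hypothetical noisy channel: it computes the equivocation H(X/Y), then Din = rH(X) and Dt = [H(X) − H(X/Y)]r. The input distribution, transition matrix, and symbol rate r are all assumptions.

```python
import math

def H(probs):
    """H = sum over i of p_i * log2(1/p_i)  (question 21)."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Hypothetical channel: input probabilities and transition matrix p(y|x).
p_x = [0.6, 0.4]
p_y_given_x = [[0.9, 0.1],   # row for x = 0
               [0.2, 0.8]]   # row for x = 1
r = 500                      # hypothetical symbol rate, symbols/second

p_xy = [[p_x[i] * p_y_given_x[i][j] for j in range(2)] for i in range(2)]
p_y  = [sum(p_xy[i][j] for i in range(2)) for j in range(2)]

# Equivocation H(X/Y) = sum over j of p(yj) * H(X | yj).
H_X_given_Y = sum(p_y[j] * H([p_xy[i][j] / p_y[j] for i in range(2)]) for j in range(2))

D_in = r * H(p_x)                  # rate into the channel, Din = rH(X)      (question 22)
D_t  = (H(p_x) - H_X_given_Y) * r  # rate across it, Dt = [H(X) - H(X/Y)]r   (question 23)
print(D_in, D_t)                   # errorless case: H(X/Y) = 0, so Dt = Din (question 24)
```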

26. The mutual information is symmetric: ______ a) I(X;Y) = I(X,Y) b) I(X;Y) = I(Y:X) c) I(X;Y) = I(X:Y) d) I(X;Y) = I(Y;X)
27. I(X;Y) = H(X) ______ a) − H(X) b) − H(X/Y) c) − H(Y/X) d) − H(X,Y)
28. I(X;Y) = H(Y) ______ a) − H(X) b) − H(X/Y) c) − H(Y/X) d) − H(X,Y)
29. Mutual information is always ______ a) +ve b) −ve c) 0 d) both a & c
30. I(X;Y) is related to the joint entropy H(X,Y) by ______ a) I(X;Y) = H(X) − H(X,Y) b) I(X;Y) = H(X) + H(X,Y) c) I(X;Y) = H(X) + H(Y) − H(X,Y) d) I(X;Y) = H(X) − H(Y) − H(X,Y)
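The identities in questions 26-30 can be checked numerically. In this sketch the joint distribution is an assumption; the assertion confirms that I(X;Y) is symmetric and never negative.

```python
import math

def H(probs):
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y).
p_xy = [[0.4, 0.1],
        [0.2, 0.3]]
p_x = [sum(row) for row in p_xy]
p_y = [sum(col) for col in zip(*p_xy)]

H_X, H_Y = H(p_x), H(p_y)
H_XY = H([p for row in p_xy for p in row])

I_1 = H_X - (H_XY - H_Y)   # I(X;Y) = H(X) - H(X/Y)          (question 27)
I_2 = H_Y - (H_XY - H_X)   # I(X;Y) = H(Y) - H(Y/X)          (question 28)
I_3 = H_X + H_Y - H_XY     # I(X;Y) = H(X) + H(Y) - H(X,Y)   (question 30)
assert abs(I_1 - I_2) < 1e-12 and I_3 >= 0  # symmetric, never negative (questions 26, 29)
print(I_1, I_2, I_3)
```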

31. The channel capacity of a discrete memoryless channel is ______ a) C = max over P(xi) of I(X;Y) b) C = max over P(yj) of I(X;Y) c) C = max over P(xi) of I(X:Y) d) C = max over P(xi) of I(Y;X)
32. The channel matrix is otherwise called the ______ a) probability matrix b) transition matrix c) probability transition matrix d) none
33. Entropy H = 0 if pk = ______ a) 0 b) 1 c) -1 d) both a & b
34. For a DMS of entropy H(S), the average codeword length L of a prefix code is bounded as H(S) ≤ L < ______ a) H(S) − 1 b) H(S) = 1 c) H(S) + 1 d) H(S) × 1
35. A prefix code is uniquely decodable, thereby satisfying the ______ a) Kraft-McMillan inequality b) Shannon-Fano c) Huffman d) extended Huffman
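For question 31 (and questions 36-37 below), here is a sketch of the capacity of a binary symmetric channel, where the maximization over P(xi) has the well-known closed form C = 1 − h2(p); the error probabilities tried are illustrative.

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p_err):
    """C = max over P(xi) of I(X;Y); for a BSC the maximum is 1 - h2(p)."""
    return 1.0 - h2(p_err)

# Noise-free channel (p = 0): C = 1 bit/symbol; completely noisy (p = 1/2): C = 0.
for p in (0.0, 0.1, 0.5):
    print(f"p = {p}: C = {bsc_capacity(p):.3f} bits/symbol")
```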

36. When the channel is noise-free, p = ______ a) -2 b) 0 c) 2 d) -1
37. When the channel is completely noisy (in error), p = ______ a) ¼ b) ½ c) ¾ d) 1
38. The difference between the analog signal and its digital representation is ______ a) threshold b) quantization noise c) channel capacity d) DMC
39. Which audio encoding is used for the encoding of audio signals? ______ a) perceptual b) analog c) digital signal d) random signal
40. MPEG stands for ______ a) Media Picture Experts Group b) Motion Picture Experts Group c) Media Picture Export Group d) Media Video Experts Group
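A minimal sketch of quantization noise (question 38): the step size and the sample values below are illustrative assumptions, not values from any standard.

```python
def quantize(x, step=0.25):
    """Uniform quantizer: round the analog sample to the nearest step."""
    return step * round(x / step)

# Quantization noise is the difference between the analog sample
# and its digital (quantized) representation.
for x in (0.11, 0.47, -0.33, 0.88):
    q = quantize(x)
    print(f"analog {x:+.2f} -> digital {q:+.2f}, noise {x - q:+.3f}")
```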

41. MPEG-1 is a standard for the storage and retrieval of moving pictures and ______ a) audio b) message c) channel d) gallery
42. MPEG-2 is a standard for ______ television a) analog b) digital c) UV rays d) LASER
43. MPEG-4 is a standard for ______ applications a) multimedia b) business c) hospital d) army
44. MPEG-7 is a content representation standard for ______ a) threshold b) information search c) quantization noise d) channel capacity
45. The aim of linear predictive coding (LPC) analysis is to estimate V(z) from the ______ a) speech signal b) video signal c) digital signal d) none
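For question 45, here is a sketch of LPC analysis using the autocorrelation (normal-equations) method to estimate predictor coefficients, which model the vocal-tract filter V(z), from a speech-like frame. The synthetic frame and model order are assumptions, and real codecs typically use the Levinson-Durbin recursion rather than a direct solve.

```python
import numpy as np

def lpc_coefficients(frame, order):
    """Autocorrelation-method LPC: solve the normal equations R a = r
    for the linear predictor coefficients."""
    r = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:order + 1])

# Hypothetical "speech" frame: a decaying resonance plus a little noise.
rng = np.random.default_rng(0)
n = np.arange(400)
frame = np.sin(2 * np.pi * 0.07 * n) * np.exp(-n / 200) + 0.01 * rng.standard_normal(400)
print(lpc_coefficients(frame, order=4))
```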

46. JPEG is an image compression standard that was accepted as an international standard in ______ a) 1993 b) 1994 c) 1992 d) 1995
47. The source formats featured in H.261 are ______ a) CIF & QCIF b) TIF & QCIF c) CIF & TIF d) none
48. Input message = output message describes ______ compression a) lossy b) lossless c) both a & b d) none
49. Input message ≠ output message describes ______ compression a) lossy b) lossless c) both a & b d) none
50. Lossy does not necessarily mean loss of ______ a) data b) quality c) both a & b d) none
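Questions 48-49 can be demonstrated with any lossless codec. This sketch round-trips a message through zlib and checks that the input message equals the output message; a lossy codec such as JPEG would not satisfy that equality.

```python
import zlib

message = b"Lossless means the decoded output equals the original input."
compressed = zlib.compress(message)
restored = zlib.decompress(compressed)

assert restored == message   # input message == output message (question 48)
print(len(message), "->", len(compressed), "bytes")
```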

51. A measure of information content is ______ a) channel capacity b) entropy c) DMC d) none

52. An I frame is represented as ______ a) bi-directional predictive-coded b) intra-coded c) predictive-coded d) inter-coded

53. A P frame is represented as ______ a) inter-coded b) bi-directional predictive-coded c) predictive-coded d) both a & c

54. A B frame is represented as ______ a) inter-coded b) intra-coded c) predictive-coded d) bi-directional predictive-coded
55. A set of consecutive frames that can be decoded without any other reference frames is a ______ a) single picture b) group of pictures c) single video d) group of videos
