Elements of
Information
Theory
Solutions Manual
Thomas M. Cover
Department of Electrical Engineering
Stanford University
Stanford, CA 94305
email: cover@isl.stanford.edu

Joy A. Thomas
IBM T.J. Watson Research Center
P.O. Box 704
Yorktown Heights, NY 10598
email: jat@watson.ibm.com

February 13, 1992
Contents
1 Introduction
2 Entropy, Relative Entropy and Mutual Information
3 The Asymptotic Equipartition Property
4 Entropy Rates of a Stochastic Process
5 Source Coding
6 Gambling and Data Compression
7 Kolmogorov Complexity
8 Channel Capacity
9 Differential Entropy
10 The Gaussian Channel
11 Maximum Entropy and Spectral Estimation
12 Information Theory and Statistics
13 Rate Distortion Theory
14 Network Information Theory
15 Information Theory and the Stock Market
16 Inequalities in Information Theory
Preface
The problems in the book, "Elements of Information Theory", were chosen from the problems used during the course at Stanford. Most of the solutions here were prepared by the graders and instructors of the course. We would particularly like to thank Prof. John Gill, David Evans, Jim Roche and Laura Ekroot for their help in preparing these solutions.
Most of the problems in the book are straightforward, and we have included the problem statements for the difficult problems. In some cases, the solutions include material of interest (for example, the problem on coin weighing on Pg. 23 and the problem on gambling with unfair odds on Pg. 90). Exceptions to the general simplicity of the problems include Problems 2.4, 6.7 and 6.8 (the latter two of which require some numerical optimization on a computer) and 13.10.
We will take this opportunity to point out a few errors in the first printing of the book. In Problem 2.35, the problem should read "increases" rather than "decreases". In Problem 4.7, the assumption of stationarity is not required. In Problem 4.14(b), the equality should be replaced by "<". In Problem 4.14(c), the process should be stationary and ergodic. In Problem 7.6, the problem should only ask to show that the output entropy of the computer is infinite, not that it is equal to the input entropy. More detailed explanations of these corrections can be found in the appropriate solutions.
We would appreciate any comments, suggestions and corrections to this Solutions Manual.
Tom Cover
Durand 121, Information Systems Lab
Stanford University
Stanford, CA 94305
Ph. 415-723-4505
FAX: 415-723-8473
Email: cover@isl.stanford.edu

Joy Thomas
IBM T.J. Watson Research Center
P.O. Box 704
Yorktown Heights, NY 10598
Ph. 914-784-6880
FAX: 914-784-7455
Email: jat@watson.ibm.com