
Communications Engineering (EE321B)
Entropy, Energy and Systems

Lecture 5
March 28, 2006


Overview

Entropy, Thermodynamics and Information (Lecture 5)


(Haykin, Chapter 9)

- Thermodynamics
- Information
- Entropy
- Relevance to communications


2nd Law of Thermodynamics


- Thermodynamic entropy of an isolated system is non-decreasing over time
- Entropy ~ log of # of states

LLLLLLLLLLLLLLLLLLLL  (all particles on the left: one of very few such states, low entropy)

LRLLLRRLLRRRLRLRLRL  (mixed left/right: one of very many such states, high entropy)


Entropy

Binary entropy function:

H(p) = -p log2 p - (1 - p) log2 (1 - p)

[Plot: binary entropy H(p) vs. p; H(p) rises from 0 at p = 0 to a peak of 1 bit at p = 0.5, then falls back to 0 at p = 1]

General case:

H(X) = -Σ p(x) log2 p(x)

Stirling's formula: ln n! ≈ n ln n - n
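As a quick illustration (not part of the original slides), the binary entropy function shown above can be computed directly; the function name `binary_entropy` is illustrative:

```python
import math

def binary_entropy(p):
    """H(p) = -p log2 p - (1 - p) log2 (1 - p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Symmetric about p = 0.5, where it peaks at exactly 1 bit.
print(binary_entropy(0.5))   # 1.0
print(binary_entropy(0.1))   # ≈ 0.469
```

Note that H(p) = H(1 - p): a coin with P(heads) = 0.1 is exactly as uncertain as one with P(heads) = 0.9.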


Poincaré's Infinite Recurrence

- Poincaré recurrence theorem: a finite, isolated system eventually returns arbitrarily close to its initial (low-entropy) state
- No practical contradiction with the 2nd law: recurrence times are astronomically long for macroscopic systems


Arrow of Time

(Almost) all physical theories have time symmetry


- Newtonian mechanics
- General relativity
- Quantum mechanics

Why does glass break, but never spontaneously re-assemble itself?
Why do we feel that time flows from past to future?
Why do we remember the past but not the future?

Then why do we perceive the arrow of time?

- Because we are living in a low-entropy condition
  - Otherwise life could not exist, and we would not exist to ask the question
- We simply call the lower-entropy state "past" and the higher-entropy state "future"


Arrow of Time

[Sketch: entropy vs. time; entropy increases in the direction we label "future", and the lower-entropy direction is what we call the "past"]


Information

Why care about entropy?


More randomness → More possible states → More unpredictability → Higher entropy → More information

More information = a greater increase in your knowledge when you receive the message


Randomness, Entropy and Information


How random are the following sequences?

000000000000000000000000000000  ("Repeat 0 30 times")
010101010101010101010101010101  ("Repeat 01 15 times")
000100000100010000000010000010
011001010111010100001011011101  ("Print 011001010111010100001011011101")

Which ones are easier to describe?

Shorter description → Less randomness → Lower entropy → Less information
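One crude way to see "shorter description = lower entropy" (an illustration, not from the slides) is to run a general-purpose compressor over the four sequences. zlib's fixed header overhead makes absolute sizes on 30-character strings rough, but the structured sequences still compress to less than the irregular one:

```python
import zlib

sequences = {
    "constant":  "0" * 30,
    "periodic":  "01" * 15,
    "sparse":    "000100000100010000000010000010",
    "irregular": "011001010111010100001011011101",
}

# Compressed size is a (very rough) proxy for description length.
sizes = {name: len(zlib.compress(s.encode(), 9)) for name, s in sequences.items()}
for name, s in sequences.items():
    print(f"{name:9s} {s} -> {sizes[name]} bytes")
```

A dedicated description ("repeat 0 30 times") is shorter still; Kolmogorov complexity formalizes this idea, and zlib only approximates it.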


Entropy

Entropy: measure of uncertainty, randomness

- Tossing a fair coin: 1 bit of information
- Tossing two fair coins: 2 bits of information
- Rolling a fair die: log2 6 ≈ 2.58 bits of information
  - One roll: 6 outcomes (1, 2, 3, 4, 5, 6)
  - Two rolls: 36 outcomes (11, 12, 13, ..., 16, 21, 22, ..., 66)
  - Three rolls: 216 outcomes (111, 112, ..., 666)
  - Asymptotically, log2 6 bits per roll

Bent coin?
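The coin and die examples above, and the bent-coin question, all follow from the general entropy formula; a small sketch (illustrative, not from the slides):

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(entropy([0.25] * 4))   # two fair coins: 2.0 bits
print(entropy([1/6] * 6))    # fair die: log2 6 ≈ 2.585 bits
print(entropy([0.9, 0.1]))   # bent coin: ≈ 0.469 bits, i.e. less than a fair coin
```

The bent coin answers the slide's question: a biased coin carries strictly less than 1 bit per toss, because its outcomes are partly predictable.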


Binary Source Example


How do you compress an i.i.d. Bernoulli(p) sequence?

000100000100010000000010000010  (n bits long)

- Specify k = # of ones (about log2 n bits)
- Specify the index of the sequence among the C(n, k) sequences with exactly k ones (about log2 C(n, k) bits)
- Need about log2 n + log2 C(n, k) ≈ n H(p) bits, by Stirling's formula
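The Stirling approximation log2 C(n, k) ≈ n H(k/n) can be checked numerically; a sketch (illustrative, not from the slides; `n = 1000`, `k = 100` are arbitrary example values):

```python
import math

def binary_entropy(p):
    """H(p) = -p log2 p - (1 - p) log2 (1 - p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

n, k = 1000, 100                       # n-bit sequence containing k ones (p = 0.1)
exact = math.log2(math.comb(n, k))     # bits needed to index one such sequence
approx = n * binary_entropy(k / n)     # Stirling: log2 C(n, k) ≈ n H(k/n)
print(exact, approx)                   # the two agree to within a few percent
```

The lower-order correction terms vanish per-bit as n grows, which is why compression to H(p) bits per symbol is achievable asymptotically.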


Huffman Codes

Example:

X   p(X)   C(X)
1   0.3    00
2   0.25   10
3   0.2    11
4   0.15   010
5   0.1    011

Repeatedly merge the two least-probable symbols, labeling the two branches 0 and 1:

0.3, 0.25, 0.2, 0.15, 0.1
→ 0.3, 0.25, 0.25, 0.2    (0.15 + 0.1)
→ 0.45, 0.3, 0.25         (0.25 + 0.2)
→ 0.55, 0.45              (0.3 + 0.25)
→ 1.0

This algorithm is in fact optimal: no prefix-free code has a smaller expected length.
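The merge procedure above can be sketched in Python with a heap (the function name and data layout are illustrative, not from the slides). On the slide's distribution it reproduces code lengths {2, 2, 2, 3, 3} and an average length of 2.25 bits, slightly above the source entropy of about 2.228 bits:

```python
import heapq
from itertools import count

def huffman(probs):
    """Build a Huffman code for {symbol: probability}; returns {symbol: codeword}."""
    tiebreak = count()  # unique counter so the heap never compares symbol lists
    heap = [(p, next(tiebreak), [sym]) for sym, p in probs.items()]
    codes = {sym: "" for sym in probs}
    heapq.heapify(heap)
    while len(heap) > 1:
        # Merge the two least-probable subtrees, prefixing their codewords.
        p0, _, syms0 = heapq.heappop(heap)
        p1, _, syms1 = heapq.heappop(heap)
        for s in syms0:
            codes[s] = "0" + codes[s]
        for s in syms1:
            codes[s] = "1" + codes[s]
        heapq.heappush(heap, (p0 + p1, next(tiebreak), syms0 + syms1))
    return codes

probs = {1: 0.3, 2: 0.25, 3: 0.2, 4: 0.15, 5: 0.1}
codes = huffman(probs)
avg = sum(probs[s] * len(codes[s]) for s in probs)
print(codes, avg)
```

Tie-breaking may yield different codewords than the slide's table, but any Huffman code for this distribution has the same code lengths and the same 2.25-bit average.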


Modern Digital Communication


Two-stage architecture:

- Source coding: compression (e.g., MP3, JPEG)
- Channel coding: error correction

- Originally proposed by Shannon in 1948
- Used in almost all modern digital communication systems

Information Source → Source encoder → Channel encoder → Noisy channel → Channel decoder → Source decoder → Information Sink


Source Coding
[Cartoon: "Alice drank from the bottle that said COMPRESS ME" → 1010110101001 0001011011010 1000110101...]

Lossy compression

Question: What is the best we can do?

- Minimum compressed size for a given distortion?
- Minimum distortion for a given compressed size?

Source Coding

Without source coding, none of the following would have been successful:

- Digital cell phones
- MP3 players, PMPs
- Digital multimedia broadcasting (DMB)
- Voice over IP
- Video on demand
- Digital cameras, digital camcorders

Typical numbers:

- 5:1 compression for text
- 10:1 compression for audio
- 100:1 compression for video


Channel Coding

- Channel coding is also essential
- Channel coding typically yields more than a 10x increase in capacity
- Without source & channel coding, capacity would drop to 1/100 ~ 1/1,000 of what it is
- No digital revolution would have been possible without source and channel coding!


Analog vs. Digital


Telephone: can carry ~10 digital voice calls over one phone line

- Channel BW: 4 kHz
- Source BW of analog signal: 3.4 kHz
- Channel capacity: ~50 kbps
- Source rate before compression: 8 bits × 8 kHz = 64 kbps
- Source rate after compression: ~5 kbps

Video: can carry ~6 digital video streams over one video channel

- Channel BW: 6 MHz
- Source BW of analog signal: 4 MHz
- Channel capacity: ~60 Mbps
- Source rate before compression: ~200 Mbps
- Source rate after compression: ~10 Mbps
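The call and stream counts are just channel capacity divided by the compressed source rate; a trivial check using the slide's approximate figures (all numbers are the slide's rough estimates, not measured values):

```python
# Telephone: one analog line carries ~50 kbps; compressed voice needs ~5 kbps
# (down from 8 bits x 8 kHz = 64 kbps uncompressed PCM).
phone_capacity_kbps = 50
voice_rate_kbps = 5
calls_per_line = phone_capacity_kbps // voice_rate_kbps
print(calls_per_line)       # 10 voice calls per line

# Video: one 6 MHz channel carries ~60 Mbps; compressed video needs ~10 Mbps
# (down from ~200 Mbps uncompressed).
video_capacity_mbps = 60
video_rate_mbps = 10
streams_per_channel = video_capacity_mbps // video_rate_mbps
print(streams_per_channel)  # 6 video streams per channel
```

Without compression, neither source would fit: 64 kbps > 50 kbps and 200 Mbps > 60 Mbps, so analog-era channels could carry at most one (or zero) uncompressed digital streams.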

