I. Introduction
A communication system can be modeled by the block diagram of Figure 1.

[Figure 1: Block diagram of a communication system]

We will attack the following three basic problems:

The Communication Problem:
What is the message?
In the binary case the message is one of two hypotheses, $H_0$ or $H_1$, with a priori probability $P(H_1) = p_1$.

Examples: Morse symbols (dot/dash), compact disc (pit/no pit), laser communication (pulse of light: yes/no).
Related problems:
- Optimal signal design under constraints (bandwidth, energy, power)
- The M-ary detection problem
- Detection with unknown parameters
In the detection problem, two kinds of error are possible:

$$P(\text{decide } H_1 \mid H_0), \qquad P(\text{decide } H_0 \mid H_1)$$
Examples:
- Delay estimation: $r(t) = s(t - D) + n(t)$, where the delay $D$ is unknown
- Phase estimation: $r(t) = A\cos(\omega t + \theta) + n(t)$, where the phase $\theta$ is unknown
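As a concrete illustration of delay estimation, the delay $D$ can be recovered by cross-correlating the received signal with the known pulse and locating the peak. A minimal Python sketch; the pulse shape, sampling rate, delay, and noise level are assumptions for illustration, not values from the notes:

```python
import numpy as np

rng = np.random.default_rng(0)

fs = 1000                          # sampling rate (Hz), assumed
t = np.arange(0, 1, 1 / fs)        # 1-second observation window
s = np.exp(-((t - 0.1) ** 2) / (2 * 0.01 ** 2))  # known pulse s(t), assumed Gaussian

D = 0.3                            # true delay (s), unknown to the receiver
r = np.roll(s, int(D * fs)) + 0.1 * rng.standard_normal(t.size)  # r(t) = s(t-D) + n(t)

# Cross-correlate r with s; the peak lag estimates D
corr = np.correlate(r, s, mode="full")
lag = np.argmax(corr) - (s.size - 1)
D_hat = lag / fs
print(f"true D = {D:.3f} s, estimated D = {D_hat:.3f} s")
```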
The optimal estimator $\hat{\theta}$ minimizes the mean-square error:

$$E\big[(\theta - \hat{\theta})(\theta - \hat{\theta})^T\big] \rightarrow \min$$
$$x(t) = \sum_i a_i\, s(t - iT)$$

$$r(t) = x(t) * h(t) + n(t)$$
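A short sketch of this signal model, building the pulse train $x(t)$ and passing it through a channel filter; the rectangular pulse, two-tap channel, and noise level are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

sps = 8                                  # samples per symbol period T, assumed
a = rng.choice([-1, 1], size=20)         # symbol sequence a_i
s = np.ones(sps)                         # pulse s(t): rectangular over one period

# x(t) = sum_i a_i s(t - iT): place the symbols T apart and shape with the pulse
impulses = np.zeros(a.size * sps)
impulses[::sps] = a
x = np.convolve(impulses, s)

# r(t) = x(t) * h(t) + n(t): a simple two-tap channel plus white Gaussian noise
h = np.array([1.0, 0.5])
r = np.convolve(x, h) + 0.05 * rng.standard_normal(x.size + h.size - 1)
print(r[:10])
```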
The Source
Binary Memoryless Source (BMS): $A = \{0, 1\}$.

Every $T$ seconds the source emits a random variable that is independent of all preceding and succeeding outputs:

$$s = (s_1, s_2, \ldots), \quad s_i \in A = \{0, 1\}, \quad p(s_i = a \mid s \setminus s_i) = p(s_i = a)$$

e.g. $s = (101011011011\ldots)$.

Discrete Memoryless Source (DMS):

$$s = (s_1, s_2, \ldots, s_i, \ldots), \quad s_i \in A = \{a_1, a_2, \ldots, a_q\}, \quad p(s_i \mid s \setminus s_i) = p(s_i)$$

e.g. $A = \{-3, -1, 1, 3\}$, $s = (3, -1, 3, 1, 1, -3, 1, -1, -3, 3, \ldots)$.
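A memoryless source is straightforward to simulate by drawing i.i.d. symbols. A minimal sketch using the four-letter alphabet above, with the letter probabilities assumed uniform:

```python
import numpy as np

rng = np.random.default_rng(2)

A = [-3, -1, 1, 3]                  # DMS alphabet
p = [0.25, 0.25, 0.25, 0.25]        # letter probabilities, assumed uniform

# Memoryless: each output is drawn independently of all preceding outputs
s = rng.choice(A, size=12, p=p)
print(s)
```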
Entropy of Source
The average amount of information per source output is defined as

$$H(A) = -\sum_{i=1}^{q} p_i \log(p_i)$$
The source with equiprobable output symbols, $p_i = \mathrm{Prob}(s_n = a_i) = \frac{1}{|A|}$, has the greatest entropy:

$$H(A) = \log|A|$$
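A quick numerical check of the entropy formula and of the equiprobable maximum (a minimal sketch; the skewed distribution is an illustrative assumption):

```python
import numpy as np

def entropy(p):
    """H(A) = -sum_i p_i log2(p_i), in bits; zero-probability terms contribute 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

q = 4
print(entropy([0.5, 0.25, 0.125, 0.125]))   # 1.75 bits for a skewed source
print(entropy(np.full(q, 1 / q)))           # log2(4) = 2 bits, the maximum
```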
A simple code
A = 00, B = 01, C = 10, D = 11
Can we do better?
What about this code?
A = 1, B = 01, C = 001, D = 000
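Whether the second code is better depends on the letter probabilities. A sketch comparing the average codeword lengths of the two codes under an assumed skewed distribution (the probabilities are illustrative, not from the notes):

```python
# Assumed letter probabilities for A, B, C, D (illustrative)
p = {"A": 0.5, "B": 0.25, "C": 0.125, "D": 0.125}

fixed = {"A": "00", "B": "01", "C": "10", "D": "11"}
variable = {"A": "1", "B": "01", "C": "001", "D": "000"}

for name, code in [("fixed", fixed), ("variable", variable)]:
    avg_len = sum(p[a] * len(code[a]) for a in p)
    print(f"{name}: average length = {avg_len} bits/letter")
# fixed: 2.0 bits/letter; variable: 1.75 bits/letter, which equals H(A)
# for this particular distribution
```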
Entropy
$$0 \le H(A) \le \log|A|$$

with equality on the right iff $p_i = \mathrm{Prob}(s_n = a_i) = \frac{1}{|A|}$.

[Figure: the binary entropy function, $H$ versus $\mathrm{Prob}(a = 0)$]

$\log|A|$ is the maximum number of bits required to represent a source output. $H(A)$ is a concave function of the symbol probabilities.
The Transmitter
A transmitter is a mapping from the message set A to a set of waveforms.
$$a_i \mapsto s_i(t), \qquad 0 \le t \le T$$

The waveforms $s_i(t)$, $i = 1, \ldots, Q$, have finite duration $T$. For a DMS with uniform letter probabilities, the transmitter rate is $R = \frac{\log_2 Q}{T}$ bits/sec.
FSK Transmitter
[Figure: FSK transmitter; bit 0 is sent as pulse $p_0(t)$, bit 1 as pulse $p_1(t)$ at frequency $f_1$]

For the bit sequence $1, 0, 1, 1, 0, \ldots$ the transmitted waveform is

$$p_1(t) + p_0(t - T) + p_1(t - 2T) + p_1(t - 3T) + p_0(t - 4T) + \cdots$$
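A sketch that builds this waveform by concatenating the two pulses according to the bit sequence; the tone frequencies, symbol period, and sampling rate are assumed values:

```python
import numpy as np

fs, T = 8000, 0.01                 # sampling rate (Hz) and symbol period (s), assumed
t = np.arange(0, T, 1 / fs)
f0, f1 = 1000, 2000                # assumed tone frequencies for bits 0 and 1

p0 = np.cos(2 * np.pi * f0 * t)    # p0(t): pulse for bit 0
p1 = np.cos(2 * np.pi * f1 * t)    # p1(t): pulse for bit 1

bits = [1, 0, 1, 1, 0]             # the sequence from the waveform above
x = np.concatenate([p1 if b else p0 for b in bits])
print(x.shape)                     # 5 symbols * 80 samples per symbol
```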
The waveforms are subject to energy and spectral constraints:

$$E_i = \int_0^T s_i^2(t)\,dt \le E_{\max}$$

$$|S_i(f)| = |\mathcal{F}\{s_i(t)\}| \le A_{\max} \quad \text{for } f \notin [f_1, f_2]$$
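Both constraints are easy to check numerically: a Riemann sum approximates the energy integral and the FFT gives the magnitude spectrum. A minimal sketch with an assumed example waveform:

```python
import numpy as np

fs, T = 8000, 0.01                 # assumed sampling rate and duration
t = np.arange(0, T, 1 / fs)
s = np.cos(2 * np.pi * 1000 * t)   # an example waveform s_i(t)

# Energy constraint: E_i = integral of s_i^2(t) dt over [0, T]
E = np.sum(s ** 2) / fs            # Riemann-sum approximation
print(f"E = {E:.4f} (T/2 = {T / 2:.4f} for a unit-amplitude tone)")

# Spectral constraint: inspect |S_i(f)| via the FFT
S = np.abs(np.fft.rfft(s)) / fs
f = np.fft.rfftfreq(s.size, 1 / fs)
print(f"|S(f)| peaks at f = {f[np.argmax(S)]:.0f} Hz")
```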
The Channel

The channel is modeled as a linear time-invariant filter: the input $s(t)$ produces the output

$$y(t) = s(t) * h(t) = \int_{-\infty}^{\infty} h(\tau)\, s(t - \tau)\, d\tau = \mathcal{F}^{-1}[H(\omega)\, S(\omega)]$$
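A small numerical check of the time/frequency equivalence $y(t) = s(t) * h(t) \Leftrightarrow Y(\omega) = H(\omega) S(\omega)$; the pulse and channel taps are assumptions:

```python
import numpy as np

s = np.array([1.0, 1.0, 1.0, 1.0])     # example input s(t)
h = np.array([0.5, 0.3, 0.2])          # example channel impulse response h(t)

# Time domain: y = s * h (linear convolution)
y_time = np.convolve(s, h)

# Frequency domain: Y(w) = H(w)S(w), with FFTs zero-padded to the output length
n = s.size + h.size - 1
y_freq = np.fft.ifft(np.fft.fft(s, n) * np.fft.fft(h, n)).real

print(np.allclose(y_time, y_freq))     # True: the two computations agree
```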
The Noise

We assume a random process $n(t)$ that is added to the received signal and is statistically independent of the channel and of the transmitted signal.

Examples: white Gaussian noise, colored Gaussian noise.
The Receiver
The receiver estimates the transmitted information from the received signal

$$r(t) = y(t) + n(t) = s(t) * h(t) + n(t)$$

Cost function: the probability of error. If $N$ messages were transmitted and the receiver made $N_e$ errors, then

$$P_e = \lim_{N \to \infty} \frac{N_e}{N}$$
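In simulation, $P_e$ is estimated exactly as this limit suggests: count errors over many transmissions. A sketch for antipodal signaling over an additive white Gaussian noise channel (signal levels and noise variance are assumptions, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(3)

N = 100_000
bits = rng.integers(0, 2, size=N)
s = 2.0 * bits - 1.0                     # map bit 0 -> -1, bit 1 -> +1
r = s + rng.standard_normal(N)           # unit-variance AWGN
decisions = (r > 0).astype(int)          # threshold receiver

Ne = np.count_nonzero(decisions != bits)
Pe = Ne / N                              # estimate of lim N_e / N
print(f"Pe ~ {Pe:.4f} (theory: Q(1) ~ 0.1587)")
```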
Main parameters

In the design of a communication system we are interested in the following parameters:
- $P_e$ - the error probability
- $R$ - the source rate in bits/sec
- $S/N$ - the signal-to-noise ratio
- $BW$ - the channel bandwidth constraint
The receiver is judged by the average cost

$$E[W] = \sum_{k=1}^{Q} \sum_{i=1}^{Q} p(i,k)\, W(D(r=i), k) = \sum_{k=1}^{Q} p_k \sum_{i=1}^{Q} p(i \mid k)\, W(D(r=i), k)$$

where
- $p(i,k) = \mathrm{Prob}(\text{the received signal is } i \text{ and } k \text{ was transmitted})$
- $p(i \mid k) = \mathrm{Prob}(\text{the received signal is } i \mid k \text{ was transmitted})$
- $p_k$ = the a priori probability that $k$ was transmitted
- $D(r = i)$ = the decision rule when $r = i$
With the error-counting cost, $W(D(r=i), k) = 1$ if $D(r=i) \ne k$ and $0$ otherwise, the average cost equals the probability of error:

$$E[W(i,k)] = \sum_{k=1}^{Q} \sum_{i=1}^{Q} p(i,k)\, W(D(r=i), k) = \sum_{k=1}^{Q} p_k \sum_{i=1}^{Q} p(i \mid k)\, W(D(r=i), k) = \sum_{k=1}^{Q} p_k \sum_{i:\, D(r=i) \ne k} p(i \mid k) = P_e$$
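A small numerical check of this identity: with the error-counting cost, averaging $W$ over $p(i,k)$ reproduces the error probability of the decision rule. The channel matrix and priors below are assumed for illustration:

```python
import numpy as np

Q = 2
pk = np.array([0.7, 0.3])                   # a priori probabilities p_k (assumed)
p_i_given_k = np.array([[0.9, 0.2],         # p(i|k): row = received i, column = sent k
                        [0.1, 0.8]])        # (assumed; each column sums to 1)

D = np.argmax(p_i_given_k * pk, axis=1)     # MAP decision D(r=i) for each i

# E[W] = sum_k p_k sum_i p(i|k) W(D(r=i), k), with W = 1 iff D(r=i) != k
W = (D[:, None] != np.arange(Q)).astype(float)
EW = np.sum(pk * np.sum(p_i_given_k * W, axis=0))

# Direct Pe: sum of p_k p(i|k) over pairs (i, k) with D(r=i) != k
Pe = sum(pk[k] * p_i_given_k[i, k] for i in range(Q) for k in range(Q) if D[i] != k)
print(EW, Pe)                               # the two values are equal
```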
The objective of the optimal receiver is to choose, for each received symbol, the decision that minimizes the probability of error.
Consider a binary channel with inputs $\{0, 1\}$ and outputs $\{x, y\}$:

[Figure: binary channel with transition probabilities $P(x \mid 0) = 1 - p$, $P(y \mid 0) = p$, $P(y \mid 1) = 1 - q$, $P(x \mid 1) = q$]
When $r = x$:

$$\frac{P(r = x \mid 0)}{P(r = x \mid 1)} = \frac{1 - p}{q} \;\underset{D(x) = 0}{\overset{D(x) = 1}{\lessgtr}}\; \frac{p_1}{p_0}$$

i.e., decide $D(x) = 1$ when the likelihood ratio is below $p_1/p_0$, and $D(x) = 0$ otherwise.
Similarly, when $r = y$:

$$\frac{P(r = y \mid 0)}{P(r = y \mid 1)} = \frac{p}{1 - q} \;\underset{D(y) = 0}{\overset{D(y) = 1}{\lessgtr}}\; \frac{p_1}{p_0}$$
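The two likelihood-ratio tests can be written as a single decision function. A minimal sketch; the values of $p$, $q$, and $p_1$ in the example are assumptions:

```python
def map_decide(r, p, q, p1):
    """MAP decision for the binary channel above.

    P(x|0) = 1-p, P(y|0) = p, P(y|1) = 1-q, P(x|1) = q.
    Decide 1 when P(r|0)/P(r|1) < p1/p0, else decide 0.
    """
    p0 = 1.0 - p1
    if r == "x":
        ratio = (1.0 - p) / q          # P(r=x|0) / P(r=x|1)
    else:
        ratio = p / (1.0 - q)          # P(r=y|0) / P(r=y|1)
    return 1 if ratio < p1 / p0 else 0

# Example with assumed values: a fairly clean channel and equal priors
print(map_decide("x", p=0.1, q=0.1, p1=0.5))   # 0: output x points to input 0
print(map_decide("y", p=0.1, q=0.1, p1=0.5))   # 1: output y points to input 1
```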
The resulting probability of error is

$$P_e = \sum_{k=0}^{1} \big[\, p(x,k)\, W(D(x), k) + p(y,k)\, W(D(y), k) \,\big]$$

$$= p_0 \big[ p(x \mid 0)\, W(D(x), 0) + p(y \mid 0)\, W(D(y), 0) \big] + p_1 \big[ p(x \mid 1)\, W(D(x), 1) + p(y \mid 1)\, W(D(y), 1) \big]$$

$$= p_0\, p(x \mid 0)\, W(D(x), 0) + p_1\, p(x \mid 1)\, W(D(x), 1) + p_0\, p(y \mid 0)\, W(D(y), 0) + p_1\, p(y \mid 1)\, W(D(y), 1)$$
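Evaluating this expansion for an assumed channel gives the error probability of the decision rule directly; a minimal sketch using the decisions $D(x) = 0$, $D(y) = 1$ that the likelihood-ratio tests produce for these values:

```python
# Pe = p0 [P(x|0) W(D(x),0) + P(y|0) W(D(y),0)]
#    + p1 [P(x|1) W(D(x),1) + P(y|1) W(D(y),1)],  W = 1 iff the decision is wrong
p, q, p1 = 0.1, 0.2, 0.5           # assumed channel parameters and prior
p0 = 1.0 - p1

cond = {("x", 0): 1 - p, ("y", 0): p, ("x", 1): q, ("y", 1): 1 - q}
D = {"x": 0, "y": 1}               # decisions from the likelihood-ratio tests above

Pe = sum(pr * cond[(r, k)] for r in ("x", "y") for k, pr in ((0, p0), (1, p1))
         if D[r] != k)
print(Pe)                          # p0*p + p1*q = 0.05 + 0.10 = 0.15
```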