
Systems for Digital Signal Processing
4 – Discrete-Time Random Processes
• Basic definitions
• Independence and strict-sense stationarity (SSS)
• Expectation and statistical moments
• Uncorrelatedness and wide-sense stationarity (WSS)
• Ergodic processes
• Representation of WSS processes in the frequency domain
• Transformation of WSS processes through LTI systems

Definitions - 1
Def: A discrete-time random process is an ordered set of random variables
xn, where n denotes time.

Def: A random variable is a variable whose value is subject to random
variations. A random variable can take on a set of possible different values,
each with an associated probability.
Two types of random variables (r.v.) exist:

• Continuous r.v. are described by probability density functions (pdf), f(xn)
• Discrete r.v. are described by probability mass functions (pmf), p(xn)
• Every point of a pmf gives the probability that the r.v. takes on a given
value xn; every point of a pdf gives the corresponding probability density

[Example figures: a smooth pdf f(xn) and a pulse-train pmf p(xn), both plotted versus xn]
Definitions - 2
Def: A cumulative distribution function (cdf) is defined as:

Continuous r.v.:  F(x) = Pr(xn ≤ x) = ∫_{−∞}^{x} f(xn) dxn
Discrete r.v.:    F(x) = Pr(xn ≤ x) = Σ_i p(xni)  over all i such that xni ≤ x

[Example figures: a continuous cdf and a staircase cdf, both growing from 0 to 1 versus xn]
• The cumulative distribution functions grow monotonically between 0 and
1. In the case of discrete r.v. the cdf exhibits a stair-case shape. The step
amplitude is equal to the corresponding pulse amplitude in the pmf

• Hybrid r.v. (partially discrete and partially continuous) may also exist
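The staircase cdf of a discrete r.v. is just the running sum of its pmf. A minimal sketch (the 4-point pmf below is a made-up example):

```python
import numpy as np

# Hypothetical pmf of a discrete r.v. taking the values 0..3
values = np.array([0, 1, 2, 3])
pmf = np.array([0.1, 0.4, 0.3, 0.2])   # probabilities sum to 1

# The cdf at each value is the running sum of the pmf:
# F(x) = Pr(xn <= x), a staircase growing monotonically to 1,
# with step amplitudes equal to the pmf pulse amplitudes
cdf = np.cumsum(pmf)

print(cdf)
```

The monotonic growth from 0 to 1 and the pulse-sized steps described above can be read off directly from `cdf`.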
Definitions - 3
[Diagram: a DT random process generates random sequences for any n]

Each sequence is a «realization» of the random process
• The relationship between 2 or more r.v. extracted from a sequence is
described by the
– Joint probability density functions (continuous r.v.): f(xn, xm, …, xs)
– Joint probability mass functions (discrete r.v.): p(xn, xm, …, xs)
• These functions give the probability that a tuple of r.v. takes on a given set
of values within the specified range.
Joint cdf (2-variable case)

Continuous r.v.:  F(x, y) = Pr(xn ≤ x, xm ≤ y) = ∫_{−∞}^{y} ∫_{−∞}^{x} f(xn, xm) dxn dxm
Discrete r.v.:    F(x, y) = Pr(xn ≤ x, xm ≤ y) = Σ_j Σ_i p(xni, xmj)  over all i, j such that xni ≤ x, xmj ≤ y
Independence and stationarity
• Two or more r.v. are statistically independent if the realization of each
r.v. does not affect the probability distribution of the others. This happens if
and only if
f(xn, xm, …, xs) = f(xn) · f(xm) · … · f(xs)   or   p(xn, xm, …, xs) = p(xn) · p(xm) · … · p(xs)

• A random process consists of independent and identically distributed
(i.i.d.) variables if all of them are statistically independent and the pdf (or
pmf) of all the r.v. is the same

• In general, both individual and joint pdf/pmf may change over time.

• However, a random process is strict-sense stationary (SSS) if all joint
pdf/pmf do not change for any shift of the time origin:
fx  fx
n nk
k f xn independent of n
f xn xm  f xn  k xm  k k f xn x m depend only on n-m
... ...
5
Moments of a random variable
• Def: r-th order (raw) statistical moment of a r.v. extracted from a process

ξn(r) = E[xn^r] = ∫_{−∞}^{∞} xn^r f(xn) dxn   (continuous r.v.)
ξn(r) = Σ_i xni^r p(xni)                      (discrete r.v.)

E[·] is the expectation operator. The moments generally change with time,
but in SSS processes all moments of 1 r.v. are constant.

Special case: μn for r=1, the mean value. The mean of a r.v. provides the
long-run average of the variable, or the expected average over many
observations.
• Def: r-th order central statistical moment of a r.v. extracted from a process

Mn(r) = E[(xn − μn)^r] = ∫_{−∞}^{∞} (xn − μn)^r f(xn) dxn   (continuous r.v.)
Mn(r) = Σ_i (xni − μn)^r p(xni)                             (discrete r.v.)

Special case: Mn(2) = σn² for r=2, the variance. The variance of a r.v.
measures the spread, or variability, of the distribution.
Properties of expectation
1. The expectation is a linear operator. If α and β are real coefficients

E[α xn + β xm] = α E[xn] + β E[xm]

2. The expectation of a constant (i.e. deterministic) value k is k

E[k] = k
Example (power):

σn² = E[(xn − μn)²] = E[xn²] + μn² − 2 μn E[xn] = E[xn²] − μn²

3. If g(∙) is a deterministic function, then

E[g(xn)] = ∫_{−∞}^{∞} g(xn) f(xn) dxn
Joint moments of 2 random variables
• Def: Joint (raw) statistical moment of order r+k of two r.v. extracted from a
process

ξnm(rk) = E[xn^r xm^k] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xn^r xm^k f(xn, xm) dxn dxm   (continuous r.v.)
ξnm(rk) = Σ_j Σ_i xni^r xmj^k p(xni, xmj)                                    (discrete r.v.)

Special case: for r=k=1, the correlation of two r.v.: ξnm(11) = φ(xn, xm) = φx(n, m)

• Def: Joint central statistical moment of order r+k of two r.v. extracted from a
process

Mnm(rk) = E[(xn − μn)^r (xm − μm)^k]
        = ∫_{−∞}^{∞} ∫_{−∞}^{∞} (xn − μn)^r (xm − μm)^k f(xn, xm) dxn dxm   (continuous r.v.)
        = Σ_j Σ_i (xni − μn)^r (xmj − μm)^k p(xni, xmj)                      (discrete r.v.)

Special case: for r=k=1, the covariance of two r.v.: Mnm(11) = C(xn, xm) = Cx(n, m)
Uncorrelatedness
• For any pair of r.v. it can be easily proved from the definition of covariance
and from the properties of expectation that

Cx(n, m) = E[(xn − μn)(xm − μm)] = φx(n, m) − μn μm

• Def. The correlation coefficient of two r.v. extracted from a process is

ρnm = Cx(n, m) / (σn σm)
• Def. Two r.v. are uncorrelated if any of the following conditions is true:

Cx(n, m) = 0   or   ρnm = 0   or   φx(n, m) = E[xn] E[xm] = μn μm
• Theorem: if two r.v. are independent they are also uncorrelated
Proof: by independence, f(xn, xm) = f(xn) · f(xm), so

E[xn xm] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xn xm f(xn) f(xm) dxn dxm
         = ∫_{−∞}^{∞} xn f(xn) dxn · ∫_{−∞}^{∞} xm f(xm) dxm = E[xn] E[xm]

If two r.v. are uncorrelated they are not necessarily independent!
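The warning above can be checked numerically. A sketch where xm = xn² is completely determined by xn (hence dependent) yet uncorrelated with it, since E[xn·xn²] = E[xn³] = 0 = E[xn]E[xn²] for a zero-mean symmetric xn; the uniform distribution and sample size are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 200_000)  # zero-mean, symmetric r.v.
y = x**2                             # fully determined by x, hence dependent on it

# Sample covariance C = E[xy] - E[x]E[y]; should be close to 0
cov = np.mean(x * y) - np.mean(x) * np.mean(y)
print(cov)
```

The covariance vanishes even though knowing x fixes y exactly, illustrating that uncorrelatedness is strictly weaker than independence.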
Wide-sense stationarity
• A process is wide-sense stationary (WSS) if just the 1st order moment
and the covariance do not vary with respect to time, i.e.

μn = E[xn] = μ = const.
Cx(n, m) = Cx(n − m)
• Of course any SSS process is also WSS, but not vice versa.
Properties of the correlation sequence associated with a
real-valued WSS process
1. φx(m) = φx(−m)  ∀m
2. φx(0) = E[xn²]
3. |φx(m)| ≤ φx(0)
• The correlation sequence of r.v. extracted from the same process is
typically called autocorrelation. When instead the correlation sequence is
computed using r.v. from 2 different processes, it is called crosscorrelation.
Ergodicity
• A SSS process is ergodic if its statistical moments can be deduced from a
single, sufficiently long sample (realization) of the process

• The time averages computed on an individual realization converge
asymptotically to the ensemble averages:

lim_{N→∞} 1/(2N+1) Σ_{n=−N}^{N} xn = μ = E[xn]
lim_{N→∞} 1/(2N+1) Σ_{n=−N}^{N} xn² = E[xn²]
lim_{N→∞} 1/(2N+1) Σ_{n=−N}^{N} xn xn+m = φx(m) = E[xn xn+m]
…

[Figure: three realizations x1, x2, x3 of the same process plotted versus time;
time averages are taken along one realization, ensemble averages across
realizations at fixed instants t1, t2]

A process which is ergodic in the first and second moments only is
sometimes called ergodic in the wide sense
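The limits above can be illustrated with a synthetic i.i.d. Gaussian process (i.i.d. sequences are SSS and ergodic); the mean, standard deviation, and sample size below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, N = 2.0, 3.0, 500_000
x = rng.normal(mu, sigma, N)   # one long realization of an i.i.d. (hence ergodic) process

time_mean = x.mean()                    # time average -> E[xn] = mu = 2
time_power = np.mean(x**2)              # -> E[xn^2] = sigma^2 + mu^2 = 13
time_acorr1 = np.mean(x[:-1] * x[1:])   # -> phi_x(1) = E[xn]E[xn+1] = mu^2 = 4
                                        # (lag-1 product: samples are independent)
print(time_mean, time_power, time_acorr1)
```

Each time average computed on this single realization approaches the corresponding ensemble moment as N grows.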
Representation of WSS processes
in the frequency domain
• The DTFT of random sequences does not exist (they have infinite
energy), although the DTFT of a realization with a finite duration can be
computed.

• However, if we consider a WSS process, the r.v. xn and xn+m tend to be
uncorrelated when m → ∞, i.e.

lim_{m→∞} φx(m) = 0

• Given that |φx(m)| ≤ φx(0), the autocorrelation sequence typically has
finite energy and sometimes it is even absolutely summable. Therefore the
power spectral density (or power density spectrum) can be defined:

Φx(e^jω) = Σ_{m=−∞}^{∞} φx(m) e^{−jωm}

The PSD is an even, nonnegative function.
Meaning of Power Spectral Density
 
E[xn²] = φx(0) = 1/(2π) ∫_{−π}^{π} Φx(e^jω) dω

[Figure: a PSD Φx(e^jω) plotted over −π ≤ ω ≤ π]

• White noise: stationary random process with mean = 0 and constant


PSD
 
 x e j x m 
η F-1
x 0   x2

-π π ω m

• Subsequent samples of white noise are uncorrelated. In the time


domain this means the samples fluctuate quickly.

13
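A quick numerical check of φx(m) = η δ(m): the sample autocorrelation of a simulated white-noise realization is ≈ σx² at lag 0 and ≈ 0 at every other lag (σx and the sequence length are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)
sigma = 1.5
x = rng.normal(0.0, sigma, 100_000)   # zero-mean white-noise realization

def acorr(x, m):
    """Sample autocorrelation phi_x(m) at integer lag m >= 0."""
    return np.mean(x[:len(x) - m] * x[m:]) if m > 0 else np.mean(x * x)

phi0 = acorr(x, 0)   # -> sigma^2 = 2.25
phi1 = acorr(x, 1)   # -> ~0: subsequent samples are uncorrelated
print(phi0, phi1)
```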
Transformation of WSS processes
through an LTI system - 1
[Diagram]   x(n) → LTI system H(e^jω) → y(n)

Realization of a WSS process with mean value μx and correlation φx(m) at
the input; realization of a WSS process with mean value μy and correlation
φy(m) at the output
Proof.

μy = E[yn] = E[ Σ_k h(k) x(n−k) ] = Σ_k h(k) E[x(n−k)] = μx Σ_k h(k) = μx H(e^j0)

constant and independent of time
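The result μy = μx H(e^j0) = μx Σ_k h(k) can be verified on a simulated realization; the 3-tap filter and the input statistics below are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
h = np.array([0.5, 0.3, 0.2])               # hypothetical FIR impulse response
mu_x = 4.0
x = mu_x + rng.normal(0.0, 1.0, 200_000)    # WSS input with mean mu_x

y = np.convolve(x, h, mode='valid')         # one realization of the output process

# mu_y = mu_x * sum_k h(k) = mu_x * H(e^{j0})
mu_y_theory = mu_x * h.sum()
mu_y_est = y.mean()
print(mu_y_theory, mu_y_est)
```

Since Σ_k h(k) = 1 for this filter, the output mean equals the input mean, as the formula predicts.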
Transformation of WSS processes
through an LTI system - 2
   
 y n , n  m   Eyn  yn  m   E 
k   r  
h k h r  x n  k  x n  m  r  

   
   hk hr Exn  k  xn  m  r    hk hr  k  m  r 
k   r   k   r  
x

Assuming that l  r  k
  
 y n , n  m    m  l   hk hl  k    m  l cl    m* cm
x x x
}
l   k   l  
Aperiodic autocorrelation of h(n)
(not to be confused with statistical autocorrelation)

• The final convolution depends just on the time difference between the
realizations of the output r.v. Therefore, y(n) is also WSS.

Transformation of WSS processes
through an LTI system - 3
 y m  x m* cm  x m* hm* h m

F
y e     e  H e  H e   H e 
j
x
j j * j j
2
 
  x e j
Example

[Figure: an ideal bandpass filter |H(e^jω)| = 1 over bands of width Δω
centered at ±ω0, applied to an input PSD Φx(e^jω)]

E[yn²] = φy(0) = 1/(2π) ∫_{−π}^{π} Φy(e^jω) dω = 1/(2π) ∫_{−π}^{π} Φx(e^jω) |H(e^jω)|² dω ≈ (Δω/π) Φx(e^jω0)

Note: If the input process is white noise, the output is colored, as the r.v.
resulting from the filtering operation are correlated.
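The note above can be checked numerically: feeding white noise through a (made-up) 3-tap FIR filter gives φy(m) = σx² c(m), so the output power is σx² c(0) and adjacent output samples become correlated, since φy(1) = σx² c(1) ≠ 0:

```python
import numpy as np

rng = np.random.default_rng(4)
h = np.array([0.5, 0.3, 0.2])             # hypothetical FIR filter
sigma = 1.0
x = rng.normal(0.0, sigma, 300_000)       # white-noise input: phi_x(m) = sigma^2 delta(m)
y = np.convolve(x, h, mode='valid')

# phi_y(m) = sigma^2 * c(m), with c(l) = sum_k h(k) h(k+l)
# (aperiodic autocorrelation of h)
c0 = np.sum(h * h)                        # c(0) = 0.25 + 0.09 + 0.04 = 0.38
c1 = np.sum(h[:-1] * h[1:])               # c(1) = 0.15 + 0.06 = 0.21

phi_y0 = np.mean(y * y)                   # -> sigma^2 * c(0)
phi_y1 = np.mean(y[:-1] * y[1:])          # -> sigma^2 * c(1): output is colored
print(phi_y0, phi_y1)
```

The nonzero lag-1 autocorrelation confirms that the filter has colored the white input.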
