A random variable is a number chosen at random as the outcome of an experiment.
A random variable may be real or complex, and may be discrete or continuous.
In signal processing, the random variables encountered are most often real and discrete.
We can characterize a random variable by its probability distribution or by its probability density function (pdf).
The distribution function for a random variable y is the probability that y does not exceed some value u,

$$F_y(u) = P(y \le u)$$

and

$$P(u < y \le v) = F_y(v) - F_y(u)$$
The probability density function is the derivative of the distribution:

$$f_y(u) = \frac{d}{du} F_y(u)$$

and

$$P(u < y \le v) = \int_u^v f_y(y)\,dy$$

$$F_y(\infty) = 1, \qquad \int_{-\infty}^{+\infty} f_y(y)\,dy = 1$$
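As a quick numerical illustration (a minimal NumPy sketch, not from the original slides; the normal distribution is an arbitrary choice), the empirical distribution function of a large sample behaves as these identities predict:

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(size=100_000)       # samples of a random variable y (choice is illustrative)

def F(u, samples):
    """Empirical distribution function: fraction of samples with y <= u."""
    return np.mean(samples <= u)

u, v = -0.5, 1.0
p_direct = np.mean((y > u) & (y <= v))   # P(u < y <= v) estimated directly
p_from_F = F(v, y) - F(u, y)             # the same probability via F(v) - F(u)
print(p_direct, p_from_F)                # the two estimates agree
```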
We can also characterize a random variable by
its statistics.
The expected value of g(x) is written E{g(x)} or
<g(x)> and defined as
Continuous random variable:

$$\langle g(x)\rangle = \int_{-\infty}^{+\infty} g(x)\, f(x)\,dx$$

Discrete random variable:

$$\langle g(x)\rangle = \sum_x g(x)\, p(x)$$
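For instance, a small sketch of the discrete case (the pmf values are hypothetical, chosen for illustration):

```python
import numpy as np

# A hypothetical discrete pmf p(x) on a few values of x
x = np.array([-1.0, 0.0, 1.0, 2.0])
p = np.array([0.1, 0.4, 0.3, 0.2])      # probabilities sum to 1

def expect(g, x, p):
    """E{g(x)} = sum over x of g(x) p(x)."""
    return np.sum(g(x) * p)

print(expect(lambda t: t, x, p))        # the mean of x
print(expect(lambda t: t**2, x, p))     # the second moment of x
```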
The statistics of greatest interest are the moments of p(x).
The kth moment of p(x) is the expected value of $x^k$.
For a discrete random variable:

$$m_k = \langle x^k \rangle = \sum_x x^k\, p(x)$$
The first moment, $m_1$, is the mean of x.

Continuous:

$$\bar{x} = \langle x \rangle = \int_{-\infty}^{+\infty} x\, f(x)\,dx$$

Discrete:

$$\bar{x} = \langle x \rangle = \sum_x x\, p(x)$$

The second central moment, also known as the variance of p(x), is given by

$$\sigma^2 = \langle (x - \bar{x})^2 \rangle = \sum_x (x - \bar{x})^2\, p(x) = m_2 - \bar{x}^2$$
To estimate the statistics of a random variable,
we repeat the experiment which generates the
variable a large number of times.
If the experiment is run N times, then each value x will occur approximately Np(x) times; thus
$$\bar{x} \approx \frac{1}{N}\sum_{i=1}^{N} x_i, \qquad m_k \approx \frac{1}{N}\sum_{i=1}^{N} x_i^k$$
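A minimal sketch of this estimator (NumPy assumed; the exponential distribution is an arbitrary stand-in for "the experiment"):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1_000_000
x = rng.exponential(scale=2.0, size=N)  # N repetitions of the experiment

m1 = np.sum(x) / N            # (1/N) sum of x_i   -> estimates the mean
m2 = np.sum(x**2) / N         # (1/N) sum of x_i^2 -> estimates the 2nd moment
print(m1, m2, m2 - m1**2)     # mean, 2nd moment, and variance via m2 - mean^2
```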
A random variable has a uniform density on the interval (a, b) if:

$$f_x(x) = \begin{cases} 1/(b-a), & a \le x \le b \\ 0, & \text{otherwise} \end{cases}$$

$$F_x(x) = \begin{cases} 0, & x < a \\ (x-a)/(b-a), & a \le x \le b \\ 1, & x > b \end{cases}$$

$$\sigma^2 = \tfrac{1}{12}(b-a)^2$$
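A one-line numerical check of the variance formula (a sketch; the endpoints a and b are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
a, b = 1.0, 5.0
x = rng.uniform(a, b, size=1_000_000)     # samples with uniform density on (a, b)
print(np.var(x), (b - a)**2 / 12)         # sample variance vs (b - a)^2 / 12
```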
The Gaussian, or normal, density function is given by:

$$n(x; \bar{x}, \sigma) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x-\bar{x})^2 / 2\sigma^2}$$
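As a sanity check, the formula can be coded directly and integrated numerically (a sketch; the mean and sigma values are arbitrary):

```python
import numpy as np

def n(x, mean, sigma):
    """Gaussian density n(x; mean, sigma), coded from the formula above."""
    return np.exp(-(x - mean)**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

xs = np.linspace(-20, 20, 40001)
print(np.trapz(n(xs, 1.0, 2.0), xs))   # ~1.0: the density integrates to one
```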
If two random variables x and y are to be
considered together, they can be described in
terms of their joint probability density f(x, y) or,
for discrete variables, p(x, y).

Two random variables are independent if

$$p(x, y) = p(x)\, p(y)$$
Given a function g(x, y), its expected value is
defined as:
Continuous:

$$\langle g(x,y)\rangle = \int\!\!\int g(x,y)\, f(x,y)\,dx\,dy$$

Discrete:

$$\langle g(x,y)\rangle = \sum_{x,y} g(x,y)\, p(x,y)$$

The joint moments for two discrete random variables are:

$$m_{ij} = \sum_{x,y} x^i y^j\, p(x,y)$$
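A short sketch of the joint-moment sum for a small, hypothetical joint pmf:

```python
import numpy as np

# A hypothetical joint pmf p(x, y) over x in {0, 1} and y in {0, 1, 2}
xs = np.array([0.0, 1.0])
ys = np.array([0.0, 1.0, 2.0])
p = np.array([[0.10, 0.20, 0.10],
              [0.15, 0.25, 0.20]])       # entries sum to 1

def m(i, j):
    """Joint moment m_ij = sum over x, y of x^i y^j p(x, y)."""
    return np.sum(np.outer(xs**i, ys**j) * p)

print(m(1, 0), m(0, 1), m(1, 1))         # E[x], E[y], E[xy]
```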
Moments of random variables X and Y:

$$E[x^n y^n] = \int_{-\infty}^{+\infty}\!\!\int_{-\infty}^{+\infty} x^n y^n\, f_{X,Y}(x,y)\,dx\,dy$$

$$E[X] = \int_{-\infty}^{+\infty}\!\!\int_{-\infty}^{+\infty} x\, f_{X,Y}(x,y)\,dx\,dy = \int_{-\infty}^{+\infty} x \left[\int_{-\infty}^{+\infty} f_{X,Y}(x,y)\,dy\right] dx = \int_{-\infty}^{+\infty} x\, f_X(x)\,dx$$

$$E[Y] = \int_{-\infty}^{+\infty} y\, f_Y(y)\,dy$$

If X and Y are independent, then $f_{X,Y}(x,y) = f_X(x)\, f_Y(y)$, and in this case

$$E[XY] = E[X]\cdot E[Y]$$
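The factorization of E[XY] is easy to observe numerically (a sketch; the two distributions are arbitrary and drawn independently):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000
x = rng.normal(2.0, 1.0, size=n)     # X and Y generated independently
y = rng.uniform(0.0, 1.0, size=n)

print(np.mean(x * y))                # E[XY]
print(np.mean(x) * np.mean(y))       # E[X] . E[Y]: matches for independent X, Y
```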
Two random variables x and y are jointly Gaussian if their density function is:

$$n(x,y) = \frac{1}{2\pi\sigma_x\sigma_y\sqrt{1-r^2}} \exp\!\left[-\frac{1}{2(1-r^2)}\left(\frac{x^2}{\sigma_x^2} - \frac{2\,r\,x\,y}{\sigma_x\sigma_y} + \frac{y^2}{\sigma_y^2}\right)\right]$$

where

$$r = r_{xy} = \frac{\sigma_{xy}}{\sigma_x\sigma_y}$$
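A sketch that draws jointly Gaussian pairs and recovers r from the samples (the zero means and the particular sigma_x, sigma_y, r are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)
sx, sy, r = 2.0, 1.0, 0.6
cov = [[sx**2,       r * sx * sy],
       [r * sx * sy, sy**2      ]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=500_000).T

sigma_xy = np.mean(x * y)                # covariance (means are zero here)
print(sigma_xy / (x.std() * y.std()))    # ~0.6, i.e. r = sigma_xy / (sigma_x sigma_y)
```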
A random function is one arising as the outcome of an experiment.
Random functions need not be functions of time, but in all cases of interest to us they will be.
A discrete stochastic process is characterized by a family of probability densities of the form

$$p(x_1, x_2, x_3, \ldots, x_n, t_1, t_2, t_3, \ldots, t_n)$$
If the individual values of the random signal are independent, then

$$p(x_1, \ldots, x_n, t_1, \ldots, t_n) = p(x_1, t_1)\, p(x_2, t_2) \cdots p(x_n, t_n)$$

If these individual probability densities are all the same, then we have a sequence of independent, identically distributed (i.i.d.) samples.
Mean and autocorrelation can be determined in two ways:
1. The experiment can be repeated many times and the average taken over all these functions. Such an average is called an ensemble average.
2. Take any one of these functions as representative of the ensemble and find the average from a number of samples of this one function. This is called a time average.
If the time average and ensemble average of a
random function are the same, it is said to be
ergodic.

A random function is said to be stationary if its
statistics do not change as a function of time.

Any ergodic function is also stationary.
For a stationary signal we have:

$$\overline{x(t)} = \bar{x}$$

$$p(x_1, x_2, t_1, t_2) = p(x_1, x_2, \tau), \qquad \tau = t_2 - t_1$$

and the autocorrelation function is:

$$r(\tau) = \sum_{x_1, x_2} x_1 x_2\, p(x_1, x_2, \tau)$$
When x(t) is ergodic, its mean and autocorrelation are:

$$\bar{x} = \lim_{N\to\infty} \frac{1}{2N+1} \sum_{t=-N}^{N} x(t)$$

$$r(\tau) = \langle x(t)\, x(t-\tau)\rangle = \lim_{N\to\infty} \frac{1}{2N+1} \sum_{t=-N}^{N} x(t)\, x(t-\tau)$$
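A sketch of these time averages over one finite realization (NumPy assumed; i.i.d. Gaussian samples stand in for the ergodic process, and the infinite sum is truncated to the record length):

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(size=100_000)      # one realization of an (assumed ergodic) process

def r(tau, x):
    """Time-average autocorrelation <x(t) x(t - tau)> over one realization."""
    tau = abs(tau)
    return np.mean(x[tau:] * x[:len(x) - tau])

print(np.mean(x))                 # time-average estimate of the mean (~0)
print(r(0, x), r(1, x))           # r(0) ~ variance; r(1) ~ 0 for these samples
```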
The cross-correlation of two ergodic random functions is:

$$r_{xy}(\tau) = \langle x(t)\, y(t-\tau)\rangle = \lim_{N\to\infty} \frac{1}{2N+1} \sum_{t=-N}^{N} x(t)\, y(t-\tau)$$

The subscript xy indicates a cross-correlation.
The Fourier transform of $r(\tau)$ (the autocorrelation function of an ergodic random function) is called the power spectral density of x(t):

$$S(\omega) = \sum_{\tau} r(\tau)\, e^{-j\omega\tau}$$

The cross-spectral density of two ergodic random functions is:

$$S_{xy}(\omega) = \sum_{\tau} r_{xy}(\tau)\, e^{-j\omega\tau}$$
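A sketch evaluating this sum directly for a hypothetical autocorrelation r(tau) = 0.8^|tau| (the decay rate is arbitrary):

```python
import numpy as np

taus = np.arange(-50, 51)
r = 0.8 ** np.abs(taus)            # hypothetical autocorrelation sequence

omegas = np.linspace(-np.pi, np.pi, 201)
# S(omega) = sum over tau of r(tau) e^{-j omega tau}
S = np.array([np.sum(r * np.exp(-1j * w * taus)) for w in omegas])
print(abs(S.imag).max())           # ~0: a real, even r(tau) gives a real spectrum
print(S.real.min() > 0)            # True: the power spectral density is nonnegative
```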
For an ergodic signal x(t), $r(\tau)$ can be written as:

$$r(\tau) = x(\tau) * x(-\tau)$$

Then, from elementary Fourier transform properties,

$$S(\omega) = X(\omega)\, X(-\omega) = X(\omega)\, X^*(\omega) = |X(\omega)|^2$$
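For finite records the same identity holds exactly in circular form, which a short FFT sketch can confirm (the record length and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(6)
N = 1024
x = rng.normal(size=N)

S = np.abs(np.fft.fft(x)) ** 2            # S(omega) = |X(omega)|^2 on the DFT grid

# The inverse transform of |X|^2 is the circular autocorrelation of x
r_from_S = np.fft.ifft(S).real
r_direct = np.array([np.sum(x * np.roll(x, -k)) for k in range(N)])
print(np.allclose(r_from_S, r_direct))    # True
```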
If all values of a random signal are uncorrelated,

$$r(\tau) = \sigma^2\, \delta(\tau)$$

then this random function is called white noise. The power spectrum of white noise is constant,

$$S(\omega) = \sigma^2$$

White noise is a mixture of all frequencies.
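Both properties are easy to check on generated samples (a sketch; Gaussian white noise with an arbitrary sigma):

```python
import numpy as np

rng = np.random.default_rng(7)
sigma = 1.5
x = rng.normal(0.0, sigma, size=200_000)  # white (uncorrelated) noise samples

print(np.mean(x * x), sigma**2)           # r(0) ~ sigma^2
print(np.mean(x[1:] * x[:-1]))            # r(1) ~ 0: distinct samples uncorrelated
```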
