Random Variables

Probability:
Symbol: P(⋅)
Examples:
P(X ≥ 5)
P(−1 < X < 1)
P(|X| > 1) = P(X < −1) + P(X > 1)
Always true: 0 ≤ P(A) ≤ 1
Example:
Two balls are drawn in succession without replacement from a box
containing 4 yellow balls and 3 green balls.
Example:
A fair coin was tossed twice. S = {TT, TH, HT, HH}.
EEM 2046 Engineering Mathematics IV Random Variables and Stochastic Processes
A random variable X is a function from the sample space S to the real numbers; R_X denotes the range of X.

Example:
Two balls are drawn in succession without replacement from a box
containing 4 yellow balls and 3 green balls.
Let X = "number of yellow balls".
S = {YY, YG, GY, GG}, with
X(YY) = 2, X(YG) = 1, X(GY) = 1, X(GG) = 0.
Then R_X = {0, 1, 2}.
Example:
A fair coin was tossed twice. S = {TT, TH, HT, HH}.
Let X = "number of heads".
R X = {0, 1, 2}
Random Variables
A random variable is discrete if its range R_X is a finite or countable set, as in the two examples above where R_X = {0, 1, 2}. A random variable is continuous if its range is an interval of the real line.
Probability mass function of a discrete random variable X: f_X(x). Similarly, we can have f_Y(y), f_Z(z), etc.
Properties:
(1) f_X(x) ≥ 0
(2) Σ_x f_X(x) = 1
(3) P(X = x) = f_X(x)

Example:
Given f_X(x) = kx, x = 1, 2, 3. Find k.
From Property (2):
Σ_{x=1}^{3} f_X(x) = 1
k·1 + k·2 + k·3 = 1
6k = 1
k = 1/6

Given f_X(x) = x/6, x = 1, 2, 3.
(i) Find P(X = 1).
P(X = 1) = f_X(1) = 1/6
(ii) Find P(X < 3).
P(X < 3) = Σ_{x<3} f_X(x) = f_X(1) + f_X(2) = 1/6 + 2/6 = 1/2
(iii) Find P(X = 4).
P(X = 4) = 0
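The pmf above can be checked numerically; a minimal sketch using exact fractions:

```python
from fractions import Fraction

# pmf f_X(x) = x/6 on {1, 2, 3}, as derived above
f = {x: Fraction(x, 6) for x in (1, 2, 3)}

assert sum(f.values()) == 1                      # Property (2)
p_eq_1 = f[1]                                    # P(X = 1) = 1/6
p_lt_3 = sum(p for x, p in f.items() if x < 3)   # P(X < 3) = 1/2
p_eq_4 = f.get(4, Fraction(0))                   # P(X = 4) = 0

print(p_eq_1, p_lt_3, p_eq_4)  # 1/6 1/2 0
```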
Example:
A fair coin was tossed twice. S = {TT, TH, HT, HH}.
Let X = "number of heads", R_X = {0, 1, 2}.
f_X(0) = P(X = 0) = P({TT}) = 1/4
f_X(1) = P(X = 1) = P({TH, HT}) = 2/4 = 1/2
f_X(2) = P(X = 2) = P({HH}) = 1/4
(The pmf plot shows spikes of heights 1/4, 1/2, 1/4 at x = 0, 1, 2.)
Example:
Determine the value c so that the function f(x) = c(x² + 4) for x = 0, 1, 2, 3 is a probability mass function of the discrete random variable X.
Solution:
From Property (2): Σ_x f_X(x) = 1
Σ_{x=0}^{3} c(x² + 4) = 1
4c + 5c + 8c + 13c = 1
30c = 1
c = 1/30
Cumulative distribution function (discrete case):
F_X(x) = P(X ≤ x) = Σ_{t ≤ x} f_X(t), for −∞ < x < ∞
Example:
A fair coin was tossed twice. S = {TT, TH, HT, HH}.
Let X = "number of heads". We know that R_X = {0, 1, 2}.
If B is the event that "X ≤ 1", then find
(a) P(B)
(b) F_X(x)
(c) Sketch the graph of the cumulative distribution function F_X(x).
Solution:
(a) P(B) = P(X ≤ 1) = Σ_{t=0}^{1} f_X(t) = f_X(0) + f_X(1) = 1/4 + 2/4 = 3/4
(b) F_X(x) = 0 for x < 0; 1/4 for 0 ≤ x < 1; 3/4 for 1 ≤ x < 2; 1 for x ≥ 2.
(c) The graph is a step function with jumps at x = 0, 1, 2, reaching heights 1/4, 3/4 and 1.
Probability density function of a continuous random variable X: f_X(x). Similarly, we can have f_Y(y), f_Z(z), etc.
For a continuous random variable, P(X = 6) = 0. WHY? The probability of any single point is the area under the density over an interval of zero width.
Cumulative distribution function (continuous case):
F_X(x) = P(X ≤ x) = ∫_{−∞}^{x} f_X(t) dt, for −∞ < x < ∞
Example:
Given f_X(x) = 0.25, 0 < x < 4. Find F_X(x).
Answer:
F_X(x) = 0 for x ≤ 0; 0.25x for 0 < x ≤ 4; 1 for x > 4.
WHY?
Case 2 (the value x falls in the region 0 < x ≤ 4):
F_X(x) = P(X ≤ x) = ∫_0^x 0.25 dt = 0.25x
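The piecewise answer can be written and spot-checked directly; a minimal sketch:

```python
def cdf(x):
    # F_X for the uniform density f_X(x) = 0.25 on (0, 4)
    if x <= 0:
        return 0.0
    if x <= 4:
        return 0.25 * x   # integral of 0.25 from 0 to x
    return 1.0

assert cdf(-1) == 0.0   # below the support
assert cdf(2) == 0.5    # halfway through the support
assert cdf(5) == 1.0    # above the support
```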
Joint random variables: a pair (X1, X2) maps each outcome to a point (x1, x2) in the plane.
Example:
Two balls are drawn in succession without replacement from a box
containing 4 yellow balls and 3 green balls.
Properties:
(1) f_XY(x, y) ≥ 0 for all (x, y)
(2) Σ_x Σ_y f_XY(x, y) = 1
(3) P(X = x, Y = y) = f_XY(x, y)

Example:
Given f_XY(x, y) = kxy, (x, y) ∈ {(1, 2), (2, 1)}. Find k.
Σ_x Σ_y kxy = 1 (summing over the two support points)
2k + 2k = 1
k = 1/4

If instead we sum over the full grid, Σ_{y=1}^{2} Σ_{x=1}^{2} kxy = 1 gives 9k = 1, so k = 1/9. Different. WHY? Because the support contains only the points (1, 2) and (2, 1), not all four grid points.
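The two normalisations above can be compared in code; a minimal sketch:

```python
from fractions import Fraction

support = [(1, 2), (2, 1)]                           # the actual support
s_support = sum(x * y for x, y in support)           # 2 + 2 = 4
s_grid = sum(x * y for x in (1, 2) for y in (1, 2))  # 1 + 2 + 2 + 4 = 9

k_support = Fraction(1, s_support)  # correct: sum only over the support
k_grid = Fraction(1, s_grid)        # wrong: sums over points with zero mass
print(k_support, k_grid)  # 1/4 1/9
```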
Example:
Let X and Y be two discrete random variables with joint probability
distribution f_XY(x, y) = (x + y)/30, for x = 0, 1, 2, 3; y = 0, 1, 2. Find F_XY(1, 2).
Solution:
F_XY(1, 2) = P[X ≤ 1, Y ≤ 2]
= Σ_{y=−∞}^{2} Σ_{x=−∞}^{1} f_XY(x, y)
= Σ_{y=0}^{2} Σ_{x=0}^{1} (x + y)/30
= Σ_{y=0}^{2} [y/30 + (1 + y)/30]
= (0 + 1 + 1 + 2 + 2 + 3)/30
= 9/30
= 3/10
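The double sum above is easy to reproduce exactly; a minimal sketch:

```python
from fractions import Fraction

# joint pmf f_XY(x, y) = (x + y)/30 on x = 0..3, y = 0..2
f = lambda x, y: Fraction(x + y, 30)

# F_XY(1, 2) = sum over x <= 1 and y <= 2
F_12 = sum(f(x, y) for x in (0, 1) for y in (0, 1, 2))
print(F_12)  # 3/10
```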
Marginal probability distribution: find f_X(x) or f_Y(y) from f_XY(x, y).
f_X(x) = P(X = x) = Σ_y f_XY(x, y)
f_Y(y) = P(Y = y) = Σ_x f_XY(x, y)

Example:
Given f_XY(x, y) = xy/4, (x, y) ∈ {(1, 2), (2, 1)}. Find the marginal probability distribution of X alone.

Solution 1:
f_X(1) = Σ_{y=2} xy/4 = 2/4
f_X(2) = Σ_{y=1} xy/4 = 2/4
f_X(x) = 1/2, x = 1, 2

Solution 2:
f_X(x) = Σ_{y=1}^{2} xy/4 = x/4 + 2x/4 = 3x/4, x = 1, 2

Which one is correct? Solution 1 or Solution 2?
Solution 1: for each x, the sum runs only over the y values that are actually in the support.
Conditional Probability:
f_{Y|X}(y|x) = f_XY(x, y) / f_X(x)

Example:
Given f_XY(x, y) = xy/4, (x, y) ∈ {(1, 2), (2, 1)}. Find the conditional probability of Y given X = 1.
f_{Y|X}(y|1) = f_XY(1, y) / f_X(1) = (y/4) / (1/2) = y/2, y = 2
Example:
Given a joint density function f_XY(x, y) = 1, 0 < x < 1, 0 < y < 1.
The surface f_XY(x, y) = 1 sits at height 1 over the unit square 0 < x < 1, 0 < y < 1 in the (x, y)-plane.
IMPORTANT:
For discrete random variables: f_XY(x, y) = P(X = x, Y = y). For continuous random variables, a probability is a volume under the density surface, not a value of f_XY.

Example:
Given a joint density function f_XY(x, y) = 1, 0 < x < 1, 0 < y < 1.
Find P(0 < X < 1, 0 < Y < 1).
P(0 < X < 1, 0 < Y < 1) is the volume bounded by the surface f_XY(x, y) and the area 0 < x < 1, 0 < y < 1. Here that volume is a unit cube, so
⇒ P(0 < X < 1, 0 < Y < 1) = 1
Example:
Given a joint density function f_XY(x, y) = 1, 0 < x < 1, 0 < y < 1.
Find P(0 < X < 0.5, 0 < Y < 0.5).
P(0 < X < 0.5, 0 < Y < 0.5) = ∫_0^{0.5} ∫_0^{0.5} 1 dx dy = 0.25

Example:
Given a joint density function f_XY(x, y) = 2, 0 < x ≤ y < 1.
Find P(0 < X < 0.5, 0 < Y < 0.5).
P(0 < X < 0.5, 0 < Y < 0.5) = P(0 < X ≤ y, 0 < Y < 0.5) = ∫_0^{0.5} ∫_0^{y} 2 dx dy = 0.25
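The second probability (density 2 on the triangle 0 < x ≤ y < 1) can be approximated with a midpoint Riemann sum; a minimal numerical sketch:

```python
# Midpoint Riemann sum of f_XY(x, y) = 2 over the region
# {0 < x <= y, 0 < y < 0.5} inside the triangular support.
n = 400
h = 0.5 / n
total = 0.0
for i in range(n):
    y = (i + 0.5) * h
    for j in range(n):
        x = (j + 0.5) * h
        if x <= y:          # stay inside the support x <= y
            total += 2 * h * h

assert abs(total - 0.25) < 1e-3  # matches the exact answer 0.25
```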
Joint cumulative distribution function. Symbol: F_XY(x, y)

Example:
Let X and Y be two continuous random variables with joint probability distribution
f_XY(x, y) = (3/2)(x² + y²) for 0 ≤ x < 1, 0 ≤ y < 1; 0 elsewhere.
Find F_XY(1, 1/2).
Solution:
F_XY(1, 1/2) = P(X ≤ 1, Y ≤ 1/2)
= ∫_{−∞}^{1/2} ∫_{−∞}^{1} f_XY(x, y) dx dy
= ∫_0^{1/2} ∫_0^1 (3/2)(x² + y²) dx dy
= ∫_0^{1/2} (3/2)[x³/3 + x y²]_{x=0}^{x=1} dy
= ∫_0^{1/2} (3/2)(1/3 + y²) dy
= (3/2)[y/3 + y³/3]_0^{1/2}
= 5/16
Marginal density: find f_X(x) or f_Y(y) from f_XY(x, y).
f_X(x) = ∫_{−∞}^{∞} f_XY(x, y) dy  or  f_Y(y) = ∫_{−∞}^{∞} f_XY(x, y) dx

Example:
Given a joint density function f_XY(x, y) = 1, 0 < x < 1, 0 < y < 1. Find the marginal probability density function of X alone.
f_X(x) = ∫_{−∞}^{∞} f_XY(x, y) dy = ∫_0^1 1 dy = 1, 0 < x < 1
Conditional Probability:
Example:
f_XY(x, y) = (32/3)(x + y − 2xy), 0 ≤ x ≤ 1/2, 0 ≤ y ≤ 1/2; 0 elsewhere.
Probability
Sample space
Random variable
- Discrete
- Continuous
Probability mass function
Probability density function
Cumulative distribution function
Marginal probability mass function
Marginal probability density function
Conditional probability
Independence
Recall from Math 1: the events A and B are said to be independent if and only if P(A|B) = P(A).

A card is drawn at random from a deck of 52, and its face value and suit are noted. The event that an ace was drawn is denoted by A, and the event that a club was drawn is denoted by B. There are four aces, so P(A) = 4/52 = 1/13, and there are 13 clubs, so P(B) = 13/52 = 1/4. A ∩ B denotes the event that the ace of clubs was drawn, and since there is only one such card in the deck, P(A ∩ B) = 1/52 = (1/13) × (1/4) = P(A) × P(B).
Thus,
P(A|B) = P(A ∩ B)/P(B) = (1/52)/(1/4) = 1/13 = P(A).
In other words, knowing that the card selected was a club did not change the probability that the card selected was an ace. We say that the event A is independent of the event B.
Example (discrete case):
Let
f_{X1X2X3}(x1, x2, x3) = 1/4 for (x1, x2, x3) ∈ {(−1, −1, 1), (1, 1, 1), (−1, 1, −1), (1, −1, −1)}; 0 elsewhere.
(b) We have
f_{X1}(−1) = f_{X2}(−1) = f_{X3}(−1) = 1/2, f_{X1}(1) = f_{X2}(1) = f_{X3}(1) = 1/2.
The marginal probability distribution of Xi is
f_{Xi}(xi) = 1/2, xi = −1, 1; 0 elsewhere.
Question (a):
If X 1 , X 2 and X 3 are independent, does it imply that X i and X j are
independent where i ≠ j ; i, j = 1, 2, 3 ?
Yes.
In the discrete case, for X1 and X2 we have
f_{X1X2}(x1, x2) = Σ_{x3} f_{X1X2X3}(x1, x2, x3) = Σ_{x3} f_{X1}(x1) f_{X2}(x2) f_{X3}(x3) = f_{X1}(x1) f_{X2}(x2).
Question (b):
If X i and X j are independent where i ≠ j ; i, j = 1, 2, 3 , does it imply that
X 1 , X 2 and X 3 are independent?
No.
Refer to previous example (discrete case)
Transformation of Variables
Given f_X(x) and Y = g(X), find f_Y(y).

Example:
Let X be a random variable with the probability mass function
f_X(x) = (1/16)(2x + 1), x = 0, 1, 2, 3; 0 elsewhere,
and Y = 2X. Find f_Y(y).
1. The transformation maps R_X = {0, 1, 2, 3} to R_Y = {0, 2, 4, 6}.
2. y = 2x is one-to-one.
3. The inverse function is x = y/2.
4. f_Y(y) = P(Y = y) = P(2X = y) = P(X = y/2)
= (1/16)(2(y/2) + 1) = (1/16)(y + 1), y = 0, 2, 4, 6; 0 elsewhere.
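In the discrete case the transformation simply relabels the support; a minimal sketch:

```python
from fractions import Fraction

# pmf of X: f_X(x) = (2x + 1)/16 on {0, 1, 2, 3}
f_X = {x: Fraction(2 * x + 1, 16) for x in range(4)}

# Y = 2X: push the mass at each x to y = 2x
f_Y = {2 * x: p for x, p in f_X.items()}

# matches the derived formula f_Y(y) = (y + 1)/16 on {0, 2, 4, 6}
assert f_Y == {y: Fraction(y + 1, 16) for y in (0, 2, 4, 6)}
assert sum(f_Y.values()) == 1
```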
Example:
Let X be a geometric random variable with probability distribution
f_X(x) = (3/5)(2/5)^{x−1}, x = 1, 2, 3, …
Find the probability distribution function of the random variable Y = 2X².
Solution:
1. The transformation maps the space R_X = {1, 2, …} to R_Y = {2, 8, …}.
2. Since the values of X are all positive, the transformation y = 2x² defines a one-to-one correspondence between the values of X and the values of Y.
3. The inverse function of y = 2x² is x = √(y/2).
Hence
g_Y(y) = f_X(√(y/2)) = (3/5)(2/5)^{√(y/2)−1}, y = 2, 8, 18, …; 0 elsewhere.
Discrete case: from f_X(x) to f_Y(y):
1. The transformation Y = g(X) maps the space R_X to R_Y. (Find R_Y.)
2. Make sure that the transformation Y = g(X) sets up a one-to-one correspondence between the points of R_X and those of R_Y.
3. Find the inverse function x = w(y).
4. Replace x in f_X(x) by w(y). Finally, form the function f_Y(y).

Continuous case: from f_X(x) to f_Y(y):
1. The transformation Y = g(X) maps the space R_X to R_Y. (Find R_Y.)
2. Make sure that the transformation Y = g(X) sets up a one-to-one correspondence between the points of R_X and those of R_Y.
3. Find the inverse function x = w(y).
4. From the inverse function, find the Jacobian J = dx/dy.
5. Replace x in f_X(x) by w(y), then multiply the function by the modulus of the Jacobian, |J|. Finally, form the function f_Y(y).
Example:
Let X be a continuous random variable with probability distribution function
f_X(x) = (1/12)(1 + x²), 0 < x < 3; 0 elsewhere.
Find the probability distribution function of the random variable Y = X².
Solution:
1. The one-to-one transformation y = x² maps the space {x | 0 < x < 3} onto the space {y | 0 < y < 9}.
3. The inverse of y = x² is x = √y.
4. We obtain the Jacobian J = dx/dy = 1/(2√y).
5. Therefore,
f_Y(y) = f_X(√y)|J| = (1/12)(1 + (√y)²) · 1/(2√y) = (1 + y)/(24√y), 0 < y < 9.
Example:
f_X(x) = e^{−x}, x > 0; 0 elsewhere.
Find the probability distribution of the random variable Y = e^{−X}.
Solution:
1. The transformation maps the space R_X = {x | x > 0} to R_Y = {y | 0 < y < 1}.
3. The inverse function is x = −ln y.
4. Jacobian: J = dx/dy = −1/y.
5. f_Y(y) = f_X(−ln y)|J| = e^{ln y} · (1/y) = y · (1/y) = 1, 0 < y < 1; 0 elsewhere.
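The result says Y = e^{−X} is Uniform(0, 1) when X is Exponential(1); a quick simulation sketch supports this:

```python
import math
import random

random.seed(0)
n = 100_000
# X ~ Exponential(1), Y = exp(-X)
ys = [math.exp(-random.expovariate(1.0)) for _ in range(n)]

# If Y ~ Uniform(0, 1): mean ~ 1/2 and P(Y < 0.3) ~ 0.3
mean = sum(ys) / n
frac = sum(y < 0.3 for y in ys) / n
assert abs(mean - 0.5) < 0.01
assert abs(frac - 0.3) < 0.01
```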
How about y2?
Is y2 = 0, 1, 2, 3, … since y2 = x2 and x2 = 0, 1, 2, 3, …? NO!
From y1 = x1 + x2 and y2 = x2, we have:
(1) y2 = x2, which means y2 is always non-negative (since x2 is always non-negative);
(2) y2 = y1 − x1, which means y2 always takes a value no greater than y1 (the maximum value of y2 is y1).
From (1) and (2) we get the range of y2: y2 = 0, 1, 2, 3, …, y1.
4. f_{Y1Y2}(y1, y2) = μ1^{y1−y2} μ2^{y2} e^{−μ1} e^{−μ2} / [(y1 − y2)! y2!], (y1, y2) ∈ R_{(Y1,Y2)}; 0 elsewhere.

Two-variable discrete case: from f_{X1X2}(x1, x2) to f_{Y1Y2}(y1, y2):
1. The transformation based on Y1 = g1(X1, X2) and Y2 = g2(X1, X2) maps the space R_{(X1,X2)} to R_{(Y1,Y2)}. (Find R_{(Y1,Y2)}.)
2. Make sure that the transformation sets up a one-to-one correspondence between the points of R_{(X1,X2)} and those of R_{(Y1,Y2)}.
3. Find the inverse functions x1 = w1(y1, y2) and x2 = w2(y1, y2).
4. Replace x1 and x2 in f_{X1X2}(x1, x2) with w1(y1, y2) and w2(y1, y2), respectively. Finally, form the function f_{Y1Y2}(y1, y2).

Symbol: R_{(X1,X2)} = R_{X1X2}
Two-variable continuous case: from f_{X1X2}(x1, x2) to f_{Y1Y2}(y1, y2):
1. The transformation based on Y1 = g1(X1, X2) and Y2 = g2(X1, X2) maps the space R_{(X1,X2)} to R_{(Y1,Y2)}. (Find R_{(Y1,Y2)}.)
2. Make sure that the transformation sets up a one-to-one correspondence between the points of R_{(X1,X2)} and those of R_{(Y1,Y2)}.
3. Find the inverse functions x1 = w1(y1, y2) and x2 = w2(y1, y2).
4. From the inverse functions, find the Jacobian
J = | ∂x1/∂y1  ∂x1/∂y2 |
    | ∂x2/∂y1  ∂x2/∂y2 | ≠ 0.
5. Replace x1 and x2 in f_{X1X2}(x1, x2) with w1(y1, y2) and w2(y1, y2), respectively; then multiply the function by the modulus of the Jacobian, |J|. Finally, form the function f_{Y1Y2}(y1, y2).
Answer:
Solution:
The one-to-one transformation y1 = x1 + x2 and y2 = x2 maps the space R_{(X1,X2)} = {(x1, x2) | 0 < x1 < 1, 0 < x2 < 1} onto the corresponding space R_{(Y1,Y2)}; the resulting density is 0 elsewhere.
Extended to k + 1 variables:
Given Yi = Xi / (X1 + X2 + ⋯ + X_{k+1}), i = 1, 2, …, k, and
Y_{k+1} = X1 + X2 + ⋯ + X_{k+1}, find the joint probability distribution function f_{Y1Y2⋯Y_{k+1}}(y1, y2, …, y_{k+1}).
Solution:
1. The transformation maps the space
R_{X1X2⋯X_{k+1}} = {(x1, x2, …, x_{k+1}) | 0 < xi < ∞, i = 1, 2, …, k + 1} to
R_{Y1Y2⋯Y_{k+1}} = {(y1, y2, …, y_{k+1}) | yi > 0, y1 + y2 + ⋯ + yk < 1, 0 < y_{k+1} < ∞}.
2. The transformations yi = xi / (x1 + x2 + ⋯ + x_{k+1}), i = 1, 2, …, k, and y_{k+1} = x1 + x2 + ⋯ + x_{k+1} set up a one-to-one correspondence between the points of R_X and those of R_Y (a one-to-one transformation), with inverses xi = yi y_{k+1} for i = 1, …, k and x_{k+1} = y_{k+1}(1 − y1 − ⋯ − yk).
4. Jacobian:
J = det | y_{k+1}     0        ⋯    0        y1                 |
        | 0           y_{k+1}  ⋯    0        y2                 |
        | ⋮           ⋮              ⋮        ⋮                  |
        | 0           0        ⋯    y_{k+1}  yk                 |
        | −y_{k+1}    −y_{k+1} ⋯    −y_{k+1} (1 − y1 − ⋯ − yk)  |
  = y_{k+1}^k.
Non-one-to-one transformations: partition R_X into ranges A_i on which the transformation is one-to-one. For each range A_i, you can find one function in terms of y. Finally, sum up all the functions in terms of y (where the range of Y is the same); this forms the function f_Y(y).

Example:
Given f_X(x) = (1/√(2π)) e^{−x²/2}, −∞ < x < ∞. Find f_Y(y) if Y = X².
Solution:
Clearly the transformation y = x² is NOT one-to-one: split the real line into A1 = (−∞, 0) and A2 = (0, ∞), on each of which y = x² is one-to-one.
On A1 (x = −√y):
4. Jacobian: J = −1/(2√y).
5. g_Y(y) = f_X(−√y)|J| = (1/√(2π)) e^{−y/2} · 1/(2√y), 0 < y < ∞.
On A2 (x = √y):
9. Jacobian: J = 1/(2√y).
10. h_Y(y) = f_X(√y)|J| = (1/√(2π)) e^{−y/2} · 1/(2√y), 0 < y < ∞.
Finally,
f_Y(y) = g_Y(y) + h_Y(y), 0 < y < ∞
f_Y(y) = (1/√(2π)) e^{−y/2} · 1/(2√y) + (1/√(2π)) e^{−y/2} · 1/(2√y)
= (1/√(2π)) y^{−1/2} e^{−y/2}, 0 < y < ∞
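The combined density f_Y should still integrate to 1; a minimal numerical sketch (midpoint rule, truncating the tail at y = 40, so the total comes out slightly below 1):

```python
import math

def f_Y(y):
    # density of Y = X^2 for standard normal X, derived above
    return y ** -0.5 * math.exp(-y / 2) / math.sqrt(2 * math.pi)

n = 200_000
h = 40 / n
total = sum(f_Y((i + 0.5) * h) for i in range(n)) * h
assert abs(total - 1.0) < 0.01
```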
Solution:
Let Z = (X − μ)/σ, where the random variable Z has the standard normal distribution
f_Z(z) = (1/√(2π)) e^{−z²/2}, −∞ < z < ∞.
We shall now find the distribution of the random variable Y = Z². The inverse solutions of y = z² are z = ±√y. If we designate z1 = −√y and z2 = √y, then J1 = −1/(2√y) and J2 = 1/(2√y). Hence we have
g_Y(y) = (1/√(2π)) e^{−y/2} · 1/(2√y) + (1/√(2π)) e^{−y/2} · 1/(2√y)
= [1/(2^{1/2} √π)] y^{1/2−1} e^{−y/2}, y > 0.
Since g_Y(y) is a density function, it follows that
1 = [1/(2^{1/2} √π)] ∫_0^∞ y^{1/2−1} e^{−y/2} dy = [Γ(1/2)/√π] · [1/(2^{1/2} Γ(1/2))] ∫_0^∞ y^{1/2−1} e^{−y/2} dy = Γ(1/2)/√π,
the integral being the area under a gamma probability curve with parameters α = 1/2 and β = 2. Therefore, √π = Γ(1/2), and the probability distribution of Y is given by
g_Y(y) = [1/(2^{1/2} Γ(1/2))] y^{1/2−1} e^{−y/2}, y > 0; 0 elsewhere,
which is seen to be a chi-squared distribution with 1 degree of freedom.
Solution:
μ_X = E(X) = Σ_{x=0}^{3} x · (1/16)(2x + 1)
= 0·(1/16) + 1·(3/16) + 2·(5/16) + 3·(7/16)
= 17/8
Example (continuous case):
The probability density function of the continuous random variable X is
f_X(x) = (1/12)(1 + x²), 0 < x < 3; 0 elsewhere.
Find the mean of X.
Solution:
μ_X = E(X) = ∫_0^3 x (1 + x²)/12 dx
= (1/12) ∫_0^3 (x + x³) dx
= (1/12) [x²/2 + x⁴/4]_0^3
= 33/16.
Example:
Suppose in a computer game competition, the probabilities for Ali to score 10, 20 and 30 points are 1/3, 1/5 and 7/15, respectively. The probabilities for Ahmad to score 10, 20 and 30 points are 1/6, 1/3 and 1/2, respectively.
By using expected values, determine who has the better skill in playing the computer game.
Solution:
Let X be the points scored by Ali and Y be the points scored by Ahmad. Then
E(X) = 10 × 1/3 + 20 × 1/5 + 30 × 7/15 = 64/3
E(Y) = 10 × 1/6 + 20 × 1/3 + 30 × 1/2 = 70/3
Since E(Y) > E(X), we may conclude that Ahmad has better skill than Ali.

To find E(X), we have
E(X) = Σ_x x f_X(x) if X is discrete; ∫_{−∞}^{∞} x f_X(x) dx if X is continuous.
How about E[g(X)]?
E[g(X)] = Σ_x g(x) f_X(x) if X is discrete; ∫_{−∞}^{∞} g(x) f_X(x) dx if X is continuous.
The following results are true for both discrete and continuous cases of ONE random variable:
1. E[aX + b] = aE[X] + b
2. E[g(X) ± h(X)] = E[g(X)] ± E[h(X)]

Example (discrete case):
Let X and Y be the random variables with joint probability distribution function indicated below:

f_XY(x,y)      x = 0   x = 1   Row total
y = 0           1/2     1/4      3/4
y = 1           1/8     1/8      1/4
Column total    5/8     3/8       1
E(X) = Σ_{x=0}^{1} x Σ_{y=0}^{1} f_XY(x, y)
= Σ_{x=0}^{1} x [f_XY(x, 0) + f_XY(x, 1)]
= (0)[f_XY(0,0) + f_XY(0,1)] + (1)[f_XY(1,0) + f_XY(1,1)]
= 3/8

Example (continuous case):
Let the joint probability density function be
f_XY(x, y) = x + y, 0 < x < 1, 0 < y < 1; 0 elsewhere. Find E(XY) and E(Y).
E(XY) = ∫_0^1 ∫_0^1 xy(x + y) dx dy = ∫_0^1 ∫_0^1 (x²y + xy²) dx dy
= ∫_0^1 [x³y/3 + x²y²/2]_0^1 dy = ∫_0^1 (y/3 + y²/2) dy
= [y²/6 + y³/6]_0^1 = 1/3
E(Y) = ∫_0^1 ∫_0^1 y f_XY(x, y) dx dy = ∫_0^1 y [∫_0^1 f_XY(x, y) dx] dy
= ∫_0^1 y [∫_0^1 (x + y) dx] dy
= ∫_0^1 y [x²/2 + xy]_0^1 dy
= ∫_0^1 (y/2 + y²) dy
= [y²/4 + y³/3]_0^1 = 7/12

The following results are true for both discrete and continuous cases of TWO random variables, X and Y:
1. E[g(X, Y) ± h(X, Y)] = E[g(X, Y)] ± E[h(X, Y)]
2. X and Y are independent ⇒ E[XY] = E[X]E[Y].

Example:
Let the joint probability density function be
f_XY(x, y) = x + y, 0 < x < 1, 0 < y < 1; 0 elsewhere. Find E(X + Y).
From the previous example, E(Y) = 7/12.
E(X) = ∫_0^1 ∫_0^1 x f_XY(x, y) dx dy = ∫_0^1 x [∫_0^1 (x + y) dy] dx = 7/12.
E(X + Y) = E(X) + E(Y) = 7/12 + 7/12 = 7/6.
Example:
Given two independent random variables with pdf
f_XY(x, y) = 1, 0 < x < 1, 0 < y < 1; 0 otherwise. Verify that E(XY) = E(X)E(Y).
Solution:
E(X) = ∫_0^1 ∫_0^1 x dx dy = 1/2
E(Y) = ∫_0^1 ∫_0^1 y dx dy = 1/2
E(XY) = ∫_0^1 ∫_0^1 xy dx dy = 1/4
We see that E(X)E(Y) = (1/2)(1/2) = 1/4 = E(XY). Hence, E(XY) = E(X)E(Y).
Important remark:
E[XY] ≠ E[X]E[Y] ⇒ X and Y are dependent.
E[XY] = E[X]E[Y] DOES NOT IMPLY that X and Y are independent.
Variance: a small σ² means the pdf is concentrated near the mean μ; a large σ² means it is spread out.
σ²_X = E[(X − μ_X)²] = Σ_x (x − μ_X)² f_X(x) if X is discrete; ∫_{−∞}^{∞} (x − μ_X)² f_X(x) dx if X is continuous.
Note:
σ_X: standard deviation
Solution:
σ²_X = E[(X − μ_X)²] = Σ_{x=0}^{3} (x − μ_X)² f_X(x) = Σ_{x=0}^{3} (x − 17/8)² (1/16)(2x + 1)
= (1/16)[(0 − 17/8)²(1) + (1 − 17/8)²(3) + (2 − 17/8)²(5) + (3 − 17/8)²(7)]
= 55/64

Example (continuous case):
The probability density function of the continuous random variable X is
f_X(x) = (1/12)(1 + x²), 0 < x < 3; 0 elsewhere.
Find the variance of X.
Solution:
σ²_X = E[(X − μ_X)²] = ∫_{−∞}^{∞} (x − μ_X)² f_X(x) dx
= ∫_0^3 (x − 33/16)² (1 + x²)/12 dx
= 699/1280
Example:
Consider the following pmfs:
f_X(x) = 1/5, x = −2, −1, 0, 1, 2, and
f_Y(y) = 1/5, y = −4, −2, 0, 2, 4.
E(Y) = Σ_{y=−4,−2,0,2,4} y · (1/5) = −4/5 − 2/5 + 0 + 2/5 + 4/5 = 0
We may calculate var(X) and var(Y) as follows:
var(X) = Σ_{x=−2,−1,0,1,2} x² f_X(x) = (−2)²(1/5) + (−1)²(1/5) + (0)²(1/5) + (1)²(1/5) + (2)²(1/5) = 2
var(Y) = Σ_{y=−4,−2,0,2,4} y² f_Y(y) = 8
Covariance. Symbol: cov(X, Y) or σ_XY.
cov(X, Y) = E[(X − μ_X)(Y − μ_Y)], OR
σ_XY = E(XY) − μ_X μ_Y

Example:
Let X and Y be the random variables with joint probability distribution function indicated below:

f_XY(x,y)      x = 0   x = 1   Row total
y = 0           1/2     1/4      3/4
y = 1           1/8     1/8      1/4
Column total    5/8     3/8       1

Find cov(X, Y).
Solution 1:
From the previous example, μ_X = E(X) = 3/8.
μ_Y = E(Y) = Σ_{y=0}^{1} Σ_{x=0}^{1} y f_XY(x, y)
= Σ_{y=0}^{1} y [f_XY(0, y) + f_XY(1, y)]
= (0)[f_XY(0,0) + f_XY(1,0)] + (1)[f_XY(0,1) + f_XY(1,1)] = 1/4.
cov(X, Y) = σ_XY = E[(X − μ_X)(Y − μ_Y)] = Σ_{x=0}^{1} Σ_{y=0}^{1} (x − 3/8)(y − 1/4) f_XY(x, y)
= Σ_{x=0}^{1} [(x − 3/8)(0 − 1/4) f_XY(x, 0) + (x − 3/8)(1 − 1/4) f_XY(x, 1)]
= 1/32.
Solution 2:
From previous examples, we see that
μ_X = E(X) = 3/8, μ_Y = E(Y) = 1/4 and E(XY) = 1/8.
cov(X, Y) = E(XY) − μ_X μ_Y
= 1/8 − (3/8)(1/4)
= 1/32
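Solution 2 is mechanical once the joint pmf is in a table; a minimal sketch with exact fractions:

```python
from fractions import Fraction

# joint pmf from the table above, keyed by (x, y)
f = {(0, 0): Fraction(1, 2), (1, 0): Fraction(1, 4),
     (0, 1): Fraction(1, 8), (1, 1): Fraction(1, 8)}

EX = sum(x * p for (x, y), p in f.items())
EY = sum(y * p for (x, y), p in f.items())
EXY = sum(x * y * p for (x, y), p in f.items())
cov = EXY - EX * EY
print(EX, EY, EXY, cov)  # 3/8 1/4 1/8 1/32
```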
Example:
Given two independent random variables with pdf
f_XY(x, y) = 1, 0 < x < 1, 0 < y < 1; 0 otherwise.
Show that cov(X, Y) = 0.
Solution:
E(X) = ∫_0^1 ∫_0^1 x dx dy = 1/2, E(Y) = ∫_0^1 ∫_0^1 y dx dy = 1/2, E(XY) = ∫_0^1 ∫_0^1 xy dx dy = 1/4
cov(X, Y) = E(XY) − μ_X μ_Y = 1/4 − (1/2)(1/2) = 0

Example:
Let X and Y have the joint pmf
f_XY(x, y) = 1/3, (x, y) = (0, 1), (1, 0), (2, 1).
(i) Determine whether X and Y are independent.
(ii) Find cov(X, Y).
Solution:
(i) f_X(0) = 1/3, f_X(1) = 1/3, f_X(2) = 1/3.
f_Y(0) = 1/3, f_Y(1) = 2/3.
We see that f_XY(0, 1) = 1/3, which is not equal to f_X(0) f_Y(1) = (1/3)(2/3) = 2/9.
Thus, X and Y are dependent.
(ii) E(X) = 1, E(Y) = 2/3 and E(XY) = (0)(1)/3 + (1)(0)/3 + (2)(1)/3 = 2/3, so cov(X, Y) = E(XY) − E(X)E(Y) = 2/3 − (1)(2/3) = 0: X and Y are uncorrelated yet dependent.
σ²_{aX+bY} = a²σ²_X + b²σ²_Y + 2ab σ_XY
Proof:
From the definition,
σ²_{aX+bY} = E{[(aX + bY) − μ_{aX+bY}]²}
Now,
μ_{aX+bY} = E(aX + bY) = aE(X) + bE(Y) = aμ_X + bμ_Y
Therefore,
σ²_{aX+bY} = E{[(aX + bY) − (aμ_X + bμ_Y)]²}
= E{[a(X − μ_X) + b(Y − μ_Y)]²}
= a²E[(X − μ_X)²] + b²E[(Y − μ_Y)²] + 2abE[(X − μ_X)(Y − μ_Y)]
= a²σ²_X + b²σ²_Y + 2ab σ_XY
σ²_{aX−bY} = ?
Moment
- Gives useful information about the shape and spread of the distribution function.
- Used to construct estimators for population parameters via the so-called method of moments.
The first moment about the origin, E(X), gives the mean value, a measure of central tendency.
The second moment about the mean describes the dispersion of the pdf (the spread of the random variable).
The fourth moment about the mean has been used as a measure of kurtosis, or peakedness.

Example: For the discrete uniform distribution f_X(x) = 1/N, x = 1, 2, …, N, show that
μ1 = (N + 1)/2 and μ2 = (N + 1)(2N + 1)/6.
Solution:
The well-known formulas for the sums of powers of the first N integers are:
Σ_{1≤x≤N} x = N(N + 1)/2 and Σ_{1≤x≤N} x² = N(N + 1)(2N + 1)/6.
Thus,
μ1 = Σ_{1≤x≤N} x f_X(x) = (Σ_{1≤x≤N} x)/N = N(N + 1)/(2N) = (N + 1)/2.
Similarly,
μ2 = Σ_{1≤x≤N} x² f_X(x) = (Σ_{1≤x≤N} x²)/N = N(N + 1)(2N + 1)/(6N) = (N + 1)(2N + 1)/6.
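The two closed forms are easy to confirm for small N; a minimal sketch:

```python
from fractions import Fraction

# check mu1 and mu2 of the discrete uniform on {1, ..., N}
for N in (3, 10, 50):
    xs = range(1, N + 1)
    mu1 = sum(Fraction(x, N) for x in xs)        # E(X)
    mu2 = sum(Fraction(x * x, N) for x in xs)    # E(X^2)
    assert mu1 == Fraction(N + 1, 2)
    assert mu2 == Fraction((N + 1) * (2 * N + 1), 6)
```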
Kurtosis:
γ2 = E[(X − μ_X)⁴]/σ⁴_X − 3.
For certain pdfs, γ2 is a useful measure of peakedness; relatively "flat" pdfs are said to be platykurtic; more peaked pdfs are called leptokurtic.
Correlation Coefficient
The correlation coefficient of X and Y is given by ρ_XY = cov(X, Y)/(σ_X σ_Y).
If ρ_XY = 0, then the random variables X and Y are said to be uncorrelated.
Remarks:
For any two random variables X and Y,
(a) the correlation coefficient satisfies |ρ_XY| ≤ 1;
(b) there is an exact linear dependency (Y = aX + b) when
(i) ρ_XY = 1 if a > 0, or
(ii) ρ_XY = −1 if a < 0.
Recall the earlier example with f(x, y) = 1/3, (x, y) = (0, 1), (1, 0), (2, 1): there cov(X, Y) = 0 although X and Y are dependent.
Uncorrelated ≠ Independent
A functional relation: the dependent variable (number of products) is determined exactly by the independent variable (number of hours).
Figure 6.15.1 Functional Relation between number of products and number of hours (the points lie exactly on the line of relationship; 0–80 products over 0–6 hours).
The observations for a statistical relation do not fall directly on the curve
of relationship.
A diagram is plotted (Figure 6.15.2) based on the data in Table 6.15.1. The percent
reduction in chemical oxygen demand is taken as the dependent variable or response,
y, and the percent reduction in total solids as the independent variable or regressor, x.
Figure 6.15.2 is called a scatter diagram. In statistical terminology, each point in the
scatter diagram represents a trial or a case. Note that most of the points do not fall
directly on the line of statistical relationship (which do not have the exactitude of a
functional relation) but it can be highly useful.
Figure 6.15.2 Statistical Relation between Solids Reduction (%) and Chemical Oxygen Demand (%) (scatter diagram; both axes run from 0 to 60).
y i = α + βxi + ε i
where
i. α and β are unknown intercept and slope parameters respectively.
ii. y i is the value of the response variable in the ith trial.
iii. xi is a known constant, namely, the value of the independent variable in the ith
trial.
iv. ε i is a random error with E (ε i ) = 0 and var (ε i ) = σ 2 . The quantity σ 2 is
often called the error variance or residual variance.
ŷ i = c1 + c 2 xi
where
i. c1 and c 2 are estimated values for α and β (unknown parameters, so-called
regression coefficient), respectively.
ii. ŷ i is the predicted or fitted value.
- We expect to have a fitted line which is close to the true regression line.
- In order to find “good” estimators of regression coefficients α and β , the method of
least squares is used.
Method of least squares: minimize the sum of squares of the residuals (the sum of squares of the errors about the regression line, SSE):
SSE = Σ_{i=1}^{n} e_i² = Σ_{i=1}^{n} (y_i − ŷ_i)² = Σ_{i=1}^{n} (y_i − c1 − c2 x_i)²
Differentiating with respect to c1 and c2:
∂SSE/∂c1 = −2 Σ_{i=1}^{n} (y_i − c1 − c2 x_i)
∂SSE/∂c2 = −2 Σ_{i=1}^{n} (y_i − c1 − c2 x_i) x_i
Setting the partial derivatives equal to zero, we obtain the following equations:
Σ_{i=1}^{n} y_i = n c1 + c2 Σ_{i=1}^{n} x_i
Σ_{i=1}^{n} x_i y_i = c1 Σ_{i=1}^{n} x_i + c2 Σ_{i=1}^{n} x_i²
The above equations are called the normal equations. The quantities Σ x_i, Σ y_i, Σ x_i y_i and Σ x_i² can be calculated from the relevant data. Solving the normal equations simultaneously, we have
c2 = [n Σ_{i=1}^{n} x_i y_i − (Σ_{i=1}^{n} x_i)(Σ_{i=1}^{n} y_i)] / [n Σ_{i=1}^{n} x_i² − (Σ_{i=1}^{n} x_i)²]
   = Σ_{i=1}^{n} (x_i − x̄)(y_i − ȳ) / Σ_{i=1}^{n} (x_i − x̄)²
and
c1 = [Σ_{i=1}^{n} y_i − c2 Σ_{i=1}^{n} x_i] / n = ȳ − c2 x̄
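The closed-form estimators translate directly into code; a minimal sketch (an exact line should be recovered exactly, up to floating-point error):

```python
def least_squares(xs, ys):
    # closed-form normal-equation solution for y = c1 + c2*x
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    c2 = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    c1 = (sy - c2 * sx) / n
    return c1, c2

# data generated from y = 2x + 1, so c1 -> 1 and c2 -> 2
c1, c2 = least_squares([0, 1, 2, 3], [1, 3, 5, 7])
assert abs(c1 - 1) < 1e-12 and abs(c2 - 2) < 1e-12
```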
Stochastic Processes
A collection of random variables {X(t), t ∈ T} defined on a given probability space, indexed by the time parameter t, where t is in the index set T.
For example, the price of a particular stock counter listed on the stock exchange as a function of time is a stochastic process.
In general (simple random walk),
X_n = X_{n−1} + Z_n, with Z_n = 1 or −1,
P(Z_n = 1) = 1/2, P(Z_n = −1) = 1/2,
X_n = X_0 + Z_1 + Z_2 + … + Z_n and X_0 = 0.
(A sample path for n = 0, …, 5 steps up and down between the values −1 and 2.)
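A sample path of this random walk can be simulated in a few lines; a minimal sketch:

```python
import random

random.seed(42)

def random_walk(n):
    # X_0 = 0; each step adds Z = +1 or -1 with probability 1/2 each
    x = 0
    path = [x]
    for _ in range(n):
        x += random.choice((1, -1))
        path.append(x)
    return path

path = random_walk(5)
assert len(path) == 6 and path[0] == 0
# every step moves by exactly one unit
assert all(abs(b - a) == 1 for a, b in zip(path, path[1:]))
```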
State space
The state space contains all the possible values of X(t).
Symbol: S.
In the stock counter example, the state space is the set of all prices of that particular counter throughout the day.
Index Parameter
The index parameter normally refers to the time parameter t.
Example: X(t) = 1 if the t-th toss is a head, 0 if the t-th toss is a tail.
State space, S = {0, 1}. This is a stochastic process with discrete time and discrete state space.
Classification by time parameter and state space:
- Discrete time, discrete state space: discrete-time stochastic chain/process with a discrete state space.
- Discrete time, continuous state space: discrete-time stochastic process with a continuous state space.
- Continuous time, discrete state space: continuous-time stochastic chain/process with a discrete state space.
- Continuous time, continuous state space: continuous-time stochastic process with a continuous state space.

Common Examples
A game whose moves are determined entirely by dice, such as snakes and ladders or Monopoly, is characterized by a discrete-time stochastic process with a discrete state space.
(Sample path of a continuous-time process X(t), e.g. a stock price fluctuating between 3.10 and 3.18.)
Realization
An assignment, to each t, of a possible value of X(t); one realization is a single sample path plotted against t (for example, its values at t1, t2, t3).
The one-step transition probability matrix. The state space labels the rows
and columns: rows are indexed by the current state (at time n), columns by
the next state (at time n + 1):

           0    1    2    ⋯
     0    P00  P01  P02   ⋯
P =  1    P10  P11   .
     2    P20   .    .
     ⋮     .    .    .
In this matrix,
(i) What is the probability of going from state 1 to state 0 in one step?
Answer: P10.
The entries Pij are the one-step transition probabilities.
Time-homogeneity (stationary transition probabilities): for all n,
P(Xn+1 = j | Xn = i) = P(X1 = j | X0 = i)
Example:
Is

          1    2    3   ← Current state
     1   0.2   1   0.5
T =  2   0.3   0   0.5  (rows: Next state)
     3   0.5   0    0

a transition matrix?

Yes. The way to read the transition probability for this type of matrix is
from 'horizontal' to 'vertical': the column gives the current state, the row
gives the next state, and each column sums to 1. For example, P11 = 0.2,
P21 = 1, P31 = 0.5.

Once we transpose the matrix, we have

              1    2    3
          1  0.2  0.3  0.5
P = Tᵀ =  2   1    0    0
          3  0.5  0.5   0

This is the form that we use throughout the lecture notes; the way to read
this type of matrix is from 'vertical' to 'horizontal' (row = current state,
column = next state), and each row sums to 1.
Remarks:
1. To verify whether a given matrix is a transition matrix, check that
   either every row sums to 1 or every column sums to 1, depending on
   the form of the matrix given.
2. In these lecture notes, the transition matrix is read from
   'vertical' to 'horizontal'.
Example:
Given a transition matrix as below, draw the state transition diagram.
        1     2
  1   0.90  0.10
  2   0.20  0.80

[State transition diagram: two states 1 and 2 with self-loops P11 = 0.90 and
P22 = 0.80, and arrows P12 = 0.10 (1 → 2) and P21 = 0.20 (2 → 1).]
Example:
Given a transition matrix P with state space S = {1, 2, 3, 4} as follows:
          1    2     3     4
     1   0.7   a     0     0
P =  2    c   1−a   a−b    0
     3    0    0     1     d
     4    0    0    0.2   1−b

(a) Find the values of a, b, c and d.
(b) Draw the state transition diagram.

Solution:
(a) Each row must sum to 1:
0.7 + a = 1 ⇒ a = 0.3
c + 1 − a + a − b = 1 ⇒ c = b
1 + d = 1 ⇒ d = 0
0.2 + 1 − b = 1 ⇒ b = 0.2
Thus, a = 0.3, b = 0.2, c = 0.2, d = 0.

(b) [State transition diagram for states 1, 2, 3, 4 with the probabilities above.]
Example
A connection between two communication nodes is modeled by a discrete
time Markov chain. The connection is in any of the following three states.
State 0 – No connection
State 1 – Slow connection
State 2 – Fast connection
For this process, the transition probability matrix and state transition
diagram is given as below:
          0     1     2
     0   0.7   0.2   0.1
P =  1   0.5   0.4   0.1
     2   0.5  0.25  0.25

[State transition diagram: self-loops P00 = 0.7, P11 = 0.4, P22 = 0.25;
transitions P01 = 0.2, P02 = 0.1, P10 = 0.5, P12 = 0.1, P20 = 0.5, P21 = 0.25.]
Given a transition matrix with state space {1, 2}:

        1     2
  1   0.80  0.20
  2   0.10  0.90

how do we find the probability of going from state i to state j in two
steps (i → k → j)?
1. Chapman-Kolmogorov Equations:
Pij(2) = ∑_{k∈S} Pik(1) Pkj(1), where Pik(1) = Pik and Pkj(1) = Pkj.
Example:
Given a transition probability matrix with state space {1, 2} as below:

        1     2
  1   0.90  0.10
  2   0.20  0.80

Find P12(2).
Solution:
Method 1:
From the Chapman-Kolmogorov Equations we have
P12(2) = ∑_{k=1}^{2} P1k(1) Pk2(1)
       = P11 P12 + P12 P22
       = (0.90)(0.10) + (0.10)(0.80)
       = 0.17

Method 2:
P(2) = (0.90 0.10; 0.20 0.80) × (0.90 0.10; 0.20 0.80) = (0.83 0.17; 0.34 0.66)
P12(2) = 0.17
(a) P21(2) = 0.17

(b)
P(3) = P × P(2)
     = (0.80 0.20; 0.10 0.90) × (0.66 0.34; 0.17 0.83)
     = (0.562 ×; × ×)
∴ P11(3) = 0.562.
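Both calculations can be checked numerically: by the Chapman-Kolmogorov equations, the n-step transition matrix is simply the n-th matrix power of P. A minimal pure-Python sketch (the helper names mat_mul and mat_pow are ours):

```python
def mat_mul(A, B):
    """Multiply two matrices given as lists of rows."""
    n, m, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

def mat_pow(P, n):
    """n-step transition matrix P(n) = P^n (Chapman-Kolmogorov)."""
    R = [[float(i == j) for j in range(len(P))] for i in range(len(P))]
    for _ in range(n):
        R = mat_mul(R, P)
    return R

# Matrix from the P12(2) example (states 1, 2 are indices 0, 1 here):
P = [[0.90, 0.10],
     [0.20, 0.80]]
P2 = mat_pow(P, 2)   # P2[0][1] should match P12(2) = 0.17

# Matrix from the P11(3) example:
Q = [[0.80, 0.20],
     [0.10, 0.90]]
Q3 = mat_pow(Q, 3)   # Q3[0][0] should match P11(3) = 0.562
```

Reading off `P2[0][1]` and `Q3[0][0]` reproduces the hand computations above.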
Example:
Given a one-step transition matrix P as below:
          0    1    2    3
     0    0    1    0    0
P =  1   0.2   0   0.8   0
     2    0   0.3  0.3  0.4
     3    0    0    1    0
Solution:
We want to find P(X2 = 1 | X0 = 2) = P21(2).
The 2-step transition matrix is

                0.2    0    0.8    0
P(2) = P × P =   0    0.44  0.24  0.32
                0.06  0.09  0.73  0.12
                 0    0.3   0.3   0.4

so P(X2 = 1 | X0 = 2) = P21(2) = 0.09.
Remark:
Pij(0) = 1 if i = j, and 0 if i ≠ j.
With no movement the process stays in the beginning state, so the
probability involved equals 1; and with no movement it is impossible for
the process to go from one state to another state, so that probability
equals 0.
State Probabilities
Symbol: p j (k ) = P[ X k = j ]
Method 1:
P[Xk+1 = j] = ∑_{i=0}^{∞} P[Xk = i] P[Xk+1 = j | Xk = i] = ∑_{i=0}^{∞} pi(k) Pij.
Method 2:
By using iteration formula (which involves state probability vector, will
be discussed later)
Solution:
(i)  p0(1) = P(X1 = 0)
           = ∑_i P(X1 = 0 | X0 = i) P(X0 = i)
           = P(X1 = 0 | X0 = 0) P(X0 = 0)
           = 0
     p1(1) = P(X1 = 1)
           = ∑_i P(X1 = 1 | X0 = i) P(X0 = i)
           = P(X1 = 1 | X0 = 0) P(X0 = 0)
           = 0.5
(ii) p0(2) = P(X2 = 0)
           = ∑_i P(X2 = 0 | X1 = i) P(X1 = i)
           = 0.75 × 0.5 + 0.75 × 0.5
           = 0.75
     p1(2) = P(X2 = 1) = 0.125
     p2(2) = P(X2 = 2) = 0.125
For example,
p0 (n ) = P ( X n = 0 ), which is the state probability in state 0 at time n.
p1 (n ) = P( X n = 1), which is the state probability in state 1 at time n.
.
.
.
pk (n ) = P( X n = k ), which is the state probability in state k at time n.
Property:
∑_{j=0}^{k} pj(n) = 1, and each element pj(n) is nonnegative.
Method 2:
By n iterations with the one-step transition matrix
p (n ) = p (n − 1)P
Hence, λ = 1, 1 − (p + q).

For λ = 1 − (p + q), the eigenvector equation (P − λI)v = 0 gives

(q p; q p)(x; y) = (0; 0), i.e. qx + py = 0.

Choosing x = p, we have y = −q, so v2 = (p, −q).
P = QDQ⁻¹, where

Q = (1 p; 1 −q),  D = (1 0; 0 1−(p+q)),  Q⁻¹ = 1/(p+q) · (q p; 1 −1).

And hence (you can find Pⁿ easily because Pⁿ = QDⁿQ⁻¹):

Pⁿ = (1 p; 1 −q) (1 0; 0 {1−(p+q)}ⁿ) · 1/(p+q) · (q p; 1 −1)
   = 1/(p+q) · (1  p{1−(p+q)}ⁿ; 1  −q{1−(p+q)}ⁿ) (q p; 1 −1)
   = 1/(p+q) · (q + p{1−(p+q)}ⁿ   p − p{1−(p+q)}ⁿ;
                q − q{1−(p+q)}ⁿ   p + q{1−(p+q)}ⁿ)

Writing λ = 1 − (p + q) and p(0) = [p0 p1], the state probability vector at
time n is

p(n) = p(0) Pⁿ
     = 1/(p+q) · [p0 q + p0 p λⁿ + p1 q − p1 q λⁿ   p0 p − p0 p λⁿ + p1 p + p1 q λⁿ]
     = 1/(p+q) · [(p0 + p1)q  (p0 + p1)p] + λⁿ/(p+q) · [p0 p − p1 q   −p0 p + p1 q]
     = 1/(p+q) · [q  p] + λⁿ/(p+q) · [p0 p − p1 q   −p0 p + p1 q]

(using p0 + p1 = 1).
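Assuming the algebra above is right, the closed form for Pⁿ can be checked against repeated matrix multiplication. A small Python sketch (the function name two_state_power is ours):

```python
def two_state_power(p, q, n):
    """Closed form for P^n of the two-state chain
    P = [[1-p, p], [q, 1-q]], with lam = 1 - (p + q):
    P^n = 1/(p+q) * [[q + p*lam^n, p - p*lam^n],
                     [q - q*lam^n, p + q*lam^n]]"""
    lam = 1.0 - (p + q)
    s = p + q
    return [[(q + p * lam**n) / s, (p - p * lam**n) / s],
            [(q - q * lam**n) / s, (p + q * lam**n) / s]]

def mat_mul(A, B):
    """2x2 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Check against direct multiplication for p = 0.2, q = 0.1, n = 3.
p, q = 0.2, 0.1
P = [[1 - p, p], [q, 1 - q]]
Pn = P
for _ in range(2):        # build P^3 by repeated multiplication
    Pn = mat_mul(Pn, P)
closed = two_state_power(p, q, 3)
```

The two results agree to floating-point precision, which is a quick sanity check on the eigendecomposition.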
Example
Refer to earlier example on connection between two communication
nodes (pg. 12). The connection is in any of the following three states.
State 0 – No connection, State 1 – Slow connection, State 2 – Fast
connection
For this process, the transition probability matrix is given as below:
0.7 0.2 0.1
P = 0.5 0.4 0.1
0.5 0.25 0.25
Assume initially the connection is at full speed: p(0) = (0 0 1)
Then the probabilities of each type of connection after increasing number
of transitions are:
p(1) = p(0)P = (0 0 1)(0.7 0.2 0.1; 0.5 0.4 0.1; 0.5 0.25 0.25) = (0.5 0.25 0.25)
p(2) = p(1)P = (0.5 0.25 0.25)(0.7 0.2 0.1; 0.5 0.4 0.1; 0.5 0.25 0.25) = (0.6 0.2625 0.1375)
p(3) = p(2)P = (0.62 0.2594 0.1206)
p(4) = p(3)P = (0.6240 0.2579 0.1181)
p (5) = (0.6248 0.2575 0.1177 )
p (6) = (0.6250 0.2574 0.1177)
p (7 ) = (0.6250 0.2574 0.1176 )
p (8) = (0.6250 0.2574 0.1176)
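The iteration p(n) = p(n−1)P used above is straightforward to program. A Python sketch (the helper name step is ours) that reproduces p(1) and approaches the limiting vector:

```python
def step(p_vec, P):
    """One iteration of p(n) = p(n-1) P (row vector times matrix)."""
    k = len(P)
    return [sum(p_vec[i] * P[i][j] for i in range(k)) for j in range(k)]

# Transition matrix of the connection example (states 0, 1, 2).
P = [[0.7, 0.2,  0.1],
     [0.5, 0.4,  0.1],
     [0.5, 0.25, 0.25]]

p_vec = [0.0, 0.0, 1.0]     # start at full speed: p(0) = (0 0 1)
history = []
for n in range(8):          # compute p(1), ..., p(8)
    p_vec = step(p_vec, P)
    history.append(p_vec)
# history[0] is p(1) = (0.5, 0.25, 0.25); history[-1] is p(8), close to
# the limiting probabilities (0.6250, 0.2574, 0.1176)
```

After only eight iterations the vector has essentially stopped changing, matching the table above.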
πj = lim_{n→∞} pj(n) = lim_{n→∞} P[Xn = j]
Example:
Consider a transition matrix as follows:
0 1
0 0.8 0.2
1 0.1 0.9
What is the limiting state (stationary) probability vector [π 0 π1 ] ?
Solution:
Compare the transition matrix with P = (1−p p; q 1−q). We see that p = 0.2
and q = 0.1.
First, we may use the following formula to find the state probabilities at
time n:

p(n) = [p0(n) p1(n)] = [q/(p+q)  p/(p+q)] + λ2ⁿ · [(p0 p − p1 q)/(p+q)  (−p0 p + p1 q)/(p+q)]

where λ2 = 1 − (p + q). Substituting p = 0.2 and q = 0.1 (so λ2 = 0.7):

p(n) = [p0(n) p1(n)] = [1/3  2/3] + λ2ⁿ · [(2/3)p0 − (1/3)p1   −(2/3)p0 + (1/3)p1]

Since |λ2| < 1, λ2ⁿ → 0 as n → ∞, and hence [π0 π1] = [1/3  2/3].
Communication
States i and j communicate (written i ↔ j) if state j is accessible from
state i and state i is accessible from state j.
Example:
[State transition diagram with states 1, 2, 3, 4.]
Solution:
C1 = {1}
C2 = {2, 3}
C3 = {4}
Example:
Given a transition probability matrix with state space S = {1, 2, 3} as
shown below, decompose the state space into classes.

          1    2    3
     1   0.5  0.5   0
P =  2   0.7   0   0.3
     3   0.1  0.9   0
Solution:
[State transition diagram for states 1, 2, 3.]
There is only one class (all states communicate with each other), so the
Markov chain is said to be irreducible.
C = {1, 2, 3}
Example:
Given a Markov chain with state space S = {1, 2, 3, 4, 5} and transition
probability matrix as follows:

          1    2    3    4    5
     1   0.4  0.6   0    0    0
     2   0.5  0.5   0    0    0
P =  3    0    0    0    1    0
     4    0    0   0.8   0   0.2
     5    0    0    0    1    0
Decompose the state space, S into equivalence classes.
Solution:
[State transition diagram for states 1, 2, 3, 4, 5.]
States 1 and 2 communicate with each other, and states 3, 4 and 5
communicate with each other, so the equivalence classes are
C1 = {1, 2} and C2 = {3, 4, 5}.
Periodicity
Symbol: d(i) [denotes the period of state i]
d(i) = g.c.d.{ n : Pii(n) > 0 }
(g.c.d. is the largest integer that divides all the n exactly.)
- n is the number of steps from i back to i.
- In between, the process MAY or MAY NOT pass through state i again.
Example:
Given a Markov Chain with transition matrix:
          1  2  3  4
     1    0  1  0  0
P =  2    0  0  1  0      Find d(i), i = 1, 2, 3, 4.
     3    0  0  0  1
     4    1  0  0  0
Solution:
d(1) = g.c.d.{ n : P11(n) > 0 } = g.c.d.{4, 8, 12, …} = 4
Similarly, d (2 ) = d (3) = d (4 ) = 4 .
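The period can also be computed mechanically by scanning the diagonal entries of the first few powers of P. A sketch (the function name period and the cutoff max_n are ours; scanning up to a finite max_n only approximates the g.c.d. over all n, though it is exact for this small chain):

```python
from math import gcd

def period(P, i, max_n=24):
    """Approximate d(i) = g.c.d.{ n >= 1 : P_ii(n) > 0 } by scanning the
    n-step transition matrices for n = 1, ..., max_n."""
    k = len(P)
    def mat_mul(A, B):
        return [[sum(A[r][m] * B[m][c] for m in range(k)) for c in range(k)]
                for r in range(k)]
    d = 0
    Pn = P                       # Pn holds P^n
    for n in range(1, max_n + 1):
        if Pn[i][i] > 0:
            d = gcd(d, n)        # gcd(0, n) == n handles the first hit
        Pn = mat_mul(Pn, P)
    return d

# Cyclic chain 1 -> 2 -> 3 -> 4 -> 1 from the example (states 0-indexed).
P = [[0, 1, 0, 0],
     [0, 0, 1, 0],
     [0, 0, 0, 1],
     [1, 0, 0, 0]]
```

For this chain `period(P, i)` returns 4 for every state, matching d(1) = d(2) = d(3) = d(4) = 4.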
Some remarks:
(i) If Pii(n) = 0 for all n ≥ 1, define d(i) = 0.
(ii) If Pii(n) > 0 for two consecutive values n = s and n = s + 1, then d(i) = 1.
(iii) If i ↔ j , then d(i) = d(j).
(iv) If d (i ) = 1 , then i is said to be aperiodic.
(v) If d (i ) ≥ 2 , then i is said to be periodic.
Example:
Find the period of all the states:
          1    2    3    4
     1    0   0.3   0   0.7
P =  2    1    0    0    0
     3    0    0    0    1
     4   0.2   0   0.8   0
Solution:
First, we determine the number of classes:
[State transition diagram for states 1, 2, 3, 4.]
All four states communicate, so there is only one class, C = {1, 2, 3, 4}.
Every return to state 1 takes an even number of steps (e.g. 1 → 2 → 1 in
two steps, 1 → 4 → 3 → 4 → 1 in four steps), so d(1) = g.c.d.{2, 4, …} = 2;
since i ↔ j implies d(i) = d(j), d(i) = 2 for i = 1, 2, 3, 4.
Can you see the relations between stationary probability vector and
limiting state probability?
π = πP and ∑_{j∈S} πj = 1.
The above formula can also be used for an irreducible, recurrent,
periodic, finite Markov chain.
Example:
Consider a transition matrix as follows:
0 1
0 0.8 0.2
1 0.1 0.9
What is the limiting state (stationary) probability vector [π 0 π1 ] ?
Solution
The above Markov chain fulfills the conditions of being
(i) aperiodic,
(ii) irreducible and
(iii) a finite Markov chain.
π0 = 0.8π0 + 0.1π1
π1 = 0.2π0 + 0.9π1
π0 + π1 = 1
The first equation gives 0.2π0 = 0.1π1, i.e. π1 = 2π0; together with
π0 + π1 = 1 this yields π0 = 1/3 and π1 = 2/3.
Example
Refer to earlier example on connection between two communication
nodes. The connection is in any of the following three states.
State 0 – No connection,
State 1 – Slow connection
State 2 – Fast connection
For this process, the transition probability matrix is given as below:
0.7 0.2 0.1
P = 0.5 0.4 0.1
0.5 0.25 0.25
Then the probabilities of each type of connection after long runs are:
π = πP
0.7 0.2 0.1
(π 0 π 1 π 2 ) = (π 0 π 1 π 2 ) 0.5 0.4 0.1
0.5 0.25 0.25
Equation from the first column:
π0 = 0.7π0 + 0.5π1 + 0.5π2 → 0.3π0 − 0.5π1 − 0.5π2 = 0
Equation from the second column:
π1 = 0.2π0 + 0.4π1 + 0.25π2 → 0.2π0 − 0.6π1 + 0.25π2 = 0
Plus the standard equation
π0 + π1 + π2 = 1
These form a 3 × 3 matrix equation (with the first two equations scaled by
10 and 20 respectively):

(3 −5 −5; 4 −12 5; 1 1 1)(π0; π1; π2) = (0; 0; 1)
→ (π0; π1; π2) = (3 −5 −5; 4 −12 5; 1 1 1)⁻¹ (0; 0; 1) = 1/(−136) · (−85; −35; −16)

π0 = 85/136 = 5/8 = 0.625, π1 = 35/136 = 0.2574, π2 = 16/136 = 2/17 = 0.1176
Compare these probabilities with p(8) in earlier example.
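The same 3 × 3 system can be solved programmatically. A Python sketch (the helper name solve3 is ours) using Gaussian elimination:

```python
def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with
    partial pivoting (sufficient for this small example)."""
    n = 3
    M = [row[:] + [bi] for row, bi in zip(A, b)]   # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# pi = pi*P rearranged as in the example, plus pi0 + pi1 + pi2 = 1:
A = [[3, -5, -5],
     [4, -12, 5],
     [1, 1, 1]]
pi = solve3(A, [0, 0, 1])
# pi should be approximately (0.6250, 0.2574, 0.1176) = (85/136, 35/136, 16/136)
```

The solution matches both the hand computation and the iterated vector p(8) from the earlier example.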
What is fii(n)?
fii(n) is the probability that, starting from state i, the first return to
state i occurs at the nth transition.
If the process starts from state i and is certain to eventually return to
state i, then we say that state i is a recurrent state.
States that are not recurrent are said to be transient.
In other words, a state i is transient if there is a way to leave state i
and never return to state i.
Method 1
Draw and check the state transition diagram.
Method 2
Specify the classes and determine whether they are a closed set or not.
A closed set is a recurrent set.
A set of states S in a Markov Chain is a closed set if no state outside of S
is accessible from any state in S.
Method 3
A state i is recurrent if and only if ∑_{n=1}^{∞} fii(n) = 1.
A state i is transient if and only if ∑_{n=1}^{∞} fii(n) < 1.

Method 4
A state i is recurrent if and only if ∑_{n=1}^{∞} Pii(n) = ∞ (the series diverges).
A state i is transient if and only if ∑_{n=1}^{∞} Pii(n) < ∞ (the series converges).
Example:
Markov Chain with transition matrix:

          1    2    3    4
     1    0    0    1    0
P =  2    1    0    0    0       and S = {1, 2, 3, 4}.
     3   1/2  1/2   0    0
     4   1/4  1/4  1/4  1/4
Solution:
(a)
[State transition diagram for states 1, 2, 3, 4.]
{1, 2, 3} is a closed class, so states 1, 2 and 3 are recurrent; {4} is not
closed (the process can leave state 4 for {1, 2, 3} and never return), so
state 4 is transient.
Ergodic
The most important case is that in which a class is both recurrent and
aperiodic. Such classes are called ergodic and a chain consisting entirely
of one ergodic class is called an ergodic chain. These chains have the
property that Pij(n) becomes independent of the starting state i as n → ∞.
Theorem: Pij(n) = ∑_{k=0}^{n} fij(k) Pjj(n−k), n ≥ 1.
Example:
Given a transition matrix as below
1 2
1 0.90 0.10
2 0.20 0.80
Find f12(3) .
Solution:
f12(1) = P12(1) = 0.10
f12(2) = P12(2) − f12(1) P22(1)
       = 0.17 − (0.10)(0.80)
       = 0.09
f12(3) = P12(3) − f12(1) P22(2) − f12(2) P22(1)
       = 0.219 − (0.10)(0.66) − (0.09)(0.80)
       = 0.081
Check: the only first-passage path of length 3 is 1 → 1 → 1 → 2, so
f12(3) = 0.90 × 0.90 × 0.10 = 0.081.
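The recursion fij(n) = Pij(n) − ∑_{k=1}^{n−1} fij(k) Pjj(n−k) is easy to implement. A Python sketch (the function name first_passage is ours) that reproduces the values above:

```python
def first_passage(P, i, j, N):
    """First-passage probabilities f_ij(n) for n = 1, ..., N via
    f_ij(n) = P_ij(n) - sum_{k=1}^{n-1} f_ij(k) * P_jj(n-k)."""
    m = len(P)
    def mat_mul(A, B):
        return [[sum(A[r][t] * B[t][c] for t in range(m)) for c in range(m)]
                for r in range(m)]
    powers = [None, P]              # powers[n] holds P^n
    for _ in range(N - 1):
        powers.append(mat_mul(powers[-1], P))
    f = [0.0] * (N + 1)
    for n in range(1, N + 1):
        f[n] = powers[n][i][j] - sum(f[k] * powers[n - k][j][j]
                                     for k in range(1, n))
    return f[1:]                    # [f_ij(1), ..., f_ij(N)]

P = [[0.90, 0.10],
     [0.20, 0.80]]
f12 = first_passage(P, 0, 1, 3)     # states 1, 2 are indices 0, 1 here
# f12 should be approximately [0.10, 0.09, 0.081], matching the example
```

The three returned values match the hand computation above.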
When ∑_{n=1}^{∞} fij(n) equals 1, fij(n) can be considered as a probability
distribution for the random variable, the first passage time.

μij = ∞                      if ∑_{n=1}^{∞} fij(n) < 1
μij = ∑_{n=1}^{∞} n fij(n)   if ∑_{n=1}^{∞} fij(n) = 1
Whenever ∑_{n=1}^{∞} fij(n) = 1, μij satisfies uniquely the equation
μij = 1 + ∑_{k≠j} Pik μkj.
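For a finite chain this equation is a linear system in the unknowns μij (one unknown per starting state i, with the target state j fixed), which can be solved directly. A Python sketch (the function name mean_first_passage is ours; the worked numbers use the 2-state matrix from the earlier first-passage example):

```python
def mean_first_passage(P, j):
    """Solve mu_ij = 1 + sum_{k != j} P_ik * mu_kj for every starting
    state i, with the target state j fixed. Rearranged, this is the
    linear system (I - P_j) mu = 1, where P_j is P with column j zeroed."""
    m = len(P)
    A = [[(1.0 if r == c else 0.0) - (P[r][c] if c != j else 0.0)
          for c in range(m)] for r in range(m)]
    b = [1.0] * m
    # Gauss-Jordan elimination (no pivoting; fine for this small example).
    for col in range(m):
        d = A[col][col]
        A[col] = [x / d for x in A[col]]
        b[col] /= d
        for r in range(m):
            if r != col:
                f = A[r][col]
                A[r] = [x - f * y for x, y in zip(A[r], A[col])]
                b[r] -= f * b[col]
    return b                       # b[i] = mu_ij

P = [[0.90, 0.10],
     [0.20, 0.80]]
mu = mean_first_passage(P, 1)      # target state 2 (index 1)
# mu[0] = mu_12 = 10.0 and mu[1] = mu_22 = 3.0
```

Here μ12 = 1 + P11 μ12 gives μ12 = 1/0.1 = 10, and μ22 = 1 + P21 μ12 = 3, which the solver reproduces.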
Solution:
Let state 1: battery 1 is purchased,
    state 2: battery 2 is purchased.
~END~