
2 TWO - DIMENSIONAL RANDOM VARIABLES

Joint distributions - Marginal and conditional distributions - Covariance -


Correlation and Linear regression - Transformation of random variables.

2.1 Introduction

In the previous chapters, we discussed only one-dimensional random variables, i.e., X(s). In this chapter, we extend the discussion to two or more random variables, i.e., X(s), Y(s), · · · .

Examples:

1. Counting the number of heads (X) and the number of tails (Y) simultaneously when coins are tossed.

2. Measuring the pressure (P) and volume (V) of water released from a dam, which gives a 2-dimensional sample space with outcomes (P, V).

Two - Dimensional Random Variable

Let S be a sample space. Let X = X(s) and Y = Y(s) be two functions, each assigning a real number to each outcome s ∈ S. Then (X, Y) is a two-dimensional random variable.

2.2 Joint and marginal distributions

Classification of Joint and Marginal distributions

                                Joint distributions     Marginal distributions
X(s) and Y(s) are d.r.v.        Joint P.M.F.            Marginal P.M.F.
X(s) and Y(s) are c.r.v.        Joint P.D.F.            Marginal P.D.F.

2.3 Two - Dimensional Discrete Random Variable

If the possible values of (X, Y) are finite or countably infinite, then (X, Y) is called a 2-dimensional discrete random variable.
It can be represented as (x_i, y_j), where i = 1, 2, 3, · · · , n; j = 1, 2, 3, · · · , m.

2.3.1 Joint Probability Mass Function of 2D Discrete random variables

Let (X, Y) be a two-dimensional discrete random variable. Then P(X = x_i, Y = y_j) = P(x_i, y_j) = P_{ij} is called the joint probability mass function, and it satisfies the conditions

(i) P(x_i, y_j) ≥ 0, ∀ i, j.

(ii) ∑_{i=1}^{n} ∑_{j=1}^{m} P(x_i, y_j) = 1.

2.3.2 Marginal probability mass function of Discrete r.v. X

If the joint probability distribution of a 2-dimensional random variable (X, Y) is given, then the marginal probability function of X is given by

P(X = x_i) = P(X = x_i, Y = y_1) + P(X = x_i, Y = y_2) + · · · + P(X = x_i, Y = y_m)
           = P_{i1} + P_{i2} + P_{i3} + · · · + P_{im}
           = ∑_{j=1}^{m} P_{ij}   (i fixed, j varies)

P(X = x_i) = P_{i∗}

The set {x_i, P_{i∗}} is called the marginal distribution of X.

2.3.3 Marginal probability mass function of Discrete r.v. Y

If the joint probability distribution of a 2-dimensional random variable (X, Y) is given, then the marginal probability function of Y is given by

P(Y = y_j) = P(X = x_1, Y = y_j) + P(X = x_2, Y = y_j) + · · · + P(X = x_n, Y = y_j)
           = P_{1j} + P_{2j} + P_{3j} + · · · + P_{nj}
           = ∑_{i=1}^{n} P_{ij}   (i varies, j fixed)

P(Y = y_j) = P_{∗j}

The set {y_j, P_{∗j}} is called the marginal distribution of Y.

2.3.4 Conditional Probability

Let (X, Y) be a 2-dimensional discrete random variable with joint probability mass function P(x_i, y_j). The conditional probability function of X given Y is given by

P(X = x_i / Y = y_j) = P(X = x_i, Y = y_j) / P(Y = y_j) = P_{ij} / P_{∗j}

The conditional probability function of Y given X is given by

P(Y = y_j / X = x_i) = P(X = x_i, Y = y_j) / P(X = x_i) = P_{ij} / P_{i∗}

Joint cumulative distribution function of a 2D discrete r.v.

For the two-dimensional discrete random variable (X, Y), the cumulative distribution function is

F(x, y) = P[X ≤ x, Y ≤ y] = ∑_{x_i ≤ x} ∑_{y_j ≤ y} P(X = x_i, Y = y_j)

2.3.5 Independence of pairs of 2D Discrete random variables

Two random variables X and Y are said to be independent if

P(x_i, y_j) = P(x_i) · P(y_j),   i.e., P_{ij} = P_{i∗} · P_{∗j}, ∀ i, j.

2.3.6 Examples of Joint, Marginal, conditional and Independence of 2D Discrete r.v.

Recall the formulas used to solve problems on two-dimensional discrete random variables.

1. ∑_{i=1}^{n} ∑_{j=1}^{m} P(x_i, y_j) = 1. [Use it for checking a JPMF or to find an unknown in a JPMF]

2. P(X = x_i) = ∑_{j=1}^{m} P_{ij}. [Use it to find the MPMF of X]

3. P(Y = y_j) = ∑_{i=1}^{n} P_{ij}. [Use it to find the MPMF of Y]

4. F(x) = P(X ≤ x). [Use it to find the CDF of X]

5. F(y) = P(Y ≤ y). [Use it to find the CDF of Y]

6. F(x, y) = P(X ≤ x, Y ≤ y). [Use it for the JCDF of X and Y]

7. E(X) = ∑_i x_i P(X = x_i). [Use it to find the mean or expectation of X]

8. E(Y) = ∑_j y_j P(Y = y_j). [Use it to find the mean of Y]

9. E(XY) = ∑_i ∑_j x_i y_j P(X = x_i, Y = y_j). [Use it to find the mean of the product XY]

10. P(x_i, y_j) = P(x_i) · P(y_j). [Use it for checking independence of the r.v.s X and Y]

11. P(X = x_i / Y = y_j) = P(X = x_i, Y = y_j) / P(Y = y_j). [Use it for the conditional probability of X given Y]

12. P(Y = y_j / X = x_i) = P(X = x_i, Y = y_j) / P(X = x_i). [Use it for the conditional probability of Y given X]
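The formulas above translate directly into array operations. The following is a minimal sketch (added for illustration; the variable names and the sample PMF are assumptions, not from the original notes) showing formulas 1-3 and 7-10 applied to a joint PMF stored as a NumPy array:

    import numpy as np

    x_vals = np.array([0, 1, 2])          # values of X (rows)
    y_vals = np.array([1, 2, 3])          # values of Y (columns)
    P = np.array([[3, 6, 9],
                  [5, 8, 11],
                  [7, 10, 13]]) / 72.0    # P[i, j] = P(X = x_i, Y = y_j)

    assert np.isclose(P.sum(), 1.0)       # formula 1: probabilities sum to 1

    p_x = P.sum(axis=1)                   # formula 2: marginal PMF of X
    p_y = P.sum(axis=0)                   # formula 3: marginal PMF of Y

    E_X = x_vals @ p_x                    # formula 7
    E_Y = y_vals @ p_y                    # formula 8
    E_XY = x_vals @ P @ y_vals            # formula 9: sum_i sum_j x_i y_j P_ij

    independent = np.allclose(P, np.outer(p_x, p_y))   # formula 10
    print(p_x, p_y, E_X, E_Y, E_XY, independent)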

Example 2.1. Two coins are tossed. X denotes number of heads


and Y denotes number of tails. Find
(a) Joint p.m.f. (b) Marginal p.m.f. of (X, Y)
(c) P(X ≤ 1, Y = 1) (d) P(Y ≤ 1)
(e) F X (1) (f) FY (2)
(g) F(2, 1) (h) Are X and Y independent?
(i) Conditional probability of (j) Conditional probability of
X given Y = 1 Y given X = 2

Solution : W.K.T. by tossing two coins,
The sample space is S = {HH, HT, TH, TT}, n(S) = 4.
X is a discrete r.v. denoting the number of heads, which takes the values 2, 1, 1, 0 over these outcomes.

∴ X = {0, 1, 2}

Y is a discrete r.v. denoting the number of tails, which takes the values 0, 1, 1, 2.

∴ Y = {0, 1, 2}
(a) Joint P.M.F. :

          X = 0                X = 1                X = 2
Y = 0     P(X=0, Y=0) = 0      P(X=1, Y=0) = 0      P(X=2, Y=0) = 1/4
Y = 1     P(X=0, Y=1) = 0      P(X=1, Y=1) = 2/4    P(X=2, Y=1) = 0
Y = 2     P(X=0, Y=2) = 1/4    P(X=1, Y=2) = 0      P(X=2, Y=2) = 0

(b) Marginal P.M.F. of X & Y :

                     X = 0      X = 1      X = 2      Mar. PMF of Y : P_Y(y) = P_{∗j}
Y = 0                0          0          1/4        P(Y = 0) = 1/4
Y = 1                0          2/4        0          P(Y = 1) = 2/4
Y = 2                1/4        0          0          P(Y = 2) = 1/4
Mar. PMF of X :      P(X = 0)   P(X = 1)   P(X = 2)   ∑_i ∑_j P(X = x_i, Y = y_j)
P_X(x) = P_{i∗}      = 1/4      = 2/4      = 1/4      = 1

i.e.,

Marginal P.M.F. of X                                 Marginal P.M.F. of Y

P(X = 0) = 1/4 = ∑_{j} P(X = 0, Y = y_j)             P(Y = 0) = 1/4 = ∑_{i} P(X = x_i, Y = 0)
P(X = 1) = 2/4 = ∑_{j} P(X = 1, Y = y_j)             P(Y = 1) = 2/4 = ∑_{i} P(X = x_i, Y = 1)
P(X = 2) = 1/4 = ∑_{j} P(X = 2, Y = y_j)             P(Y = 2) = 1/4 = ∑_{i} P(X = x_i, Y = 2)

(c) P(X ≤ 1, Y = 1) = P(X = {0, 1}, Y = 1)
                    = P(X = 0, Y = 1) + P(X = 1, Y = 1)
                    = 0 + 2/4 = 2/4

(d) P(Y ≤ 1) = P(Y = {0, 1})
             = P(Y = 0) + P(Y = 1)
             = 1/4 + 2/4 = 3/4

(e) F_X(1) = P(X ≤ 1)
           = P(X = 0) + P(X = 1)
           = 1/4 + 2/4 = 3/4

(f) F_Y(2) = P(Y ≤ 2)
           = P(Y = 0) + P(Y = 1) + P(Y = 2)
           = 1/4 + 2/4 + 1/4 = 1

(g) F(2, 1) = P(X ≤ 2, Y ≤ 1)
            = P(X = 0, 1, 2; Y = 0, 1)
            = P(0, 0) + P(0, 1) + P(1, 0) + P(1, 1) + P(2, 0) + P(2, 1)
            = 0 + 0 + 0 + 2/4 + 1/4 + 0 = 3/4
(h) Independence or dependence of X and Y:

Independent:      P(X = x_i, Y = y_j) = P(X = x_i) · P(Y = y_j), ∀ i, j.
Not independent:  P(X = x_i, Y = y_j) ≠ P(X = x_i) · P(Y = y_j), for any one pair i, j.

For X = 0, Y = 0:
Here P(X = 0, Y = 0) = 0, while
P(X = 0) · P(Y = 0) = (1/4) · (1/4) = 1/16 ≠ P(X = 0, Y = 0)

Hence X and Y are not independent.

(i) Conditional probability of X given Y = 1:

P(X = 0/Y = 1) = P(X = 0, Y = 1)/P(Y = 1) = 0/(2/4) = 0
P(X = 1/Y = 1) = P(X = 1, Y = 1)/P(Y = 1) = (2/4)/(2/4) = 1
P(X = 2/Y = 1) = P(X = 2, Y = 1)/P(Y = 1) = 0/(2/4) = 0

(j) Conditional probability of Y given X = 2:

P(Y = 0/X = 2) = P(X = 2, Y = 0)/P(X = 2) = (1/4)/(1/4) = 1
P(Y = 1/X = 2) = P(X = 2, Y = 1)/P(X = 2) = 0/(1/4) = 0
P(Y = 2/X = 2) = P(X = 2, Y = 2)/P(X = 2) = 0/(1/4) = 0
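As a quick cross-check of Example 2.1, one can enumerate the four equally likely outcomes by brute force. This is an illustrative sketch, not part of the original solution:

    from itertools import product
    from collections import Counter
    from fractions import Fraction

    outcomes = ["".join(t) for t in product("HT", repeat=2)]   # HH, HT, TH, TT
    counts = Counter((s.count("H"), s.count("T")) for s in outcomes)
    pmf = {xy: Fraction(c, len(outcomes)) for xy, c in counts.items()}
    print(pmf)                 # {(2, 0): 1/4, (1, 1): 1/2, (0, 2): 1/4}
    print(pmf.get((0, 1), 0))  # impossible cells have probability 0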
Example 2.2. A two dimensional random variable (X, Y) has the joint p.m.f. P(x, y) = k(2x + 3y), x = 0, 1, 2; y = 1, 2, 3. Find
(a) Probability distribution of X + Y.
(b) P(X > Y)
(c) P[Max(X, Y) = 3]
(d) Probability distribution of Z = Min(X, Y).
(e) Conditional probability of X given Y = 3.
Solution : Given a 2-dimensional discrete r.v. (X, Y) with joint pmf
P(x, y) = k(2x + 3y), x = 0, 1, 2; y = 1, 2, 3,
which contains an unknown k.
The joint pmf is

        X = 0   X = 1   X = 2
Y = 1   3k      5k      7k
Y = 2   6k      8k      10k
Y = 3   9k      11k     13k

To find k,

∑_x ∑_y P(X = x, Y = y) = 1
P(0,1)+P(0,2)+P(0,3)+P(1,1)+P(1,2)+P(1,3)+P(2,1)+P(2,2)+P(2,3) = 1
3k + 6k + 9k + 5k + 8k + 11k + 7k + 10k + 13k = 1
72k = 1
∴ k = 1/72

                     X = 0              X = 1              X = 2              MPMF of Y : P_Y(y) = P_{∗j}
Y = 1                3/72               5/72               7/72               P(Y = 1) = 15/72
Y = 2                6/72               8/72               10/72              P(Y = 2) = 24/72
Y = 3                9/72               11/72              13/72              P(Y = 3) = 33/72
MPMF of X :
P_X(x) = P_{i∗}      P(X = 0) = 18/72   P(X = 1) = 24/72   P(X = 2) = 30/72   ∑_i ∑_j P = 1

Marginal p.m.f. of X, Y:

Marginal P.M.F. of X        Marginal P.M.F. of Y

P(X = 0) = 18/72            P(Y = 1) = 15/72
P(X = 1) = 24/72            P(Y = 2) = 24/72
P(X = 2) = 30/72            P(Y = 3) = 33/72

(a) Prob. dist. of X + Y:

X + Y    P(X + Y)
1        P(X + Y = 1) = P(X = 0, Y = 1) = 3/72
2        P(X + Y = 2) = P(X = 0, Y = 2) + P(X = 1, Y = 1) = 6/72 + 5/72 = 11/72
3        P(X + Y = 3) = P(X = 0, Y = 3) + P(X = 1, Y = 2) + P(X = 2, Y = 1) = 9/72 + 8/72 + 7/72 = 24/72
4        P(X + Y = 4) = P(X = 1, Y = 3) + P(X = 2, Y = 2) = 11/72 + 10/72 = 21/72
5        P(X + Y = 5) = P(X = 2, Y = 3) = 13/72

(b) P(X > Y) = P(X = 2, Y = 1) = 7/72

(c) P[Max(X, Y) = 3]:

P[Max(X, Y) = 3] = P(X = 0, Y = 3) + P(X = 1, Y = 3) + P(X = 2, Y = 3)
                 = 9/72 + 11/72 + 13/72
                 = 33/72

(d) P[Z = Min(X, Y)]:

Z = Min(X, Y)    P(Z)
0                P(Z = 0) = P(0, 1) + P(0, 2) + P(0, 3) = 3/72 + 6/72 + 9/72 = 18/72
1                P(Z = 1) = P(1, 1) + P(1, 2) + P(1, 3) + P(2, 1) = 5/72 + 8/72 + 11/72 + 7/72 = 31/72
2                P(Z = 2) = P(2, 2) + P(2, 3) = 10/72 + 13/72 = 23/72

(e) Conditional probability of X given Y = 3:

P(X = 0/Y = 3) = P(X = 0, Y = 3)/P(Y = 3) = (9/72)/(33/72) = 9/33
P(X = 1/Y = 3) = P(X = 1, Y = 3)/P(Y = 3) = (11/72)/(33/72) = 11/33
P(X = 2/Y = 3) = P(X = 2, Y = 3)/P(Y = 3) = (13/72)/(33/72) = 13/33
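The distributions of X + Y, Max(X, Y) and Min(X, Y) in Example 2.2 can be tabulated mechanically from the joint PMF. A short sketch (assumed helper code, not part of the original solution):

    from collections import defaultdict
    from fractions import Fraction

    p = {(x, y): Fraction(2*x + 3*y, 72) for x in (0, 1, 2) for y in (1, 2, 3)}

    def dist(func):
        d = defaultdict(Fraction)
        for (x, y), prob in p.items():
            d[func(x, y)] += prob
        return dict(d)

    print(dist(lambda x, y: x + y))   # {1: 1/24, 2: 11/72, 3: 1/3, 4: 7/24, 5: 13/72}
    print(dist(max))                  # P(max = 3) = 33/72, as in part (c)
    print(dist(min))                  # {0: 1/4, 1: 31/72, 2: 23/72}, as in part (d)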

Example 2.3. Three balls are drawn at random from a box containing 3 red, 2 white and 4 black balls. X denotes the number of white balls drawn and Y denotes the number of red balls drawn. Find the joint probability distribution P(X, Y), the marginal distributions, and check the independence of X and Y.
Solution:

Given that a box contains 3 red, 2 white and 4 black balls (total balls = 9).

X = the number of white balls drawn
Y = the number of red balls drawn
R_X = {0, 1, 2}
R_Y = {0, 1, 2, 3}

The joint probability distribution P(X, Y) :

          Y = 0                Y = 1                      Y = 2              Y = 3
X = 0     4C3 / 9C3            3C1 · 4C2 / 9C3            3C2 · 4C1 / 9C3    3C3 / 9C3
X = 1     2C1 · 4C2 / 9C3      2C1 · 3C1 · 4C1 / 9C3      2C1 · 3C2 / 9C3    0
X = 2     2C2 · 4C1 / 9C3      2C2 · 3C1 / 9C3            0                  0

i.e.,
The joint and marginal probability distribution :

                     Y = 0    Y = 1    Y = 2    Y = 3    MPMF of X : P_X(x) = P_{i∗}
X = 0                4/84     18/84    12/84    1/84     P(X = 0) = 35/84
X = 1                12/84    24/84    6/84     0        P(X = 1) = 42/84
X = 2                4/84     3/84     0        0        P(X = 2) = 7/84
MPMF of Y :
P_Y(y) = P_{∗j}      20/84    45/84    18/84    1/84     ∑_i ∑_j P(i, j) = 1

Independence of X and Y: Here

P(X = 0, Y = 0) = 4/84                                    (1)
P(X = 0) · P(Y = 0) = (35/84) · (20/84) = 700/7056        (2)
⇒ (1) ≠ (2)

Hence X and Y are not independent.


Example 2.4. From the following table for the bivariate distribution of (X, Y), find

(i) P(X ≤ 1)             (iv) P(X ≤ 1/Y ≤ 3)
(ii) P(Y ≤ 3)            (v) P(Y ≤ 3/X ≤ 1)
(iii) P(X ≤ 1, Y ≤ 3)    (vi) P(X + Y ≤ 4)

        Y = 1    Y = 2    Y = 3    Y = 4    Y = 5    Y = 6
X = 0   0        0        1/32     2/32     2/32     3/32
X = 1   1/16     1/16     1/8      1/8      1/8      1/8
X = 2   1/32     1/32     1/64     1/64     0        2/64

Solution: Here X and Y are discrete random variables.

The joint and marginal pmf table of X and Y is

                      Y = 1    Y = 2    Y = 3    Y = 4    Y = 5    Y = 6    Mar. PMF of X : P_{i∗} = P(X = x_i)
X = 0                 0        0        1/32     2/32     2/32     3/32     1/4
X = 1                 1/16     1/16     1/8      1/8      1/8      1/8      5/8
X = 2                 1/32     1/32     1/64     1/64     0        2/64     1/8
Mar. PMF of Y :
P_{∗j} = P(Y = y_j)   3/32     3/32     11/64    13/64    3/16     1/4      1

(i) P(X ≤ 1) = P(X = 0) + P(X = 1)
             = 1/4 + 5/8
             = 7/8

(ii) P(Y ≤ 3) = P(Y = 1) + P(Y = 2) + P(Y = 3)
              = 3/32 + 3/32 + 11/64
              = 23/64

(iii) P(X ≤ 1, Y ≤ 3) = P(0, 1) + P(0, 2) + P(0, 3) + P(1, 1) + P(1, 2) + P(1, 3)
                      = 0 + 0 + 1/32 + 1/16 + 1/16 + 1/8 = 9/32

(iv) P(X ≤ 1/Y ≤ 3) = P(X ≤ 1, Y ≤ 3) / P(Y ≤ 3)
                    = (9/32) / [P(Y = 1) + P(Y = 2) + P(Y = 3)]
                    = (9/32) / (23/64) = 18/23

(v) P(Y ≤ 3/X ≤ 1) = P(Y ≤ 3, X ≤ 1) / P(X ≤ 1)
                   = (9/32) / (7/8) = (9/32) × (8/7) = 9/28

(vi) P(X + Y ≤ 4)

X + Y    P(X + Y)
1        0
2        0 + 1/16 = 1/16
3        1/32 + 1/16 + 1/32 = 4/32
4        1/8 + 2/32 + 1/32 = 7/32

P(X + Y ≤ 4) = P(0, 1) + P(0, 2) + P(0, 3) + P(0, 4) + P(1, 1) + P(1, 2)
             + P(1, 3) + P(2, 1) + P(2, 2)
             = 0 + 0 + 1/32 + 2/32 + 1/16 + 1/16 + 1/8 + 1/32 + 1/32 = 13/32
Example 2.5. From the following joint distribution of X and Y, find the marginal distributions of X and Y.

        X = 0    X = 1    X = 2
Y = 0   3/28     9/28     3/28
Y = 1   3/14     3/14     0
Y = 2   1/28     0        0

Solution:

             X = 0    X = 1    X = 2    P(Y = y_j)
Y = 0        3/28     9/28     3/28     15/28
Y = 1        3/14     3/14     0        12/28
Y = 2        1/28     0        0        1/28
P(X = x_i)   10/28    15/28    3/28     1

Marginal distribution of X:

X            0        1        2
P(X = x_i)   10/28    15/28    3/28

Marginal distribution of Y:

Y            0        1        2
P(Y = y_j)   15/28    12/28    1/28

Example 2.6. Find the marginal distributions of X and Y and also P(X ≤ 1, Y ≤ 1), P(X ≤ 1), P(Y ≤ 1). Check whether X and Y are independent. The joint probability mass function of X and Y is

        Y = 0    Y = 1    Y = 2
X = 0   0.10     0.04     0.02
X = 1   0.08     0.20     0.06
X = 2   0.06     0.14     0.30

Solution: Here X and Y are discrete random variables.

The joint and marginal pmf of X and Y is

             Y = 0    Y = 1    Y = 2    P(X = x_i)
X = 0        0.10     0.04     0.02     0.16
X = 1        0.08     0.20     0.06     0.34
X = 2        0.06     0.14     0.30     0.50
P(Y = y_j)   0.24     0.38     0.38     1

Marginal distribution of X:
X            0       1       2
P(X = x_i)   0.16    0.34    0.50

Marginal distribution of Y:
Y            0       1       2
P(Y = y_j)   0.24    0.38    0.38

P(X ≤ 1, Y ≤ 1) = P(0, 0) + P(0, 1) + P(1, 0) + P(1, 1)
                = 0.10 + 0.04 + 0.08 + 0.20
                = 0.42

P(X ≤ 1) = P(X = 0) + P(X = 1) = 0.16 + 0.34 = 0.50

P(Y ≤ 1) = P(Y = 0) + P(Y = 1) = 0.24 + 0.38 = 0.62

W.K.T., if X and Y are independent, P(x_i, y_j) = P(x_i) · P(y_j), ∀ i, j.

For X = 1, Y = 0:

P(X = 1, Y = 0) = 0.08                                    (1)
P(X = 1) = 0.34
P(Y = 0) = 0.24
P(X = 1) · P(Y = 0) = (0.34) · (0.24) = 0.0816            (2)

Since (1) ≠ (2), X and Y are not independent.


Example 2.7. Let X and Y have the following joint probability distribution.

        X = 2    X = 4
Y = 1   0.10     0.15
Y = 3   0.20     0.30
Y = 5   0.10     0.15

Show that X and Y are independent.

Solution:

             X = 2    X = 4    P(Y = y_j)
Y = 1        0.10     0.15     0.25
Y = 3        0.20     0.30     0.50
Y = 5        0.10     0.15     0.25
P(X = x_i)   0.40     0.60     1

Here P_{ij} = P_{i∗} · P_{∗j}, ∀ i, j; for instance P(X = 2, Y = 1) = 0.10 = (0.40)(0.25).

Hence X and Y are independent.

Example 2.8. X and Y are two random variables with joint pmf P(x, y) = (1/27)(x + 2y), where X and Y assume the integer values 0, 1 and 2. Find the marginal probability distribution of X.

Solution: Given P(x, y) = (1/27)(x + 2y); x = 0, 1, 2; y = 0, 1, 2.
Here X and Y are discrete random variables.
The joint and marginal pmf of X and Y is

             X = 0    X = 1    X = 2    P(Y = y_j)
Y = 0        0        1/27     2/27     3/27
Y = 1        2/27     3/27     4/27     9/27
Y = 2        4/27     5/27     6/27     15/27
P(X = x_i)   6/27     9/27     12/27    1

Marginal distribution of X:

X            0       1       2
P(X = x_i)   6/27    9/27    12/27
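When the joint PMF is given by a formula, as in Example 2.8, the table and marginals can be generated rather than tabulated by hand. A brief sketch (assumed code, not from the text):

    from fractions import Fraction as F

    p = {(x, y): F(x + 2*y, 27) for x in (0, 1, 2) for y in (0, 1, 2)}
    marg_x = {x: sum(p[(x, y)] for y in (0, 1, 2)) for x in (0, 1, 2)}
    print(marg_x)   # {0: 2/9, 1: 1/3, 2: 4/9}, i.e. 6/27, 9/27, 12/27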

2.4 Two - Dimensional Continuous Random Variable

2.4.1 Joint Probability Distribution of 2D Continuous random variables

If X and Y are continuous random variables, then f(x, y) is called the joint probability density function if it satisfies

(i) f(x, y) ≥ 0.

(ii) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1.

2.4.2 Marginal Probability Density Function of 2D Continuous random variables


The marginal probability density function of X is f(x) = ∫_{−∞}^{∞} f(x, y) dy.

The marginal probability density function of Y is f(y) = ∫_{−∞}^{∞} f(x, y) dx.

2.4.3 Conditional Probability of 2D Continuous random variables


The conditional probability density of X given Y is f(x/y) = f(x, y) / f(y).

The conditional probability density of Y given X is f(y/x) = f(x, y) / f(x).

Joint cumulative distribution function of a 2D continuous r.v.

For the two-dimensional continuous random variable (X, Y), the cumulative distribution function is

F(x, y) = P[X ≤ x, Y ≤ y] = ∫_{−∞}^{y} ∫_{−∞}^{x} f(x, y) dx dy

2.4.4 Independence of 2D Continuous random variables

If X and Y are independent, then

f(x, y) = f(x) · f(y).

2.4.5 Examples of Joint, Marginal, conditional and independence of 2D Continuous r.v.

1. ∫_{R_X} ∫_{R_Y} f(x, y) dx dy = 1. [Use it for checking a JPDF or to find an unknown in a JPDF]

2. f(x) = ∫_{−∞}^{∞} f(x, y) dy. [Use it to find the MPDF of X]

3. f(y) = ∫_{−∞}^{∞} f(x, y) dx. [Use it to find the MPDF of Y]

4. F(x) = ∫_{−∞}^{x} f(x) dx. [Use it to find the CDF of X]

5. F(y) = ∫_{−∞}^{y} f(y) dy. [Use it to find the CDF of Y]

6. P(a < X < b) = ∫_{a}^{b} f(x) dx. [Use it to find a probability for X]

7. P(c < Y < d) = ∫_{c}^{d} f(y) dy. [Use it to find a probability for Y]

8. P(a < X < b, c < Y < d) = ∫_{c}^{d} ∫_{a}^{b} f(x, y) dx dy. [Use it to find a joint probability of X and Y]

9. E(X^r) = ∫_{R_X} x^r f(x) dx. [Use it for the rth moment of X]

10. E(Y^r) = ∫_{R_Y} y^r f(y) dy. [Use it for the rth moment of Y]

11. Var(X) = E(X²) − [E(X)]². [Use it for the variance of X]

12. Var(Y) = E(Y²) − [E(Y)]². [Use it for the variance of Y]

13. f(x/y) = f(x, y)/f(y). [Use it for the conditional probability of X given Y]

14. f(y/x) = f(x, y)/f(x). [Use it for the conditional probability of Y given X]
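Formulas 1-3 amount to double and single integrations, which can be checked symbolically. A sketch using SymPy (an assumed tool choice, not used in the text), applied to the density f(x, y) = x + y of Example 2.20 below:

    import sympy as sp

    x, y = sp.symbols("x y", positive=True)
    f = x + y                                          # joint PDF on the unit square

    total = sp.integrate(f, (x, 0, 1), (y, 0, 1))      # formula 1: must equal 1
    f_x = sp.integrate(f, (y, 0, 1))                   # formula 2: marginal PDF of X
    f_y = sp.integrate(f, (x, 0, 1))                   # formula 3: marginal PDF of Y
    print(total, f_x, f_y)                             # 1, x + 1/2, y + 1/2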

Example 2.9. The bivariate random variable (X, Y) has the p.d.f. f(x, y) = Kx²(8 − y), x < y < 2x, 0 ≤ x ≤ 2. Find
(a) K    (b) f(x)    (c) f(y)    (d) f(x/y)
(e) f(y/x)    (f) f(y/x = 1)    (g) P(1/4 ≤ X ≤ 3/4)
Solution: Given the joint pdf of X and Y is f(x, y) = Kx²(8 − y), x < y < 2x, 0 ≤ x ≤ 2.    (1)

(a) K :

W.K.T., ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1    [∵ f(x, y) has an unknown K]

∫_0^2 ∫_x^{2x} Kx²(8 − y) dy dx = 1    [∵ the y-limits are functions of x]
K ∫_0^2 x² [8y − y²/2]_x^{2x} dx = 1
K ∫_0^2 x² [(16x − 2x²) − (8x − x²/2)] dx = 1
K ∫_0^2 x² [8x − (3/2)x²] dx = 1
K ∫_0^2 [8x³ − (3/2)x⁴] dx = 1
K [2x⁴ − (3/10)x⁵]_0^2 = 1
K [32 − 48/5] = 1
K (112/5) = 1
∴ K = 5/112

∴ (1) ⇒ f(x, y) = (5/112) x²(8 − y), x < y < 2x, 0 ≤ x ≤ 2.    (2)
112

(b) f(x) :

f(x) = ∫_{−∞}^{∞} f(x, y) dy
     = ∫_x^{2x} (5/112) x²(8 − y) dy = (5/112) x² ∫_x^{2x} (8 − y) dy
       (∵ the integration is w.r.t. y, so treat the rest as constant)
     = (5/112) x² [8y − y²/2]_x^{2x}
     = (5/112) x² [(16x − 2x²) − (8x − x²/2)]
     = (5/112) x² [8x − (3/2)x²]
     = (5/224)(16x³ − 3x⁴), 0 < x < 2

(c) f(y) : The support x < y < 2x splits into two regions: R1 with 0 < y < 2, where x runs from y/2 to y, and R2 with 2 < y < 4, where x runs from y/2 to 2.
[Figure: the region between the lines Y = X and Y = 2X for 0 ≤ x ≤ 2, with R1 below the point (2, 2) and R2 between (2, 2) and (2, 4).]

In region R1, f(y) :

f(y) = ∫_{−∞}^{∞} f(x, y) dx
     = ∫_{y/2}^{y} (5/112) x²(8 − y) dx    [refer to region R1]
     = (5/112)(8 − y) ∫_{y/2}^{y} x² dx
     = (5/112)(8 − y) [x³/3]_{y/2}^{y}
     = (5/112)(8 − y) [y³/3 − y³/24]
     = (5/112)(8 − y)(7y³/24)
     = (5/384) y³(8 − y), 0 < y < 2
In Region R2 , f (y) :

f(y) = ∫_{−∞}^{∞} f(x, y) dx
     = ∫_{y/2}^{2} (5/112) x²(8 − y) dx    [refer to region R2]
     = (5/112)(8 − y) ∫_{y/2}^{2} x² dx
     = (5/112)(8 − y) [x³/3]_{y/2}^{2}
     = (5/112)(8 − y) [8/3 − y³/24]
     = (5/2688)(8 − y)(64 − y³), 2 < y < 4

(d) f(x/y) :

f(x/y) = f(x, y)/f(y)
       = [(5/112) x²(8 − y)] / [(5/384) y³(8 − y)] = 24x²/(7y³),             y/2 < x < y, in R1
       = [(5/112) x²(8 − y)] / [(5/2688)(8 − y)(64 − y³)] = 24x²/(64 − y³),  y/2 < x < 2, in R2

(e) f(y/x) :

f(y/x) = f(x, y)/f(x)
       = [(5/112) x²(8 − y)] / [(5/224)(16x³ − 3x⁴)]
       = 2(8 − y) / [x(16 − 3x)], x < y < 2x

(f) f(y/x = 1) :

f(y/x = 1) = f(x = 1, y) / f(x = 1)
           = [(5/112)(8 − y)] / [(5/224)(16 − 3)]
           = (2/13)(8 − y), 1 < y < 2

(g) P(1/4 ≤ X ≤ 3/4) :

P(1/4 ≤ X ≤ 3/4) = ∫_{1/4}^{3/4} f(x) dx
                 = (5/224) ∫_{1/4}^{3/4} (16x³ − 3x⁴) dx
                 = (5/224) [4x⁴ − (3/5)x⁵]_{1/4}^{3/4}
                 ≈ 0.0247

Example 2.10. The joint PDF of random variables X and Y is given by f(x, y) = Kxy e^{−(x²+y²)}, x > 0, y > 0. Find the value of K. Prove that X and Y are independent.


Solution: Here X and Y are c.r.v. with a jpdf containing an unknown K.

W.K.T., ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1

K ∫_0^∞ ∫_0^∞ xy e^{−(x²+y²)} dx dy = 1
K [∫_0^∞ x e^{−x²} dx] [∫_0^∞ y e^{−y²} dy] = 1    (1)

Consider the integral I = ∫_0^∞ x e^{−x²} dx.

Let x² = t, so 2x dx = dt, i.e., x dx = dt/2; when x = 0, t = 0, and when x → ∞, t → ∞.

∴ The integral becomes

I = ∫_0^∞ x e^{−x²} dx = ∫_0^∞ e^{−t} (dt/2)
  = (1/2) [e^{−t}/(−1)]_0^∞
  = (1/2) [0 − (−1)]    [∵ e^{−∞} = 0, e^0 = 1]

I = ∫_0^∞ x e^{−x²} dx = 1/2    (2)

Similarly, ∫_0^∞ y e^{−y²} dy = 1/2    (3)

From (2) and (3),

(1) ⇒ K (1/2)(1/2) = 1
K = 4
∴ f(x, y) = 4xy e^{−(x²+y²)}, x > 0, y > 0

The marginal probability density function of X is

f(x) = ∫_{−∞}^{∞} f(x, y) dy
     = ∫_0^∞ 4xy e^{−(x²+y²)} dy
     = 4x e^{−x²} ∫_0^∞ y e^{−y²} dy
     = 4x e^{−x²} (1/2)    [∵ by (3)]
f(x) = 2x e^{−x²}, x > 0

The marginal probability density function of Y is

f(y) = ∫_{−∞}^{∞} f(x, y) dx
     = ∫_0^∞ 4xy e^{−(x²+y²)} dx
     = 4y e^{−y²} ∫_0^∞ x e^{−x²} dx
     = 4y e^{−y²} (1/2)    [∵ by (2)]
f(y) = 2y e^{−y²}, y > 0

Independence of X and Y:

f(x) · f(y) = (2x e^{−x²})(2y e^{−y²})
            = 4xy e^{−(x²+y²)}
f(x) · f(y) = f(x, y)

Hence X and Y are independent.
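The steps of Example 2.10 (solving for K, computing the marginals, checking independence) can be verified symbolically. A sketch assuming SymPy:

    import sympy as sp

    x, y, K = sp.symbols("x y K", positive=True)
    f = K * x * y * sp.exp(-(x**2 + y**2))

    K_val = sp.solve(sp.integrate(f, (x, 0, sp.oo), (y, 0, sp.oo)) - 1, K)[0]
    f = f.subs(K, K_val)                                   # K = 4
    f_x = sp.integrate(f, (y, 0, sp.oo))                   # 2*x*exp(-x**2)
    f_y = sp.integrate(f, (x, 0, sp.oo))                   # 2*y*exp(-y**2)
    print(K_val, sp.simplify(f_x * f_y - f))               # 4, 0 -> independent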

Example 2.11. Let X and Y be two random variables with joint pdf

f(x, y) = 8xy/k, 1 < x < y < 2; f(x, y) = 0 otherwise.

(i) Compute the value of k.

(ii) Find the marginal densities.

(iii) Are X and Y independent?

Solution: Here X and Y are c.r.v. with a jpdf containing an unknown k.

(i) To find k:

W.K.T., ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1

∫_1^2 ∫_x^2 (8xy/k) dy dx = 1
(8/k) ∫_1^2 [xy²/2]_x^2 dx = 1
(8/k) ∫_1^2 (2x − x³/2) dx = 1
(8/k) [x² − x⁴/8]_1^2 = 1
(8/k) [(4 − 2) − (1 − 1/8)] = 1
9/k = 1
k = 9

∴ f(x, y) = (8/9)xy, 1 < x < y < 2; f(x, y) = 0 otherwise.


(ii) To find the marginal densities of X and Y:

Marginal density of X:
f(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫_x^2 (8xy/9) dy
     = (8x/9) [y²/2]_x^2
     = (8x/9) [2 − x²/2]
f(x) = (4x/9)(4 − x²), 1 < x < 2

Marginal density of Y:
f(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫_1^y (8xy/9) dx
     = (8y/9) [x²/2]_1^y
     = (8y/9) [y²/2 − 1/2]
f(y) = (4y/9)(y² − 1), 1 < y < 2

(iii) Independence of X and Y:
Here f(x) · f(y) ≠ f(x, y).
Hence X and Y are not independent.
Example 2.12. The joint pdf of a 2-dimensional random variable (X, Y) is given by f(x, y) = xy² + x²/8, 0 ≤ x ≤ 2, 0 ≤ y ≤ 1. Compute
(i) P(X > 1)
(ii) P(Y < 1/2)
(iii) P(X > 1, Y < 1/2)
(iv) P(X < Y)
(v) P(X + Y ≤ 1)

Solution: Here X and Y are c.r.v. with jpdf

f(x, y) = xy² + x²/8, 0 ≤ x ≤ 2, 0 ≤ y ≤ 1

(i) P(X > 1) = ∫_0^1 ∫_1^2 f(x, y) dx dy
             = ∫_0^1 ∫_1^2 (xy² + x²/8) dx dy
             = ∫_0^1 [x²y²/2 + x³/24]_1^2 dy
             = ∫_0^1 [(2y² + 1/3) − (y²/2 + 1/24)] dy
             = ∫_0^1 [(3/2)y² + 7/24] dy
             = [(3/2)(y³/3) + (7/24)y]_0^1
             = 1/2 + 7/24

P(X > 1) = 19/24

(ii) P(Y < 1/2) = ∫_0^{1/2} ∫_0^2 f(x, y) dx dy
                = ∫_0^{1/2} ∫_0^2 (xy² + x²/8) dx dy
                = ∫_0^{1/2} [x²y²/2 + x³/24]_0^2 dy
                = ∫_0^{1/2} (2y² + 1/3) dy
                = [2y³/3 + y/3]_0^{1/2}
                = 1/12 + 1/6 = 3/12

P(Y < 1/2) = 1/4

(iii) P(X > 1, Y < 1/2) = ∫_0^{1/2} ∫_1^2 f(x, y) dx dy
                        = ∫_0^{1/2} ∫_1^2 (xy² + x²/8) dx dy
                        = ∫_0^{1/2} [x²y²/2 + x³/24]_1^2 dy
                        = ∫_0^{1/2} [(2y² + 1/3) − (y²/2 + 1/24)] dy
                        = ∫_0^{1/2} [(3/2)y² + 7/24] dy
                        = [(3/2)(y³/3) + (7/24)y]_0^{1/2}
                        = 3/48 + 7/48

P(X > 1, Y < 1/2) = 5/24

(iv) P(X < Y) = ∫_0^1 ∫_0^y f(x, y) dx dy
              = ∫_0^1 ∫_0^y (xy² + x²/8) dx dy
              = ∫_0^1 [x²y²/2 + x³/24]_0^y dy
              = ∫_0^1 (y⁴/2 + y³/24) dy
              = [y⁵/10 + y⁴/96]_0^1
              = 1/10 + 1/96

P(X < Y) = 53/480

(v) P(X + Y ≤ 1) = P(X ≤ 1 − Y)
                 = ∫_0^1 ∫_0^{1−y} f(x, y) dx dy
                 = ∫_0^1 ∫_0^{1−y} (xy² + x²/8) dx dy
                 = ∫_0^1 [x²y²/2 + x³/24]_0^{1−y} dy
                 = ∫_0^1 [(1 − y)²y²/2 + (1 − y)³/24] dy
                 = ∫_0^1 [(y² + y⁴ − 2y³)/2 + (1 − 3y + 3y² − y³)/24] dy
                 = (1/2)[y³/3 + y⁵/5 − y⁴/2]_0^1 + (1/24)[y − 3y²/2 + y³ − y⁴/4]_0^1
                 = (1/2)(1/3 + 1/5 − 1/2) + (1/24)(1 − 3/2 + 1 − 1/4)

P(X + Y ≤ 1) = 13/480

Example 2.13. If the joint pdf of X and Y is given by f(x, y) = (1/8)(6 − x − y), 0 < x < 2, 2 < y < 4, find
(a) P(X < 1, Y < 3)    (b) P(X + Y < 3)    (c) P(X + Y > 3)
(d) P(X < 1/Y < 3)    (e) f(x/y = 1)

Solution: Here X and Y are c.r.v. with jpdf

f(x, y) = (1/8)(6 − x − y), 0 < x < 2, 2 < y < 4.

(a) P(X < 1 ∩ Y < 3) = P(X < 1, Y < 3)
  = ∫_2^3 ∫_0^1 (1/8)(6 − x − y) dx dy = (1/8) ∫_2^3 [6x − x²/2 − xy]_0^1 dy
  = (1/8) ∫_2^3 (6 − 1/2 − y) dy = (1/8) [(11/2)y − y²/2]_2^3
  = (1/8) [(33/2 − 9/2) − (11 − 2)] = (1/8)(12 − 9) = 3/8

(b) P(X + Y < 3)

[Figure: the region of integration, bounded by the line x + y = 3 within the strip 0 < x < 2, 2 < y < 4.]

= ∫_2^3 ∫_0^{3−y} (1/8)(6 − x − y) dx dy
= (1/8) ∫_2^3 [6x − x²/2 − xy]_0^{3−y} dy
= (1/8) ∫_2^3 [6(3 − y) − (3 − y)²/2 − y(3 − y)] dy
= (1/8) ∫_2^3 [18 − 6y − (3 − y)²/2 − 3y + y²] dy
= (1/8) ∫_2^3 [18 − 9y + y² − (3 − y)²/2] dy
= (1/8) [18y − 9y²/2 + y³/3 + (3 − y)³/6]_2^3
= (1/8) [18(3 − 2) − (9/2)(9 − 4) + (27 − 8)/3 + (0 − 1)/6]
= (1/8) [(108 − 135 + 38 − 1)/6]
= (1/8)(5/3)
= 5/24

(c) P(X + Y > 3) :

P(X + Y > 3) = 1 − P(X + Y < 3)
             = 1 − 5/24
             = 19/24

(d) P(X < 1/Y < 3) :

P(X < 1/Y < 3) = P(X < 1, Y < 3) / P(Y < 3)    (2)

where P(Y < 3) = ∫_2^3 f(y) dy    (3)

Now, f(y) = ∫_{−∞}^{∞} f(x, y) dx
          = ∫_0^2 (1/8)(6 − x − y) dx
          = (1/8) [6x − x²/2 − xy]_0^2
          = (1/8) [(12 − 2 − 2y) − (0 − 0 − 0)]
          = (1/8)(10 − 2y)
          = (1/4)(5 − y), 2 < y < 4

∴ (3) ⇒ P(Y < 3) = ∫_2^3 (1/4)(5 − y) dy
                 = (1/4) [5y − y²/2]_2^3
                 = (1/4) [(15 − 9/2) − (10 − 2)]
                 = (1/4)(7 − 9/2)
                 = (1/4)(5/2)
                 = 5/8

∴ (2) ⇒ P(X < 1/Y < 3) = P(X < 1, Y < 3)/P(Y < 3) = (3/8)/(5/8) = 3/5    [∵ by (a) and (3)]
(e) f(x/y = 1) :

f(x/y = 1) = [f(x, y)/f(y)]_{y=1}
           = [(1/8)(6 − x − y)] / [(1/4)(5 − y)] at y = 1
           = (1/8)(5 − x) / [(1/4)(4)]
           = (5 − x)/8

Example 2.14. Given the joint p.d.f. f(x, y) = 2, 0 < y < x < 1. Find
(a) Marginal p.d.f. of X    (b) Marginal p.d.f. of Y
(c) Cond. prob. of X given Y    (d) Cond. prob. of Y given X
(e) Are X and Y independent?
Solution : Given the joint p.d.f. f(x, y) = 2, 0 < y < x < 1.    (1)

[Figure: the triangular region 0 < y < x < 1 below the line y = x.]

(a) Marginal p.d.f. of X

f(x) = ∫_{−∞}^{∞} f(x, y) dy
     = ∫_{y=0}^{y=x} 2 dy    (∵ by (1))
     = 2[y]_0^x
     = 2[(x) − (0)]
∴ f(x) = 2x, 0 < x < 1    (use extreme limits)

(b) Marginal p.d.f. of Y

f(y) = ∫_{−∞}^{∞} f(x, y) dx
     = ∫_{x=y}^{x=1} 2 dx    (∵ by (1))
     = 2[x]_y^1
     = 2[(1) − (y)]
∴ f(y) = 2(1 − y), 0 < y < 1    (use extreme limits)

(c) Conditional probability of X given Y

f(x/y) = f(x, y)/f(y)
       = 2/[2(1 − y)]    (∵ by (1) and (b))
∴ f(x/y) = 1/(1 − y), y < x < 1

(d) Conditional probability of Y given X

f(y/x) = f(x, y)/f(x)
       = 2/(2x)    (∵ by (1) and (a))
∴ f(y/x) = 1/x, 0 < y < x

(e) Are X and Y independent?
If X and Y are independent, then f(x, y) = f(x) · f(y).
If X and Y are not independent, then f(x, y) ≠ f(x) · f(y).

Here f(x, y) = 2    (∵ by (1))

f(x) · f(y) = 2x · 2(1 − y)
            = 4x(1 − y)
            ≠ f(x, y)

∴ X and Y are not independent.

Example 2.15. A gun is aimed at a certain point (the origin of the coordinate system). Because of random factors, the actual hit point can be any point (x, y) in a circle of radius R about the origin. Assume that the joint density function of X and Y is constant in this circle, i.e.,

f(x, y) = c, x² + y² ≤ R²; f(x, y) = 0 otherwise.

(a) Find c.    (b) Show that f(x) = (2/(πR)) √(1 − (x/R)²) for −R ≤ x ≤ R, and 0 otherwise.

Solution : Given that the joint density function of X and Y is constant in this circle:

f(x, y) = c, x² + y² ≤ R²; 0 otherwise.    (1)

(a) Value of c :

W.K.T. ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1    (∵ f(x, y) contains an unknown)

∫∫_{x²+y²≤R²} c dx dy = 1
πR² c = 1    (the integral of the constant c over the disc is c times its area)
∴ c = 1/(πR²)

∴ (1) ⇒ f(x, y) = 1/(πR²), x² + y² ≤ R²; 0 otherwise.    (2)

(b) Proof for f(x) :

f(x) = ∫_{−∞}^{∞} f(x, y) dy
     = ∫_{−√(R²−x²)}^{√(R²−x²)} (1/(πR²)) dy
     = (2/(πR²)) ∫_0^{√(R²−x²)} dy    (∵ the integrand is a constant, hence an even function of y)
     = (2/(πR²)) [√(R² − x²) − 0]
     = (2/(πR²)) · R √(1 − (x/R)²)
     = (2/(πR)) √(1 − (x/R)²), −R ≤ x ≤ R.

Example 2.16. Define the joint probability distribution function of two random variables X and Y and state its properties.
Solution: Joint probability distribution function of two random variables X and Y :
We define the probability of the joint event {X ≤ x, Y ≤ y}, which is a function of the numbers x and y, as the joint probability distribution function, denoted by the symbol F_{XY}(x, y). Hence
F_{XY}(x, y) = P{X ≤ x, Y ≤ y}
Properties of the joint distribution :

(i) F X,Y (−∞, y) = F X,Y (x, −∞) = F X,Y (−∞, −∞) = 0

(ii) F X,Y (∞, ∞) = 1

(iii) 0 ≤ F X,Y (x, y) ≤ 1

(iv) F X,Y (x, y) is a non-decreasing function of x and y.

(v) F X,Y (x, ∞) = F X (x) and F X,Y (∞, y) = FY (y)

(vi) P(a < X < b, c < Y < d) = F X,Y (b, d) + F X,Y (a, c) − F X,Y (a, d) − F X,Y (b, c)

(vii) P(a < X < b, Y ≤ y) = F(b, y) − F(a, y)

(viii) P(X ≤ x, c < Y < d) = F(x, d) − F(x, c)


(ix) At points of continuity : f(x, y) = ∂²F(x, y)/∂x∂y or ∂²F(x, y)/∂y∂x
For a given function to be valid joint distribution function of two
dimensional r.v.s X and Y, it must satisfy the properties (i),(ii) and (vi).

Example 2.17. The joint probability density function of a bivariate r.v. (X, Y) is f_{XY}(x, y) = k(x + y), 0 < x < 2, 0 < y < 2; and 0 otherwise. Find k.
Solution: By the property of a joint p.d.f. we have

∫∫_R f(x, y) dx dy = 1
∫_0^2 ∫_0^2 k(x + y) dx dy = 1
k ∫_0^2 [x²/2 + xy]_{x=0}^{x=2} dy = 1
k ∫_0^2 (2 + 2y) dy = 1
8k = 1
k = 1/8
Example 2.18. The joint p.d.f. of two random variables X and Y is f(x, y) = cx(x − y), 0 < x < 2, −x < y < x. Find c.
Solution: By the property of a jpdf, we have

∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1
∫_0^2 ∫_{−x}^{x} cx(x − y) dy dx = 1
c ∫_0^2 x [xy − y²/2]_{−x}^{x} dx = c ∫_0^2 2x³ dx = 8c = 1
c = 1/8
Example 2.19. Find k if the joint probability density function of a bivariate random variable (X, Y) is given by
f(x, y) = k(1 − x)(1 − y), 0 < x, y < 1; and 0 otherwise.
Solution: By the property of a joint p.d.f. we have

∫∫_R f(x, y) dx dy = 1
∫_0^1 ∫_0^1 k(1 − x)(1 − y) dx dy = 1
k ∫_0^1 [x − x²/2 − xy + x²y/2]_{x=0}^{x=1} dy = 1
k ∫_0^1 (1/2 − y + y/2) dy = 1
k(1/4) = 1
k = 4
Example 2.20. If X and Y have joint p.d.f. f(x, y) = x + y, 0 < x < 1, 0 < y < 1; and 0 otherwise, check whether X and Y are independent.
Solution: If X and Y are independent then f(x, y) = f(x) · f(y).
The marginal density function of X is

f(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫_0^1 (x + y) dy = x + 1/2

The marginal density function of Y is

f(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫_0^1 (x + y) dx = y + 1/2

∴ f(x) f(y) = (x + 1/2)(y + 1/2) ≠ x + y = f(x, y)

i.e., f(x) · f(y) ≠ f(x, y)

Hence X and Y are not independent.


Example 2.21. For λ > 0, let F(x, y) = 1 − λe^{−λ(x+y)} if x > 0, y > 0, and 0 otherwise. Check whether F can be the joint probability distribution function of two random variables X and Y.
Solution:

F(x, y) = 1 − λe^{−λ(x+y)}, if x > 0, y > 0; 0 otherwise
∂F(x, y)/∂y = λ²e^{−λ(x+y)}, if x > 0, y > 0; 0 otherwise
∂²F(x, y)/∂x∂y = −λ³e^{−λ(x+y)}, if x > 0, y > 0; 0 otherwise

Since ∂²F(x, y)/∂x∂y < 0, F is not a joint probability distribution function.
Example 2.22. Find the marginal density functions of X and Y if f(x, y) = 2(2x + 5y)/5, 0 ≤ x ≤ 1, 0 ≤ y ≤ 1.
Solution: The marginal density function of X is given by

f_X(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫_0^1 (2/5)(2x + 5y) dy = (4x + 5)/5.

The marginal density function of Y is given by

f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫_0^1 (2/5)(2x + 5y) dx = (2 + 10y)/5.
Example 2.23. Can the joint distribution of two random variables X and Y be obtained if their marginal distributions are known?
Solution: In general, no: the marginal distributions alone do not determine the joint distribution. Only when X and Y are statistically independent does the joint probability density function equal the product of the two marginal probability density functions, i.e., f_{XY}(x, y) = f_X(x) f_Y(y) if and only if X and Y are independent.

2.5 Expectation involving two or more random variables

Recall the concept of expectation for one- and two-dimensional random variables:

E(X) = X̄ = ∑_i x_i P(x_i), if X is a discrete random variable;
E(X) = X̄ = ∫_{−∞}^{∞} x f(x) dx, if X is a continuous random variable.

E(Y) = Ȳ = ∑_i y_i P(y_i), if Y is a discrete random variable;
E(Y) = Ȳ = ∫_{−∞}^{∞} y f(y) dy, if Y is a continuous random variable.

E(XY) = ∑_x ∑_y xy P(x, y), if X and Y are discrete random variables;
E(XY) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xy f(x, y) dx dy, if X and Y are continuous random variables.

2.5.1 Examples of Expectation involving two or more discrete r.v.s

Example 2.24. Let X and Y have the following joint probability distribution

        X = 2    X = 4
Y = 1   0.10     0.15
Y = 3   0.20     0.30
Y = 5   0.10     0.15

(a) Find E(X), E(Y) and E(XY).

(b) Find E(X − 2Y).



(c) Show that X and Y are independent.

Solution: The marginal distributions of the given joint distribution are

                  X = 2    X = 4    P_Y(y) = P_{∗j}
Y = 1             0.10     0.15     0.25
Y = 3             0.20     0.30     0.50
Y = 5             0.10     0.15     0.25
P_X(x) = P_{i∗}   0.40     0.60     1

(a) E(X) = ∑_i x_i P(X = x_i)
         = 2(0.40) + 4(0.60)
         = 0.8 + 2.4
         = 3.2

E(Y) = ∑_j y_j P(Y = y_j)
     = 1(0.25) + 3(0.50) + 5(0.25)
     = 0.25 + 1.5 + 1.25
     = 3

E(XY) = ∑_i ∑_j x_i y_j P(X = x_i, Y = y_j)
      = (2)(1)(0.10) + (2)(3)(0.20) + (2)(5)(0.10) + (4)(1)(0.15) + (4)(3)(0.30) + (4)(5)(0.15)
      = 9.6

(b) E(X − 2Y) = E(X) − 2E(Y)    [∵ by the property of expectation]
              = 3.2 − 2(3)
              = −2.8

(c) For X and Y to be independent, we have to prove

P_{ij} = P_{i∗} × P_{∗j}, ∀ i, j

Now, P_{21} = 0.10, P_{2∗} = 0.40, P_{∗1} = 0.25

∴ P_{21} = P_{2∗} × P_{∗1}

Similarly,
P_{23} = P_{2∗} × P_{∗3}
P_{25} = P_{2∗} × P_{∗5}
P_{41} = P_{4∗} × P_{∗1}
P_{43} = P_{4∗} × P_{∗3}
P_{45} = P_{4∗} × P_{∗5}
∴ X and Y are independent.

2.5.2 Examples of Expectation involving two or more continuous r.v.s

Example 2.25. Let X and Y be random variables with joint density function f_{XY}(x, y) = 4xy, 0 ≤ x ≤ 1, 0 ≤ y ≤ 1; and 0 otherwise. Find E(XY).
Solution:

E[XY] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xy f(x, y) dx dy
      = ∫_0^1 ∫_0^1 xy(4xy) dx dy
      = 4 ∫_0^1 ∫_0^1 x²y² dx dy
      = 4/9
Example 2.26. Two random variables X and Y have joint probability density function f(x, y) = xy/96, 0 < x < 4, 1 < y < 5; and 0 otherwise. Find E[XY].
Solution:

E[XY] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xy f(x, y) dx dy
      = ∫_1^5 ∫_0^4 xy (xy/96) dx dy
      = (1/96) [∫_0^4 x² dx] [∫_1^5 y² dy]
      = (1/96) [x³/3]_0^4 [y³/3]_1^5
      = (1/96) [(64 − 0)/3] [(125 − 1)/3]
      = 248/27

2.6 Co - Variance

Let (X, Y) be a two-dimensional random variable. The covariance between X and Y is given by

Cov(X, Y) = E{[X − E(X)][Y − E(Y)]}

Note:

E{[X − E(X)][Y − E(Y)]} = E[XY − X E(Y) − Y E(X) + E(X)E(Y)]
                        = E(XY) − E(X)E(Y) − E(Y)E(X) + E(X)E(Y)
                        = E(XY) − E[X] · E[Y].

Properties :
1. Cov(X, Y) = E[XY] − E[X] · E[Y].
2. Cov(X, X) = E[XX] − E[X] · E[X] = E(X²) − [E(X)]².
3. Cov(X + a, Y + b) = Cov(X, Y).    (1)
Proof:

LHS of (1) = E[(X + a)(Y + b)] − E[X + a] × E[Y + b]
           = E[XY + bX + aY + ab] − (E[X] + a)(E[Y] + b)
           = E[XY] + bE[X] + aE[Y] + ab − E[X]E[Y] − bE[X] − aE[Y] − ab
           = Cov(X, Y)
           = RHS of (1)

4. Cov(aX, bY) = ab Cov(X, Y).
5. If X and Y are independent, Cov(X, Y) = 0.
6. Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y).
7. Var(X − Y) = Var(X) + Var(Y) − 2 Cov(X, Y).
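Properties 3, 6 and 7 are easy to illustrate numerically on sample data. A sketch (assumed code, not from the text; NumPy's population covariance needs ddof=0):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=100_000)
    Y = 0.5 * X + rng.normal(size=100_000)

    cov = lambda a, b: np.cov(a, b, ddof=0)[0, 1]
    print(np.isclose(cov(X + 2, Y - 5), cov(X, Y)))                        # property 3
    print(np.isclose(np.var(X + Y), np.var(X) + np.var(Y) + 2*cov(X, Y)))  # property 6
    print(np.isclose(np.var(X - Y), np.var(X) + np.var(Y) - 2*cov(X, Y)))  # property 7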

2.7 Correlation

If two variables vary in such a way that a change in one variable is accompanied by a change in the other, the variables are said to be correlated. Correlation analysis is a statistical tool used to measure the degree of relationship between two variables.
Types of Correlation :
Positive Correlation : If the increase in one variable results in

corresponding increase in the other (or) decrease in one variable results


in corresponding decrease in other. This is called positive or direct
correlation.
Eg:
Best learning and Best marks.
Height and weight of a person.
The more time you spend running on a treadmill, the more calories you
will burn.
Negative Correlation : If the increase in one variable results in
corresponding decrease in other (or) decrease in one variable results in
corresponding increase in other is called negative or inverse correlation.
Eg:
More production and less price of a commodity.
Alcohol consumption and driving ability.
The more vitamins one takes, the less likely one is to have a deficiency.
Simple and Multiple Correlation : If only two variables are
considered for correlation analysis then it is called simple correlation.
Eg: The relation between rainfall(1) and yield(2) of crops.
If more than two variables are considered for correlation analysis then
it is called multiple correlation.
Eg: The relation between rainfall(1), usage of fertilizer(2) and yield(3)
of crops.
Linear or Non-linear correlation : The correlation between two
variables is said to be linear if for a unit change in one variable there is
a constant corresponding change in another.
The correlation between two variables is said to be non-linear if the
amount of change in one variable does not bear a constant rate to the
amount of change in the other.
Karl Pearson's Co-efficient of Correlation :
The coefficient of correlation between two random variables X and Y is denoted by r(X, Y) or r_{XY} or ρ_{XY} and is defined by

r(X, Y) = Cov(X, Y) / (σ_x σ_y), σ_x ≠ 0, σ_y ≠ 0

Note:

r(X, Y) = Cov(X, Y) / (σ_x σ_y)
        = [E(XY) − E(X)E(Y)] / [√(E(X²) − [E(X)]²) · √(E(Y²) − [E(Y)]²)]
 

If X and Y are discrete random variables, then

r(X, Y) = [∑XY/n − (∑X/n)(∑Y/n)] / [√(∑X²/n − (∑X/n)²) · √(∑Y²/n − (∑Y/n)²)]
        = [(1/n)∑XY − X̄ Ȳ] / [√((1/n)∑X² − X̄²) · √((1/n)∑Y² − Ȳ²)]

If X and Y are continuous random variables, then

r(X, Y) = [∫ xy f(x, y) dx dy − (∫ x f(x) dx)(∫ y f(y) dy)] / [√(∫ x² f(x) dx − (∫ x f(x) dx)²) · √(∫ y² f(y) dy − (∫ y f(y) dy)²)],

all integrals being taken over (−∞, ∞).

Properties :

1. r lies between −1 and 1, i.e., −1 ≤ r ≤ 1.

2. If X and Y are independent, then r(X, Y) = 0.

3. If r(X, Y) = 0, then X and Y are uncorrelated.

4. The correlation co-efficient is independent of change of origin and scale.

Note:

1. If r(X, Y) = 0, there is no correlation between X and Y.

2. If r(X, Y) > 0, there is positive correlation between X and Y.

3. If r(X, Y) < 0, there is negative correlation between X and Y.

2.7.1 Examples of Covariance and Correlation of 2D Discrete r.v.

Example 2.27. Find the correlation between the random


variables X and Y for the given data.

X 9 8 7 6 5 4 3 2 1
Y 15 16 14 13 11 12 10 8 9

Solution: Here X and Y are discrete r.v.

W.K.T., the correlation coefficient is

r(X, Y) = Cov(X, Y)/(σ_x σ_y)
        = [(1/n)∑XY − X̄ Ȳ] / [√((1/n)∑X² − X̄²) · √((1/n)∑Y² − Ȳ²)]    (1)

where X̄ = ∑X/n and Ȳ = ∑Y/n.

For this, we find ∑X, ∑Y, ∑XY, ∑X², ∑Y² from the corresponding table:

S.No.    X         Y          XY         X²         Y²
1        9         15         135        81         225
2        8         16         128        64         256
3        7         14         98         49         196
4        6         13         78         36         169
5        5         11         55         25         121
6        4         12         48         16         144
7        3         10         30         9          100
8        2         8          16         4          64
9        1         9          9          1          81
Total    ∑X = 45   ∑Y = 108   ∑XY = 597  ∑X² = 285  ∑Y² = 1356

n = 9
X̄ = (1/n)∑X = (1/9)(45) = 5
Ȳ = (1/n)∑Y = (1/9)(108) = 12

Cov(X, Y) = (1/n)∑XY − X̄ Ȳ

          = (1/9)(597) − (5)(12)
          = 6.3333

σ_x = √((1/n)∑X² − X̄²)
    = √((1/9)(285) − 25)
    = √6.6667
    = 2.582

σ_y = √((1/n)∑Y² − Ȳ²)
    = √((1/9)(1356) − 144)
    = √6.6667
    = 2.582

∴ (1) ⇒ r(X, Y) = Cov(X, Y)/(σ_x σ_y)
                = 6.3333/[(2.582)(2.582)]

r(X, Y) = 0.9499
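Since the correlation coefficient is invariant to the 1/n convention, np.corrcoef reproduces the hand computation of Example 2.27 directly. A one-line check (an assumed shortcut, not part of the original solution):

    import numpy as np

    X = np.array([9, 8, 7, 6, 5, 4, 3, 2, 1])
    Y = np.array([15, 16, 14, 13, 11, 12, 10, 8, 9])
    print(np.corrcoef(X, Y)[0, 1])   # about 0.95, matching r = 0.9499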

Example 2.28. Find the correlation co-efficient for the following


heights(inches) for the father and sons.

X 65 66 67 67 68 69 70 72
Y 67 68 65 68 72 72 69 71
Solution:

X         Y         XY           X²           Y²
65        67        4355         4225         4489
66        68        4488         4356         4624
67        65        4355         4489         4225
67        68        4556         4489         4624
68        72        4896         4624         5184
69        72        4968         4761         5184
70        69        4830         4900         4761
72        71        5112         5184         5041
∑X = 544  ∑Y = 552  ∑XY = 37560  ∑X² = 37028  ∑Y² = 38132

X̄ = (1/8)(544) = 68
Ȳ = (1/8)(552) = 69

Cov(X, Y) = (1/n)∑XY − X̄ Ȳ = (1/8)(37560) − 4692 = 3
σ_x = √((1/8)(37028) − 4624) = 2.1213
σ_y = √((1/8)(38132) − 4761) = 2.3452

The correlation coefficient is

r(X, Y) = Cov(X, Y)/(σ_x σ_y) = 3/[(2.1213)(2.3452)] = 0.6030
(or)

X     Y     U = X − 70    V = Y − 72    UV        U²        V²
65    67    −5            −5            25        25        25
66    68    −4            −4            16        16        16
67    65    −3            −7            21        9         49
67    68    −3            −4            12        9         16
68    72    −2            0             0         4         0
69    72    −1            0             0         1         0
70    69    0             −3            0         0         9
72    71    2             −1            −2        4         1
            ∑U = −16      ∑V = −24      ∑UV = 72  ∑U² = 68  ∑V² = 116

Ū = (1/n)∑U = (1/8)(−16) = −2
V̄ = (1/n)∑V = (1/8)(−24) = −3

Cov(U, V) = (1/n)∑UV − Ū V̄
          = (1/8)(72) − (−2)(−3)
∴ Cov(U, V) = 3

σ_u = √((1/n)∑U² − Ū²)
    = √((1/8)(68) − (−2)²)
σ_u = 2.1213

σ_v = √((1/n)∑V² − V̄²)
    = √((1/8)(116) − (−3)²)
σ_v = 2.3452

∴ The correlation coefficient is

r(X, Y) = r(U, V) = Cov(U, V)/(σ_u σ_v) = 3/[(2.1213)(2.3452)]
        = 0.6030

Example 2.29. The joint probability mass function of X and Y is

        Y = −1    Y = 1
X = 0   1/8       3/8
X = 1   2/8       2/8

Find the correlation co-efficient of X and Y.
Solution:

             Y = −1    Y = 1    P(X = x_i)
X = 0        1/8       3/8      1/2
X = 1        2/8       2/8      1/2
P(Y = y_j)   3/8       5/8      1

The marginal pmf of X is

X       0      1
P(x)    1/2    1/2

The marginal pmf of Y is

Y       −1     1
P(y)    3/8    5/8

E[X] = ∑ x P(x)
     = 0(1/2) + 1(1/2)
     = 1/2

E[X²] = ∑ x² P(x)
      = 0²(1/2) + 1²(1/2)
      = 1/2

E[Y] = ∑ y P(y)
     = (−1)(3/8) + 1(5/8)
     = 1/4

E[Y²] = ∑ y² P(y)
      = (−1)²(3/8) + (1)²(5/8)
      = 1

E[XY] = ∑∑ xy p(x, y)
      = 0(−1)(1/8) + 0(1)(3/8) + 1(−1)(2/8) + (1)(1)(2/8)
      = 0

Cov(X, Y) = E[XY] − E[X]E[Y]
          = 0 − (1/2)(1/4)
Cov(X, Y) = −1/8

σ_x² = Var[X] = E[X²] − (E[X])²
     = 1/2 − (1/2)²
σ_x² = 1/4
σ_x = 1/2

σ_y² = Var[Y] = E[Y²] − (E[Y])²
     = 1 − (1/4)²
σ_y² = 15/16
σ_y = √15/4

The correlation co-efficient is

r(X, Y) = Cov(X, Y)/(σ_x σ_y)
        = (−1/8) / [(1/2)(√15/4)]
r(X, Y) = −0.2582

Note: Here X and Y are negatively correlated.


Example 2.30. Two random variables X and Y are related as Y = 4X + 9. Find the correlation coefficient between X and Y.
Solution:
Given the relation between X and Y is

Y = 4X + 9    (1)

W.K.T., the correlation coefficient is

r(X, Y) = Cov(X, Y)/(σ_x σ_y) = [E(XY) − E(X)E(Y)]/(σ_x σ_y)    (2)

For this, we have to find the following:

E(Y) = E(4X + 9) = 4E(X) + 9
E(XY) = E[X(4X + 9)] = E[4X² + 9X] = 4E[X²] + 9E[X]
Cov(X, Y) = E[XY] − E[X] · E[Y]
          = 4E[X²] + 9E[X] − E[X](4E[X] + 9)
          = 4E[X²] + 9E[X] − 4(E[X])² − 9E[X]
          = 4[E[X²] − (E[X])²]
Cov(X, Y) = 4σ_x²

σ_y² = Var[Y] = Var[4X + 9]    (by the property of variance)
     = 4² Var[X] = 16 Var[X]
σ_y² = 16σ_x²
σ_y = 4σ_x

The correlation co-efficient is

r(X, Y) = Cov(X, Y)/(σ_x σ_y) = 4σ_x²/(σ_x · 4σ_x)
r(X, Y) = 1

which is positive and maximum correlation.


Example 2.31. If the independent random variables X and Y have variances 36 and 16 respectively, find the correlation coefficient between X + Y and X − Y.
Solution: Given X and Y are independent, which implies

E(XY) = E(X)E(Y)    (1)

Given Var(X) = 36 and Var(Y) = 16.
Let U = X + Y and V = X − Y.

W.K.T., the correlation coefficient between U and V is

r(U, V) = Cov(U, V)/(σ_U σ_V) = [E(UV) − E(U)E(V)]/(σ_U σ_V)    (2)

Cov(U, V) = E(UV) − E(U)E(V)
          = E[(X + Y)(X − Y)] − E[X + Y] E[X − Y]
          = E[X² − Y²] − [E(X) + E(Y)][E(X) − E(Y)]
          = E(X²) − E(Y²) − [E(X)]² + [E(Y)]²
          = {E(X²) − [E(X)]²} − {E(Y²) − [E(Y)]²}
          = Var(X) − Var(Y)
          = 36 − 16
Cov(U, V) = 20

σ_U² = E(U²) − [E(U)]²
     = E[(X + Y)²] − [E(X + Y)]²
     = E[X² + Y² + 2XY] − [E(X) + E(Y)]²
     = E(X²) + E(Y²) + 2E(XY) − [E(X)]² − [E(Y)]² − 2E(X)E(Y)
     = {E(X²) − [E(X)]²} + {E(Y²) − [E(Y)]²}    (since E(XY) = E(X)E(Y))
     = Var(X) + Var(Y)
     = 36 + 16
σ_U² = 52
σ_U = √52

σ_V² = E(V²) − [E(V)]²
     = E[(X − Y)²] − [E(X − Y)]²
     = E[X² + Y² − 2XY] − [E(X) − E(Y)]²
     = E(X²) + E(Y²) − 2E(XY) − [E(X)]² − [E(Y)]² + 2E(X)E(Y)
     = {E(X²) − [E(X)]²} + {E(Y²) − [E(Y)]²}    (since E(XY) = E(X)E(Y))
     = Var(X) + Var(Y)
     = 36 + 16
σ_V² = 52
σ_V = √52

The correlation co-efficient is

r(U, V) = Cov(U, V)/(σ_U σ_V) = 20/(√52 · √52)
r(X + Y, X − Y) = 0.3846

Example 2.32. Let X, Y and Z be uncorrelated random variables with zero mean and standard deviations 5, 12 and 9 respectively. If U = X + Y and V = Y + Z, find the correlation coefficient between U and V.
Solution: Given X, Y and Z are random variables with mean zero, i.e.,

E(X) = 0
E(Y) = 0
E(Z) = 0

Given the standard deviations of X, Y and Z are 5, 12 and 9, i.e.,

σ_X = 5, σ_Y = 12, σ_Z = 9    (1)

Given X and Y are uncorrelated, i.e.,

r(X, Y) = 0
Cov(X, Y) = 0
E(XY) − E(X)E(Y) = 0
E(XY) − (0)(0) = 0
∴ E(XY) = 0

Similarly,

E(YZ) = 0
E(ZX) = 0

Now,

(1) ⇒ σ_X² = E(X²) − [E(X)]² = 25
      E(X²) − 0² = 25
      ∴ E(X²) = 25
Similarly, E(Y²) = 144 and E(Z²) = 81.

Now,

Cov(U, V) = E(UV) − E(U)E(V)
          = E[(X + Y)(Y + Z)] − E(X + Y)E(Y + Z)
          = E[XY + XZ + Y² + YZ] − [E(X) + E(Y)][E(Y) + E(Z)]
          = E(XY) + E(XZ) + E(Y²) + E(YZ) − E(X)E(Y) − E(X)E(Z) − [E(Y)]² − E(Y)E(Z)
          = 0 + 0 + 144 + 0 − 0 − 0 − 0 − 0
Cov(U, V) = 144

σ_U² = E(U²) − [E(U)]²
     = E[(X + Y)²] − [E(X + Y)]²
     = E[X² + Y² + 2XY] − [E(X) + E(Y)]²
     = E(X²) + E(Y²) + 2E(XY) − [E(X)]² − [E(Y)]² − 2E(X)E(Y)
     = 25 + 144 + 2(0) − 0 − 0 − 0
∴ σ_U² = 169
σ_U = 13

σ_V² = E(V²) − [E(V)]²
     = E[(Y + Z)²] − [E(Y + Z)]²
     = E[Y² + Z² + 2YZ] − [E(Y) + E(Z)]²
     = E(Y²) + E(Z²) + 2E(YZ) − [E(Y)]² − [E(Z)]² − 2E(Y)E(Z)
     = 144 + 81 + 2(0) − 0 − 0 − 0
∴ σ_V² = 225
σ_V = 15

∴ The correlation co-efficient is

r(U, V) = Cov(U, V)/(σ_U σ_V) = 144/[(13)(15)]
r(U, V) = 0.7385
Example 2.33. Let (X, Y) be a two dimensional random variable.
Define covariance of (X, Y). If X and Y are independent, what will
be the covariance of (X, Y).
Solution: Cov(X, Y) = E[[X − E(X)][Y − E(Y)]]
= E(XY) − E(X)E(Y)
If X and Y are independent, then Cov(X, Y) = 0.
Example 2.34. If Y = −2X + 3, find Cov(X, Y).
Solution: Given Y = −2X + 3.
W.K.T., Cov(X, Y) = E(XY) − E(X)E(Y)
                  = E[X(−2X + 3)] − E(X)E(−2X + 3)
                  = E(−2X² + 3X) − E(X)[−2E(X) + 3]
                  = −2E(X²) + 3E(X) + 2[E(X)]² − 3E(X)
                  = −2E(X²) + 2[E(X)]²
                  = −2{E(X²) − [E(X)]²}
                  = −2 Var(X)

Example 2.35. Prove that the correlation coefficient ρ_{xy} takes values in the range −1 to 1.
Solution: Let X and Y be two random variables with variances σ_x², σ_y² respectively.
We know that the variance of any random variable is ≥ 0.

∴ Var(X/σ_X + Y/σ_Y) ≥ 0
Var(X/σ_X) + Var(Y/σ_Y) + 2 Cov(X/σ_X, Y/σ_Y) ≥ 0    [∵ by the variance property]
Var(X)/σ_X² + Var(Y)/σ_Y² + [2/(σ_X σ_Y)] Cov(X, Y) ≥ 0    {∵ Var(aX) = a² Var(X), Cov(aX, bY) = ab Cov(X, Y)}
1 + 1 + [2/(σ_X σ_Y)] Cov(X, Y) ≥ 0
2 + 2ρ_{XY} ≥ 0
1 + ρ_{XY} ≥ 0
ρ_{XY} ≥ −1

Similarly, Var(X/σ_X − Y/σ_Y) ≥ 0
2 − 2ρ_{XY} ≥ 0
1 − ρ_{XY} ≥ 0
ρ_{XY} ≤ 1

Thus ρ_{XY} ≥ −1 and ρ_{XY} ≤ 1.
Hence −1 ≤ ρ_{XY} ≤ 1,
i.e., the correlation coefficient always lies between −1 and +1.
Example 2.36. If X and Y are random variables such that Y = aX + b, where a and b are real constants, show that the correlation coefficient r(X, Y) between them has magnitude one.
Solution: Given Y = aX + b.

r(X, Y) = Cov(X, Y)/(σ_x σ_y)
        = [E(XY) − E(X)E(Y)]/(σ_x σ_y)
        = [E(X(aX + b)) − E(X)E(aX + b)]/(σ_x σ_y)
        = [E(aX² + bX) − E(X)(aE(X) + b)]/(σ_x σ_y)
        = [aE(X²) + bE(X) − a(E(X))² − bE(X)]/(σ_X σ_Y)
        = a{E(X²) − [E(X)]²}/(σ_X σ_Y)
        = aσ_X²/(σ_X σ_Y)
        = aσ_X/σ_Y    (1)

σ_Y² = E(Y²) − [E(Y)]²
     = E[(aX + b)²] − [E(aX + b)]²
     = E[a²X² + 2abX + b²] − {[aE(X)]² + b² + 2abE(X)}
     = a²E(X²) + 2abE(X) + b² − a²[E(X)]² − b² − 2abE(X)
     = a²{E(X²) − [E(X)]²}
     = a²σ_x²
∴ σ_Y = ±aσ_X

∴ (1) ⇒ r(X, Y) = aσ_X/(±aσ_X) = ±1
⇒ |r(X, Y)| = 1

Hence the correlation coefficient has magnitude one.

2.7.2 Examples of Covariance and Correlation of 2D Continuous r.v.

Example 2.37. If the jpdf of X and Y is f(x, y) = (1/8)(6 − x − y), 0 ≤ x ≤ 2, 2 ≤ y ≤ 4, find the correlation between X and Y.
Solution: Here X and Y are continuous random variables with jpdf
f(x, y) = (1/8)(6 − x − y), 0 ≤ x ≤ 2, 2 ≤ y ≤ 4.
W.K.T., the correlation coefficient between X and Y is

r(X, Y) = Cov(X, Y)/(σ_x σ_y) = [E(XY) − E(X)E(Y)] / [√(E(X²) − [E(X)]²) · √(E(Y²) − [E(Y)]²)]
 

Marginal pdf of X:
f(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫_2^4 (1/8)(6 − x − y) dy
     = (1/8)[6y − xy − y²/2]_2^4
     = (1/8)[(24 − 4x − 8) − (12 − 2x − 2)]
     = (1/8)(−2x + 6)
∴ f(x) = (1/4)(3 − x), 0 ≤ x ≤ 2

Marginal pdf of Y:
f(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫_0^2 (1/8)(6 − x − y) dx
     = (1/8)[6x − x²/2 − xy]_0^2
     = (1/8)(12 − 2 − 2y)
∴ f(y) = (1/4)(5 − y), 2 ≤ y ≤ 4

Expectation of X:
E[X] = ∫_{−∞}^{∞} x f(x) dx = ∫_0^2 (x/4)(3 − x) dx
     = (1/4) ∫_0^2 (3x − x²) dx
     = (1/4)[3x²/2 − x³/3]_0^2
     = (1/4)(6 − 8/3)
∴ E[X] = 5/6

Expectation of Y:
E[Y] = ∫_{−∞}^{∞} y f(y) dy = ∫_2^4 (y/4)(5 − y) dy
     = (1/4) ∫_2^4 (5y − y²) dy
     = (1/4)[5y²/2 − y³/3]_2^4
     = (1/4)[(40 − 64/3) − (10 − 8/3)]
E[Y] = 17/6

E[XY] = ∫_2^4 ∫_0^2 xy f(x, y) dx dy
      = ∫_2^4 ∫_0^2 (xy/8)(6 − x − y) dx dy
      = (1/8) ∫_2^4 ∫_0^2 (6xy − x²y − xy²) dx dy
      = (1/8) ∫_2^4 [6y(x²/2) − y(x³/3) − y²(x²/2)]_0^2 dy
      = (1/8) ∫_2^4 [12y − (8/3)y − 2y²] dy
      = (1/8) [6y² − (4/3)y² − (2/3)y³]_2^4
      = (1/8) [72 − 16 − 112/3]
E[XY] = 7/3

∴ Cov(X, Y) = E[XY] − E[X] · E[Y]
            = 7/3 − (5/6)(17/6)
            = −1/36

E[X²] = ∫_{−∞}^{∞} x² f(x) dx
      = (1/4) ∫_0^2 (3x² − x³) dx
      = (1/4)[x³ − x⁴/4]_0^2
      = (1/4)(8 − 4)
E[X²] = 1

E[Y²] = ∫_{−∞}^{∞} y² f(y) dy
      = (1/4) ∫_2^4 (5y² − y³) dy
      = (1/4)[5y³/3 − y⁴/4]_2^4
      = (1/4)[(320/3 − 64) − (40/3 − 4)]
E[Y²] = 25/3

σ_X² = Var[X] = E[X²] − (E[X])² = 1 − (5/6)² = 11/36
∴ σ_X = √11/6

σ_Y² = Var[Y] = E[Y²] − (E[Y])² = 25/3 − (17/6)² = 11/36
σ_Y = √11/6

r(X, Y) = Cov(X, Y)/(σ_X σ_Y)
        = (−1/36) / [√(11/36) · √(11/36)]
        = −1/11
r(X, Y) = −0.09091

Example 2.38. The two random variables X and Y have the joint pdf f(x, y) = 2 − x − y, 0 ≤ x ≤ 1 and 0 ≤ y ≤ 1. Find the correlation co-efficient of X and Y.
Solution:

f(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫_0^1 (2 − x − y) dy
     = [2y − xy − y²/2]_0^1
     = 2 − x − 1/2
f(x) = 3/2 − x, 0 ≤ x ≤ 1

f(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫_0^1 (2 − x − y) dx
     = [2x − x²/2 − yx]_0^1
     = 2 − 1/2 − y
f(y) = 3/2 − y, 0 ≤ y ≤ 1

E[X] = ∫_{−∞}^{∞} x f(x) dx = ∫_0^1 x(3/2 − x) dx
     = [3x²/4 − x³/3]_0^1 = 3/4 − 1/3 = (9 − 4)/12
E[X] = 5/12

E[Y] = ∫_{−∞}^{∞} y f(y) dy = ∫_0^1 y(3/2 − y) dy
     = [3y²/4 − y³/3]_0^1 = 3/4 − 1/3 = (9 − 4)/12
E[Y] = 5/12

E[XY] = ∫_0^1 ∫_0^1 xy f(x, y) dx dy = ∫_0^1 ∫_0^1 xy(2 − x − y) dx dy
      = ∫_0^1 ∫_0^1 (2xy − x²y − xy²) dx dy
      = ∫_0^1 [x²y − (x³/3)y − (x²/2)y²]_0^1 dy
      = ∫_0^1 [y − y/3 − y²/2] dy
      = ∫_0^1 [(2/3)y − y²/2] dy
      = [y²/3 − y³/6]_0^1
      = 1/3 − 1/6
∴ E[XY] = 1/6

Cov[X, Y] = E[XY] − E[X] · E[Y]
          = 1/6 − (5/12)(5/12)
          = 1/6 − 25/144
Cov[X, Y] = −1/144

E[X²] = ∫_{−∞}^{∞} x² f(x) dx = ∫_0^1 x²(3/2 − x) dx
      = ∫_0^1 [(3/2)x² − x³] dx
      = [x³/2 − x⁴/4]_0^1 = 1/2 − 1/4
      = 1/4

E[Y²] = ∫_{−∞}^{∞} y² f(y) dy = ∫_0^1 y²(3/2 − y) dy
      = ∫_0^1 [(3/2)y² − y³] dy
      = [y³/2 − y⁴/4]_0^1 = 1/2 − 1/4
      = 1/4

σ_X² = Var[X] = E[X²] − (E[X])² = 1/4 − (5/12)² = 1/4 − 25/144 = (36 − 25)/144 = 11/144
σ_X = √11/12

σ_Y² = Var[Y] = E[Y²] − (E[Y])² = 1/4 − (5/12)² = 11/144
σ_Y = √11/12

The correlation co-efficient is

r(X, Y) = Cov(X, Y)/(σ_x σ_y)
        = (−1/144) / [(√11/12)(√11/12)]
r(X, Y) = −0.0909

Rank Correlation

If X and Y are ranked random variables, then

r(X, Y) = 1 − [6 ∑ d_i²] / [n(n² − 1)], where d_i = x_i − y_i,

is known as Spearman's rank correlation co-efficient.
Note:
For repeated ranks, r(X, Y) = 1 − [6(∑ d_i² + C.F.)] / [n(n² − 1)],
where C.F. is the correction factor = ∑_i m_i(m_i² − 1)/12, m_i being the number of times an item is repeated in the X or Y series.
Example 2.39. Find the rank correlation co-efficient from the
following data.
Rank of X 1 2 3 4 5 6 7
Rank of Y 4 3 1 2 6 5 7
Solution: Here the ranks are not repeated in X and Y.

X    Y    d_i = x_i − y_i    d_i²
1    4    −3                 9
2    3    −1                 1
3    1    2                  4
4    2    2                  4
5    6    −1                 1
6    5    1                  1
7    7    0                  0
                             ∑ d_i² = 20

Here n = 7.
∴ The rank correlation co-efficient for non-repeated ranks is

r(X, Y) = 1 − [6 ∑ d_i²]/[n(n² − 1)]
        = 1 − 6(20)/[7(7² − 1)]
        = 1 − 120/[7(48)]
r(X, Y) = 0.6429

Example 2.40. A sample of 12 fathers and their eldest sons gave the following data about their heights in inches.
Father's 65 63 67 64 68 62 70 66 68 67 69 71
Son's 68 66 68 65 69 66 68 65 71 67 68 70

Calculate the rank correlation coefficient.

Solution: Here ranks are repeated in X and Y.

X     Y     Rank of X    Rank of Y    d_i = Rank of X − Rank of Y    d_i²
65    68    9            5.5          3.5                            12.25
63    66    11           9.5          1.5                            2.25
67    68    6.5          5.5          1                              1
64    65    10           11.5         −1.5                           2.25
68    69    4.5          3            1.5                            2.25
62    66    12           9.5          2.5                            6.25
70    68    2            5.5          −3.5                           12.25
66    65    8            11.5         −3.5                           12.25
68    71    4.5          1            3.5                            12.25
67    67    6.5          8            −1.5                           2.25
69    68    3            5.5          −2.5                           6.25
71    70    1            2            −1                             1
                                                                     ∑ d_i² = 72.5

In the X series, 68 is repeated 2 times and 67 is repeated 2 times.
Let m_1 = 2, m_2 = 2.
In the Y series, 68 is repeated 4 times, 66 is repeated 2 times, and 65 is repeated 2 times.
Let m_3 = 4, m_4 = 2, m_5 = 2.

C.F. = ∑_{i=1}^{5} m_i(m_i² − 1)/12
     = 2(2² − 1)/12 + 2(2² − 1)/12 + 4(4² − 1)/12 + 2(2² − 1)/12 + 2(2² − 1)/12
     = 0.5 + 0.5 + 5 + 0.5 + 0.5
∴ C.F. = 7

Here n = 12.
∴ The rank correlation co-efficient for repeated ranks is

r(X, Y) = 1 − [6(∑ d_i² + C.F.)]/[n(n² − 1)]
        = 1 − 6(72.5 + 7)/[12(12² − 1)]
        = 1 − 477/1716
∴ r(X, Y) = 0.7221
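Both rank-correlation examples can be cross-checked with scipy.stats.spearmanr (an assumed library shortcut, not used in the text), which handles tied observations by average ranks; with ties its value agrees closely with the correction-factor formula:

    from scipy import stats

    # Example 2.39: no ties, exact agreement
    print(stats.spearmanr([1, 2, 3, 4, 5, 6, 7],
                          [4, 3, 1, 2, 6, 5, 7]).correlation)   # 0.6429

    # Example 2.40: tied heights, value approximately 0.7221
    fathers = [65, 63, 67, 64, 68, 62, 70, 66, 68, 67, 69, 71]
    sons    = [68, 66, 68, 65, 69, 66, 68, 65, 71, 67, 68, 70]
    print(stats.spearmanr(fathers, sons).correlation)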

2.8 Regression

Regression Lines :
The regression line of Y on X is

y − ȳ = b_{yx}(x − x̄)

where b_{yx} = the regression coefficient of y on x
             = Cov(X, Y)/Var(X) = Cov(X, Y)/σ_x²
             = r σ_y/σ_x
             = [n∑xy − (∑x)(∑y)] / [n∑x² − (∑x)²]    · · · direct method

The regression line of X on Y is

x − x̄ = b_{xy}(y − ȳ)

where b_{xy} = the regression coefficient of x on y
             = Cov(X, Y)/Var(Y) = Cov(X, Y)/σ_y²
             = r σ_x/σ_y
             = [n∑xy − (∑x)(∑y)] / [n∑y² − (∑y)²]    · · · direct method

Properties of Regression Lines

(i) The regression lines intersect at the point (x̄, ȳ).
(ii) Angle between the regression lines :

Acute angle, θ = tan⁻¹ { [(1 − r²)/|r|] · [σ_x σ_y/(σ_x² + σ_y²)] }
Obtuse angle, θ′ = tan⁻¹ { [(r² − 1)/|r|] · [σ_x σ_y/(σ_x² + σ_y²)] }

If r = 0 ⇒ the regression lines are perpendicular.
If r = ±1 ⇒ the regression lines are parallel (or coincide).
Properties of Regression Coefficients
(i) b_{xy} ≠ b_{yx} in general.
(ii) b_{xy} and b_{yx} have the same sign, and if one of them is greater than 1 (numerically), then the other is less than one (numerically).
(iii) r² = b_{xy} b_{yx}, i.e., r = ±√(b_{xy} b_{yx}), and r has the same sign as b_{xy} and b_{yx}.
(iv) Regression coefficients are independent of change of origin but not of scale: if u = (x − a)/h, v = (y − b)/k, then

b_{xy} = (h/k) b_{uv} and b_{yx} = (k/h) b_{vu}
k h

2.8.1 Regression curves of the means

1. The regression curve of X on Y is

X = E[X / Y = y]

2. The regression curve of Y on X is

Y = E[Y / X = x]

2.8.2 Examples of Regression curve lines of discrete r.v.s

Example 2.41. The following table gives the age X (in years) of cars and the annual maintenance cost Y (in 100 Rs.):

X 1 3 5 7 9
Y 15 18 21 23 22

Find
(a) the regression equations
(b) the correlation coefficient
(c) an estimate of the maintenance cost for a 4 year old car.
Solution :
(a) W.K.T. the regression equations are

Regression equation of Y on X: (y − ȳ) = b_{yx}(x − x̄)    (1)
Regression equation of X on Y: (x − x̄) = b_{xy}(y − ȳ)    (2)

where

b_{yx} = Cov(X, Y)/Var(X) = r σ_y/σ_x = [n∑xy − (∑x)(∑y)]/[n∑x² − (∑x)²]
b_{xy} = Cov(X, Y)/Var(Y) = r σ_x/σ_y = [n∑xy − (∑x)(∑y)]/[n∑y² − (∑y)²]
Now, find x̄, ȳ, ∑x, ∑y, ∑xy, ∑x², ∑y² from the table:

#    X         Y         XY          X²          Y²
1    1         15        15          1           225
2    3         18        54          9           324
3    5         21        105         25          441
4    7         23        161         49          529
5    9         22        198         81          484
     ∑X = 25   ∑Y = 99   ∑XY = 533   ∑X² = 165   ∑Y² = 2003

n = 5
x̄ = ∑x/n = 25/5 = 5
ȳ = ∑y/n = 99/5 = 19.8
b_{yx} = [5(533) − (25)(99)]/[5(165) − (25)²] = 0.95
b_{xy} = [5(533) − (25)(99)]/[5(2003) − (99)²] = 0.8878

∴ The regression equation of Y on X, from (1), is
(y − 19.8) = 0.95(x − 5) ⇒ y = 0.95x + 15.05    (3)
∴ The regression equation of X on Y, from (2), is
(x − 5) = 0.8878(y − 19.8) ⇒ x = 0.8878y − 12.5784    (4)

(b) Correlation coefficient r :

r = ±√(b_{yx} · b_{xy}) = ±√(0.8878 × 0.95)
∴ r = 0.9183 (taking the positive sign, since both regression coefficients are positive)

(c) Cost (Y) for a 4 year old (X) car :
i.e., we use the regression equation of Y on X:
(3) ⇒ y = 0.95x + 15.05 = 0.95(4) + 15.05
∴ y = 18.85
i.e., the cost for a 4 year old car is Rs. 1885.
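The regression coefficients of Example 2.41 are ordinary least-squares slopes, so np.polyfit reproduces them (an assumed alternative to the hand formulas, not part of the original solution):

    import numpy as np

    X = np.array([1, 3, 5, 7, 9])
    Y = np.array([15, 18, 21, 23, 22])

    b_yx, a = np.polyfit(X, Y, 1)           # regression of Y on X
    b_xy, c = np.polyfit(Y, X, 1)           # regression of X on Y
    r = np.sign(b_yx) * np.sqrt(b_yx * b_xy)
    print(b_yx, a)                          # 0.95, 15.05
    print(r)                                # 0.9183
    print(b_yx * 4 + a)                     # 18.85 -> Rs. 1885 for a 4-year-old car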

Example 2.42. The two lines of regression are

8x − 10y + 66 = 0
40x − 18y − 214 = 0

Find
(a) mean values of X and Y (b) r (X, Y)
Solution : Given

8x − 10y + 66 = 0 (1)
40x − 18y − 214 = 0 (2)

(a) Since both the lines of regression pass through the point (x, y), where
(x) and (y) are mean values of x and y, which satisfies the two given
lines of regression.
Find the mean value point x, y by solving equations (1) and (2) : i.e.,

8x − 10y = −66 (3)


40x − 18y = 214 (4)

On solving,

x = 13
y = 17

(b) r(x, y) :

Case 1 : Take (1) as the line of X on Y and (2) as the line of Y on X.

   (1) ⇒ 8x = 10y − 66 ⇒ x = (10/8)y − 66/8        ∴ b_xy = 10/8
   (2) ⇒ 18y = 40x − 214 ⇒ y = (40/18)x − 214/18   ∴ b_yx = 40/18

   r = ±√(b_xy · b_yx) = ±√((10/8)(40/18)) = ±1.67

W.K.T. −1 ≤ r ≤ 1, so this assignment is impossible.

Case 2 : Take (1) as the line of Y on X and (2) as the line of X on Y.

   (1) ⇒ 10y = 8x + 66 ⇒ y = (8/10)x + 66/10       ∴ b_yx = 8/10
   (2) ⇒ 40x = 18y + 214 ⇒ x = (18/40)y + 214/40   ∴ b_xy = 18/40

   r = ±√(b_xy · b_yx) = ±√((18/40)(8/10)) = ±0.6

∴ r = 0.6 (positive, since both regression coefficients are positive).
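Note : The same bookkeeping in a brief Python sketch (numpy assumed; illustrative only):

import numpy as np

# means : intersection of 8x − 10y = −66 and 40x − 18y = 214
A = np.array([[8.0, -10.0], [40.0, -18.0]])
b = np.array([-66.0, 214.0])
print(np.linalg.solve(A, b))        # [13. 17.]

# only one assignment of the two lines gives |r| <= 1
b_yx, b_xy = 8 / 10, 18 / 40        # line (1) as Y on X, line (2) as X on Y
print(np.sqrt(b_yx * b_xy))         # 0.6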

Example 2.43. The two regression lines are 4x − 5y + 33 = 0 and
20x − 9y = 107. Find the means of x and y. Also find the value of r.
Solution : Given

   4x − 5y + 33 = 0 (A)
   20x − 9y = 107 (B)

Since both the lines of regression pass through the mean values
x̄ and ȳ, the point (x̄, ȳ) must satisfy the two given regression lines :

   4x̄ − 5ȳ = −33 (1)
   20x̄ − 9ȳ = 107 (2)

By solving (1) and (2),

   x̄ = 13, ȳ = 17

Choose equation (A) as the equation of the line of regression of y on x :

   (A) ⇒ 5y = 4x + 33 ⇒ y = (4/5)x + 33/5 ⇒ b_yx = 4/5

Choose equation (B) as the equation of the line of regression of x on y :

   (B) ⇒ 20x = 9y + 107 ⇒ x = (9/20)y + 107/20 ⇒ b_xy = 9/20

We know that

   r² = b_xy b_yx = (9/20)(4/5) = 9/25

   ∴ r = 3/5 = 0.6 (positive, since both regression coefficients are positive)
Example 2.44. If y = 2x − 3 and y = 5x + 7 are the two regression
lines, find the mean values of x and y. Find the correlation
coefficient between x and y. Find an estimate of x when y = 1.
Solution : {Mean values x̄ = −10/3, ȳ = −29/3; r = √(2/5) ≈ 0.632; for y = 1, x = −6/5}

2.8.3 Examples of Regression curve lines of continuous r.v.s

Example 2.45. The joint pdf of X and Y is given by


x + y
, 0 ≤ x ≤ 1, 0 ≤ x ≤ 2



f (x, y) = 

 3 . Find
0, otherwise

1. The correlation coefficient.

2. The two lines of regression.

3. The two regression curves for means.

Solution :
(i) Marginal densities, means and moments :

   f(x) = ∫₋∞^∞ f(x, y) dy = (1/3) ∫₀² (x + y) dy = (1/3)[xy + y²/2]₀² = (1/3)(2x + 2)

   ∴ f(x) = (2/3)(x + 1), 0 ≤ x ≤ 1

   f(y) = ∫₋∞^∞ f(x, y) dx = (1/3) ∫₀¹ (x + y) dx = (1/3)[x²/2 + xy]₀¹ = (1/3)(1/2 + y)

   ∴ f(y) = (1 + 2y)/6, 0 ≤ y ≤ 2

   E[X] = ∫₀¹ x f(x) dx = (2/3) ∫₀¹ (x² + x) dx = (2/3)[x³/3 + x²/2]₀¹ = (2/3)(5/6) = 5/9

   E[Y] = ∫₀² y f(y) dy = (1/6) ∫₀² (y + 2y²) dy = (1/6)[y²/2 + 2y³/3]₀² = (1/6)(2 + 16/3) = 11/9

   E[XY] = ∫₀¹ ∫₀² xy f(x, y) dy dx = (1/3) ∫₀¹ ∫₀² (x²y + xy²) dy dx

         = (1/3) ∫₀¹ [x²y²/2 + xy³/3]₀² dx = (1/3) ∫₀¹ (2x² + 8x/3) dx

         = (1/3)[2x³/3 + 4x²/3]₀¹ = (1/3)(2/3 + 4/3)

   ∴ E[XY] = 2/3

   Cov[X, Y] = E[XY] − E[X] · E[Y] = 2/3 − (5/9)(11/9) = 2/3 − 55/81

   ∴ Cov[X, Y] = −1/81

   E[X²] = ∫₀¹ x² f(x) dx = (2/3) ∫₀¹ (x³ + x²) dx = (2/3)[x⁴/4 + x³/3]₀¹ = (2/3)(7/12) = 7/18

   E[Y²] = ∫₀² y² f(y) dy = (1/6) ∫₀² (y² + 2y³) dy = (1/6)[y³/3 + y⁴/2]₀² = (1/6)(8/3 + 8) = 16/9

   σX² = Var[X] = E[X²] − (E[X])² = 7/18 − (5/9)² = 7/18 − 25/81 = 13/162 ⇒ σX = √(13/162)

   σY² = Var[Y] = E[Y²] − (E[Y])² = 16/9 − (11/9)² = 16/9 − 121/81 = 23/81 ⇒ σY = √(23/81)

Correlation coefficient :

   r(X, Y) = Cov(X, Y)/(σX σY) = (−1/81) / (√(13/162) · √(23/81)) = −√(2/299) ≈ −0.0818

(ii) The lines of regression :

The line of regression of X on Y is X − E(X) = (rσX/σY)[Y − E(Y)], where
rσX/σY = Cov(X, Y)/σY² = (−1/81)/(23/81) = −1/23 :

   X − 5/9 = −(1/23)(Y − 11/9) ⇒ X = −(1/23)Y + 14/23

The line of regression of Y on X is Y − E(Y) = (rσY/σX)[X − E(X)], where
rσY/σX = Cov(X, Y)/σX² = (−1/81)/(13/162) = −2/13 :

   Y − 11/9 = −(2/13)(X − 5/9) ⇒ Y = −(2/13)X + 17/13

(iii) The two regression curves for means


W.K.T.

The conditional density of X given Y = y is

   f(x/y) = f(x, y)/f(y) = [(x + y)/3] / [(1 + 2y)/6] = 2(x + y)/(1 + 2y)

The conditional density of Y given X = x is

   f(y/x) = f(x, y)/f(x) = [(x + y)/3] / [2(x + 1)/3] = (x + y)/(2(1 + x))

The regression curve of X on Y is

   x = E[X / Y = y] = ∫₀¹ x f(x/y) dx = (2/(1 + 2y)) ∫₀¹ (x² + xy) dx

     = (2/(1 + 2y)) [x³/3 + x²y/2]₀¹ = (2/(1 + 2y)) (1/3 + y/2)

   ∴ x = (2 + 3y)/(3(1 + 2y))

The regression curve of Y on X is

   y = E[Y / X = x] = ∫₀² y f(y/x) dy = (1/(2(1 + x))) ∫₀² (xy + y²) dy

     = (1/(2(1 + x))) [xy²/2 + y³/3]₀² = (1/(2(1 + x))) (2x + 8/3)

   ∴ y = (3x + 4)/(3(1 + x))
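Note : As a sanity check (not part of the standard worked solution), all the moments above can be verified by numerical integration; the sketch below assumes scipy is available:

import numpy as np
from scipy import integrate

f = lambda y, x: (x + y) / 3.0        # joint pdf on 0 ≤ x ≤ 1, 0 ≤ y ≤ 2

# dblquad integrates x over [0, 1] and y over [0, 2]
EX  = integrate.dblquad(lambda y, x: x * f(y, x), 0, 1, 0, 2)[0]
EY  = integrate.dblquad(lambda y, x: y * f(y, x), 0, 1, 0, 2)[0]
EXY = integrate.dblquad(lambda y, x: x * y * f(y, x), 0, 1, 0, 2)[0]
EX2 = integrate.dblquad(lambda y, x: x * x * f(y, x), 0, 1, 0, 2)[0]
EY2 = integrate.dblquad(lambda y, x: y * y * f(y, x), 0, 1, 0, 2)[0]

cov = EXY - EX * EY
r = cov / np.sqrt((EX2 - EX**2) * (EY2 - EY**2))
print(EX, EY, cov, r)   # 5/9, 11/9, −1/81, −√(2/299) ≈ −0.0818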

2.9 Transformation of random variables:

[Figure : a region in the xy-plane is carried to a region in the uv-plane by the transformation u = f₁(x, y), v = f₂(x, y).]

Notation :

   f(x, y) = fXY(x, y) = joint pdf of X & Y        g(u, v) = gUV(u, v) = joint pdf of U & V

   f(x) = fX(x) = marginal pdf of X = ∫₋∞^∞ f(x, y) dy
   g(u) = gU(u) = marginal pdf of U = ∫₋∞^∞ g(u, v) dv

   f(y) = fY(y) = marginal pdf of Y = ∫₋∞^∞ f(x, y) dx
   g(v) = gV(v) = marginal pdf of V = ∫₋∞^∞ g(u, v) du

Working scheme :

Let U = f₁(x, y) and V = f₂(x, y).
Find x and y in terms of u and v.
Define

   g(u, v) = f(x, y) |J|,     where J = ∂(x, y)/∂(u, v) = | ∂x/∂u  ∂x/∂v |
                                                          | ∂y/∂u  ∂y/∂v |
   (or)

   g(u, v) = f(x, y)/|J′|,    where J′ = ∂(u, v)/∂(x, y) = | ∂u/∂x  ∂u/∂y |
                                                           | ∂v/∂x  ∂v/∂y |

Find the range space of u and v.

   P.D.F. of U : g(u) = ∫₋∞^∞ g(u, v) dv
   P.D.F. of V : g(v) = ∫₋∞^∞ g(u, v) du

(a) Joint probability density function of transformed r.v.s :

Consider a two dimensional random variable (X, Y) having joint
probability density function fXY(x, y). Now let (U, V) be another two
dimensional random variable where the r.v.s U and V are defined as
U = f₁(x, y) and V = f₂(x, y). Also let the inverse be given by
x = g₁(u, v), y = g₂(u, v), which are continuous and possess continuous
partial derivatives. Then the joint p.d.f. of the transformed r.v.s
(U, V) is given by fUV(u, v) = fXY(x, y) |J|, where |J| is the modulus of
the Jacobian J of (x, y) with respect to (u, v) :

   J = ∂(x, y)/∂(u, v) = | ∂x/∂u  ∂x/∂v | = | xᵤ  xᵥ |
                         | ∂y/∂u  ∂y/∂v |   | yᵤ  yᵥ |

Note : On obtaining the joint p.d.f. fUV(u, v) we can find the marginal
p.d.f.s of U and V as

   fU(u) = f(u) = ∫₋∞^∞ fUV(u, v) dv
   fV(v) = f(v) = ∫₋∞^∞ fUV(u, v) du

(b) Theorem : If X and Y are independent continuous r.v.s, then the p.d.f.
of U = X + Y is given by the convolution

   f(u) = ∫₋∞^∞ fX(v) fY(u − v) dv
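Note : The convolution can also be evaluated numerically. A minimal sketch (numpy assumed) for the sum of two independent Uniform(0, 1) variates, whose sum is known to have a triangular density on (0, 2):

import numpy as np

dx = 0.001
v = np.arange(0.0, 1.0, dx)      # support grid of X and Y
fX = np.ones_like(v)             # Uniform(0, 1) density
fY = np.ones_like(v)

fU = np.convolve(fX, fY) * dx    # f(u) = ∫ fX(v) fY(u − v) dv on [0, 2)
u = np.arange(len(fU)) * dx
print(np.interp(1.0, u, fU))     # ≈ 1.0 : the triangular peak at u = 1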

(c) Working rule for finding fUV (u, v) given the j.p.d.f. fXY (x, y), in the
following steps :
(i) Find out the joint p.d.f. of (X, Y) if not already given.
(ii) Consider the new random variables, u = f1 (x, y) , v = f2 (x, y) and
rearrange them to express x and y in terms of u and v to get x =
g1 (u, v) , y = g2 (u, v).
(iii) Find the Jacobian

      J = ∂(x, y)/∂(u, v) = | ∂x/∂u  ∂x/∂v | = | xᵤ  xᵥ |
                            | ∂y/∂u  ∂y/∂v |   | yᵤ  yᵥ |

      and hence the value of |J|.
(iv) Find the j.p.d.f. fUV (u, v) using the formula: fUV (u, v) = fXY (x, y) |J|
and express it in terms of u and v.

(v) Obtain the range space of U and V by considering the range
    spaces of X and Y and the relations U = f₁(x, y) and V = f₂(x, y).
(vi) Write down the expression for fUV (u, v).
(vii) If required, find out the expressions for the marginal p.d.f.s of U
      and V as

      fU(u) = f(u) = ∫₋∞^∞ fUV(u, v) dv
      fV(v) = f(v) = ∫₋∞^∞ fUV(u, v) du

2.9.1 Examples of Transformation of random variables

Example 2.46. If X and Y are continuous random variables and
U = X + Y, find the p.d.f. of U.
Solution : Given U = X + Y. (1)

Method 1 : Define a new variable V = X. (2)
(2) ⇒ x = v, and (1) ⇒ u = x + y = v + y ⇒ ∴ y = u − v.

   g(u, v) = f(x, y) |J| (3)

   where J = | ∂x/∂u  ∂x/∂v | = | 0  1 | = −1 ⇒ |J| = |−1| = 1
             | ∂y/∂u  ∂y/∂v |   | 1 −1 |

   ∴ (3) ⇒ g(u, v) = f(v, u − v) · 1, and g(u) = ∫₋∞^∞ f(v, u − v) dv

Method 2 : Define a new variable V = Y. (4)
(4) ⇒ y = v, and (1) ⇒ u = x + y = x + v ⇒ ∴ x = u − v.

   g(u, v) = f(x, y) |J| (5)

   where J = | 1 −1 | = 1 ⇒ |J| = |1| = 1
             | 0  1 |

   ∴ (5) ⇒ g(u, v) = f(u − v, v) · 1, and g(u) = ∫₋∞^∞ f(u − v, v) dv

Finally, find the range space of u and v from the range space of x and y.
Example 2.47. The joint p.d.f. of X & Y is given by f(x, y) = e^−(x+y),
x > 0, y > 0. Find the p.d.f. of (X + Y)/2.
Solution : Given the joint p.d.f. of X & Y :

   f(x, y) = e^−(x+y), x > 0, y > 0. (1)

Let U = (X + Y)/2. (2)
Define a new variable V = X. (3)
(3) ⇒ v = x ⇒ ∴ x = v (4)
∴ (2) ⇒ u = (x + y)/2 ⇒ 2u = v + y ⇒ ∴ y = 2u − v (5)

Now, the joint pdf of (U, V) :

   g(u, v) = f(x, y) |J| (6)

   where J = | ∂x/∂u  ∂x/∂v | = | 0  1 | = −2 ⇒ |J| = |−2| = 2
             | ∂y/∂u  ∂y/∂v |   | 2 −1 |

   ∴ g(u, v) = e^−(x+y) · 2 = 2e^−(v+2u−v) = 2e^−2u

To find the range space of u and v : x > 0 ⇒ v > 0, and y > 0 ⇒ 2u − v > 0 ⇒ v < 2u.
∴ 0 < v < 2u. [In the uv-plane this is the region between the u-axis and the line v = 2u.]

∴ Pdf of U is

   g(u) = ∫₋∞^∞ g(u, v) dv = ∫₀^(2u) 2e^−2u dv = 2e^−2u · 2u = 4u e^−2u, u > 0
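Note : A quick simulation cross-check of this density (a sketch, numpy assumed; not part of the formal solution):

import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(1.0, 200_000)
y = rng.exponential(1.0, 200_000)
u = (x + y) / 2.0

# compare a histogram of U with the derived density 4u e^(−2u)
hist, edges = np.histogram(u, bins=50, range=(0, 4), density=True)
mids = (edges[:-1] + edges[1:]) / 2
print(np.max(np.abs(hist - 4 * mids * np.exp(-2 * mids))))  # small (order 0.01)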

Example 2.48. If X and Y are independent exponential random
variates with parameter 1, find the p.d.f. of U = X − Y.
Solution : Given X and Y are independent exponential random variates
with parameter 1, i.e.,

   X ~ Exp. Dist. (α = 1) : f(x) = αe^−αx = e^−x, x ≥ 0
   Y ~ Exp. Dist. (α = 1) : f(y) = αe^−αy = e^−y, y ≥ 0

∴ The joint p.d.f. of X & Y is

   f(x, y) = f(x) · f(y) = e^−x · e^−y = e^−(x+y), x ≥ 0, y ≥ 0   (∵ X and Y are independent) (1)

Given U = X − Y. (2)
Define a new variable V = X. (3)
(3) ⇒ v = x ⇒ ∴ x = v (4)
∴ (2) ⇒ u = x − y = v − y ⇒ ∴ y = v − u (5)

Now, the joint pdf of (U, V) :

   g(u, v) = f(x, y) |J| (6)

   where J = | ∂x/∂u  ∂x/∂v | = | 0  1 | = 1 ⇒ |J| = |1| = 1
             | ∂y/∂u  ∂y/∂v |   | −1 1 |

   ∴ g(u, v) = e^−(x+y) · 1 = e^−(v+v−u) = e^u e^−2v

To find the range space of u and v : x ≥ 0 ⇒ v ≥ 0, and y ≥ 0 ⇒ v − u ≥ 0 ⇒ v ≥ u.
So the region splits about the line u = v into

   R1 : u < 0, v ≥ 0   and   R2 : u ≥ 0, v ≥ u.
∴ Pdf of U is g(u) = ∫₋∞^∞ g(u, v) dv (7)

In R1 (u < 0) :

   (7) ⇒ g(u) = ∫₀^∞ e^u e^−2v dv = e^u [e^−2v/(−2)]₀^∞ = e^u [(0) − (−1/2)] = e^u/2, u < 0

In R2 (u ≥ 0) :

   (7) ⇒ g(u) = ∫ᵤ^∞ e^u e^−2v dv = e^u [e^−2v/(−2)]ᵤ^∞ = e^u [(0) − (−e^−2u/2)] = e^−u/2, u ≥ 0

∴ The required pdf of U :

   g(u) = e^u/2,   u < 0
        = e^−u/2,  u ≥ 0
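Note : This is the standard Laplace (double exponential) law, which a simulation sketch can confirm (scipy assumed; illustrative only):

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
u = rng.exponential(1.0, 100_000) - rng.exponential(1.0, 100_000)  # U = X − Y

# the derived pdf e^(−|u|)/2 is the standard Laplace density
print(stats.kstest(u, 'laplace'))   # p-value not small : consistent with Laplace(0, 1)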

Example 2.49. If X and Y are independent uniform random
variates on [0, 1], find the p.d.f. of U = XY.
Solution : Given X and Y are independent uniform random variates
defined on [0, 1], i.e.,

   X ~ Uni. Dist. (a = 0, b = 1) : f(x) = 1/(1 − 0) = 1, 0 < x < 1
   Y ~ Uni. Dist. (a = 0, b = 1) : f(y) = 1/(1 − 0) = 1, 0 < y < 1

∴ The joint p.d.f. of X & Y is

   f(x, y) = f(x) · f(y) = 1, 0 < x < 1, 0 < y < 1   (∵ X and Y are independent) (1)

Given U = XY. (2)
Define a new variable V = Y. (3)
(3) ⇒ v = y ⇒ ∴ y = v (4)
∴ (2) ⇒ u = xy = xv ⇒ ∴ x = u/v (5)

Now, the joint pdf of (U, V) :

   g(u, v) = f(x, y) |J| (6)

   where J = | ∂x/∂u  ∂x/∂v | = | 1/v  −u/v² | = 1/v ⇒ |J| = 1/v
             | ∂y/∂u  ∂y/∂v |   |  0     1   |

   ∴ g(u, v) = 1 · (1/v) = 1/v

To find the range space of u and v : 0 < y < 1 ⇒ 0 < v < 1, and
0 < x < 1 ⇒ 0 < u/v < 1 ⇒ 0 < u < v.
∴ 0 < u < v < 1. [In the uv-plane this is the triangle bounded by u = v, v = 1 and u = 0.]

∴ Pdf of U is

   g(u) = ∫₋∞^∞ g(u, v) dv = ∫ᵤ¹ (1/v) dv = [log v]ᵤ¹ = (log 1) − (log u)

   ∴ g(u) = −log u, 0 < u < 1
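Note : A simulation sketch agrees with g(u) = −log u (numpy assumed; illustrative only):

import numpy as np

rng = np.random.default_rng(2)
u = rng.random(100_000) * rng.random(100_000)    # U = XY with X, Y ~ Uni(0, 1)

print(u.mean())          # ≈ 1/4, matching ∫₀¹ u(−log u) du = 1/4
# empirical CDF vs analytic G(u) = ∫₀ᵘ (−log t) dt = u − u log u, at u = 0.5
print((u <= 0.5).mean(), 0.5 - 0.5 * np.log(0.5))   # both ≈ 0.8466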
Example 2.50. If X and Y are independent uniform random
variates on [0, 1], find the p.d.f. of U = X/Y.
Solution : Given X and Y are independent uniform random variates
defined on [0, 1], i.e.,

   X ~ Uni. Dist. (a = 0, b = 1) : f(x) = 1/(1 − 0) = 1, 0 < x < 1
   Y ~ Uni. Dist. (a = 0, b = 1) : f(y) = 1/(1 − 0) = 1, 0 < y < 1

∴ The joint p.d.f. of X & Y is

   f(x, y) = f(x) · f(y) = 1, 0 < x < 1, 0 < y < 1   (∵ X and Y are independent) (1)

Given U = X/Y. (2)
Define a new variable V = X. (3)
(3) ⇒ v = x ⇒ ∴ x = v (4)
∴ (2) ⇒ u = x/y = v/y ⇒ ∴ y = v/u (5)

Now, the joint pdf of (U, V) :

   g(u, v) = f(x, y) |J| (6)

   where J = | ∂x/∂u  ∂x/∂v | = |   0     1  | = v/u² ⇒ |J| = v/u²
             | ∂y/∂u  ∂y/∂v |   | −v/u²  1/u |

   ∴ g(u, v) = 1 · (v/u²) = v/u²

To find the range space of u and v : 0 < x < 1 ⇒ 0 < v < 1, and
0 < y < 1 ⇒ 0 < v/u < 1 ⇒ 0 < v < u. Note that u = x/y can exceed 1,
so the region 0 < v < min(u, 1) splits about the line u = 1 into

   R1 : 0 < u ≤ 1, 0 < v < u   and   R2 : u > 1, 0 < v < 1.

∴ Pdf of U is g(u) = ∫₋∞^∞ g(u, v) dv

In R1 (0 < u ≤ 1) :

   g(u) = ∫₀ᵘ (v/u²) dv = (1/u²) [v²/2]₀ᵘ = 1/2

In R2 (u > 1) :

   g(u) = ∫₀¹ (v/u²) dv = (1/u²) [v²/2]₀¹ = 1/(2u²)

∴ The required pdf of U :

   g(u) = 1/2,      0 < u ≤ 1
        = 1/(2u²),  u > 1
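Note : A simulation sketch of the piecewise density (numpy assumed; illustrative only):

import numpy as np

rng = np.random.default_rng(3)
u = rng.random(100_000) / rng.random(100_000)    # U = X/Y with X, Y ~ Uni(0, 1)

# derived pdf : 1/2 on (0, 1], 1/(2u²) for u > 1
print((u <= 1).mean())   # ≈ 0.5  = ∫₀¹ (1/2) du
print((u <= 2).mean())   # ≈ 0.75 = 1/2 + ∫₁² du/(2u²)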
Example 2.51. If X and Y are independent normal random variables
with mean 0 and variance σ², find the p.d.f. of R = √(X² + Y²) and
φ = tan⁻¹(Y/X).
Solution : Given X and Y are independent normal random variables with
mean 0 and variance σ², i.e.,

   X ~ Nor. Dist. (µ = 0, σ) : f(x) = (1/(σ√2π)) e^−x²/(2σ²), −∞ < x < ∞
   Y ~ Nor. Dist. (µ = 0, σ) : f(y) = (1/(σ√2π)) e^−y²/(2σ²), −∞ < y < ∞

Joint pdf of X and Y :

   f(x, y) = f(x) · f(y) = (1/(2πσ²)) e^−(x²+y²)/(2σ²), −∞ < x, y < ∞ (1)

Given

   R = √(X² + Y²) (2)
   φ = tan⁻¹(Y/X) (3)
(2) and (3) give the polar radius R and argument φ of the point (x, y),
so define

   x = R cos φ (4)
   y = R sin φ (5)

   g(R, φ) = f(x, y) |J| (6)

   where J = | ∂x/∂R  ∂x/∂φ | = | cos φ  −R sin φ | = R cos²φ + R sin²φ = R ⇒ |J| = R
             | ∂y/∂R  ∂y/∂φ |   | sin φ   R cos φ |

   ∴ g(R, φ) = (1/(2πσ²)) e^−(x²+y²)/(2σ²) · R = (R/(2πσ²)) e^−R²/(2σ²)   (∵ by (1))

To find the range space of R and φ : −∞ < x, y < ∞ corresponds to
0 ≤ R < ∞ and 0 ≤ φ ≤ 2π.
∴ Pdf of R is :

   g(R) = ∫₋∞^∞ g(R, φ) dφ = ∫₀^(2π) (R/(2πσ²)) e^−R²/(2σ²) dφ = (R/(2πσ²)) e^−R²/(2σ²) [φ]₀^(2π)

   ∴ g(R) = (R/σ²) e^−R²/(2σ²), 0 ≤ R < ∞   (the Rayleigh density)

∴ Pdf of φ is :

   g(φ) = ∫₋∞^∞ g(R, φ) dR = ∫₀^∞ (R/(2πσ²)) e^−R²/(2σ²) dR

   Take t = R²/(2σ²) ⇒ dt = (R/σ²) dR

   ∴ g(φ) = (1/(2π)) ∫₀^∞ e^−t dt = (1/(2π)) [−e^−t]₀^∞ = −(1/(2π)) [(0) − (1)]

   ∴ g(φ) = 1/(2π), 0 ≤ φ ≤ 2π   (uniform over [0, 2π])
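Note : A simulation sketch confirms both marginals (numpy assumed; illustrative only):

import numpy as np

rng = np.random.default_rng(4)
sigma = 2.0
x = rng.normal(0, sigma, 100_000)
y = rng.normal(0, sigma, 100_000)
r = np.hypot(x, y)                      # R = √(X² + Y²)
phi = np.arctan2(y, x) % (2 * np.pi)    # angle folded into [0, 2π)

# Rayleigh : E[R] = σ√(π/2); uniform angle : E[φ] = π
print(r.mean(), sigma * np.sqrt(np.pi / 2))   # both ≈ 2.5066
print(phi.mean(), np.pi)                      # both ≈ 3.1416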
