1. Let X and Y be random variables with the joint pmf P{X = i, Y = j} = 1/N², i, j = 1, 2, · · · , N. Find (a) P{X ≥ Y} (b) P{X = Y}.
Solution: (a)

P{X ≥ Y} = Σ_{(x,y): x≥y} f(x, y)
         = Σ_{y=1}^{N} Σ_{x=y}^{N} f(x, y)
         = Σ_{y=1}^{N} Σ_{x=y}^{N} 1/N²
         = (1/N²) [Σ_{x=1}^{N} 1 + Σ_{x=2}^{N} 1 + Σ_{x=3}^{N} 1 + · · · + Σ_{x=N}^{N} 1]
         = (1/N²) [N + (N − 1) + (N − 2) + · · · + 1]
         = (1/N²) × N(N + 1)/2
         = 1/2 + 1/(2N)
(b)

P{X = Y} = Σ_{(x,y): x=y} f(x, y)
         = Σ_{y=1}^{N} f(y, y)
         = Σ_{y=1}^{N} 1/N²
         = 1/N
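Both answers can be sanity-checked by enumerating the N² equally likely pairs with exact rational arithmetic (a small sketch; the function name is mine):

```python
from fractions import Fraction

def grid_probs(N):
    # joint pmf puts mass 1/N^2 at every (i, j) in {1, ..., N}^2
    p = Fraction(1, N * N)
    p_ge = sum(p for i in range(1, N + 1) for j in range(1, N + 1) if i >= j)
    p_eq = N * p  # the N diagonal cells (i, i)
    return p_ge, p_eq

for N in (2, 5, 10):
    p_ge, p_eq = grid_probs(N)
    assert p_ge == Fraction(1, 2) + Fraction(1, 2 * N)  # 1/2 + 1/(2N)
    assert p_eq == Fraction(1, N)                        # 1/N
```

Using `Fraction` keeps the comparison exact, so the check confirms the closed forms rather than a floating-point approximation.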
If 0 ≤ y < x then
F(x, y) = 1 − e^{−λy}(1 + λy)
In the second, third, and fourth quadrants, F(x, y) = 0.
3. Let X, Y be continuous random variables with joint density function

f_{X,Y}(x, y) = { e^{−y}(1 − e^{−x}) if 0 < x < y < ∞;  e^{−x}(1 − e^{−y}) if 0 < y ≤ x < ∞ }

Show that X and Y are identically distributed. Also compute E[X] and E[Y].
Solution:

fX(x) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dy = { ∫_{0}^{∞} f_{X,Y}(x, y) dy if 0 < x < ∞;  0 otherwise }

Therefore, for 0 < x < ∞,

fX(x) = ∫_{0}^{x} e^{−x}(1 − e^{−y}) dy + ∫_{x}^{∞} e^{−y}(1 − e^{−x}) dy
      = e^{−x}(x + e^{−x} − 1) + e^{−x}(1 − e^{−x})
      = x e^{−x}

Similarly,

fY(y) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dx = { ∫_{0}^{∞} f_{X,Y}(x, y) dx if 0 < y < ∞;  0 otherwise }

Therefore, since the density is symmetric, f_{X,Y}(x, y) = f_{X,Y}(y, x), the same computation gives fY(y) = y e^{−y} for 0 < y < ∞. Hence X and Y have the same (Gamma(2, 1)) density, so they are identically distributed, and

E[X] = E[Y] = ∫_{0}^{∞} x · x e^{−x} dx = 2
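Carrying out the integrals gives the marginal fX(x) = x e^{−x} for x > 0; this can be verified numerically by integrating the given joint density over y with a simple midpoint rule (truncation point and grid size are my choices):

```python
import math

def joint(x, y):
    # the joint density from the problem, in its two regions
    if 0 < x < y:
        return math.exp(-y) * (1 - math.exp(-x))
    if 0 < y <= x:
        return math.exp(-x) * (1 - math.exp(-y))
    return 0.0

def marginal_x(x, ymax=60.0, n=120000):
    # midpoint-rule integration of f(x, y) over y; the e^{-y} tail
    # beyond ymax = 60 is negligible
    h = ymax / n
    return sum(joint(x, (k + 0.5) * h) for k in range(n)) * h

for x in (0.5, 1.0, 3.0):
    assert abs(marginal_x(x) - x * math.exp(-x)) < 1e-4
```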
Now, splitting P{XY ≤ z} according to the sign of x:

∫_{−∞}^{0} ( ∫_{z/x}^{∞} f(x, y) dy ) dx = ∫_{0}^{∞} ( ∫_{−z/x}^{∞} f(−x, y) dy ) dx
 = ∫_{0}^{∞} ∫_{−∞}^{z} f(−x, −u/x) (du/x) dx   (put y = −u/x)
 = ∫_{−∞}^{z} ( ∫_{0}^{∞} (1/x) f(−x, −u/x) dx ) du

∫_{0}^{∞} ( ∫_{−∞}^{z/x} f(x, y) dy ) dx = ∫_{0}^{∞} ∫_{−∞}^{z} f(x, u/x) (du/x) dx   (put y = u/x)
 = ∫_{−∞}^{z} ( ∫_{0}^{∞} (1/x) f(x, u/x) dx ) du

Hence

F_{XY}(z) = ∫_{−∞}^{z} ∫_{0}^{∞} (1/x) [ f(−x, −u/x) + f(x, u/x) ] dx du
          = ∫_{−∞}^{z} ∫_{−∞}^{∞} (1/|x|) f(x, u/x) dx du

Therefore the pdf of XY is

f_{XY}(z) = ∫_{−∞}^{∞} (1/|x|) f(x, z/x) dx
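This product formula can be tested numerically. Taking, for concreteness, X and Y independent Uniform(0,1) (my choice, not from the problem), the formula should reproduce the known product density f_{XY}(z) = −ln z on (0, 1):

```python
import math

def f_joint(x, y):
    # independent Uniform(0,1) pair: f(x, y) = 1 on the unit square
    return 1.0 if 0.0 < x < 1.0 and 0.0 < y < 1.0 else 0.0

def product_pdf(z, n=100000):
    # f_XY(z) = ∫ (1/|x|) f(x, z/x) dx, midpoint rule over x in (0, 1);
    # for this joint density the integrand vanishes outside (0, 1)
    h = 1.0 / n
    total = 0.0
    for k in range(n):
        x = (k + 0.5) * h
        total += (1.0 / x) * f_joint(x, z / x)
    return total * h

# exact answer for this choice: f_XY(z) = -ln z on (0, 1)
for z in (0.1, 0.5, 0.9):
    assert abs(product_pdf(z) - (-math.log(z))) < 1e-3
```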
(b) We first determine the distribution function of Z := X − Y . For z ∈ R, set

Az = {(x, y) ∈ R² | x − y ≤ z}

Now

FZ(z) = P{X − Y ≤ z}
      = P{(X, Y) ∈ Az}
      = ∫∫_{Az} f(x, y) dx dy
      = ∫_{−∞}^{∞} ( ∫_{x−z}^{∞} f(x, y) dy ) dx
      = ∫_{−∞}^{∞} ( ∫_{z}^{−∞} −f(x, x − s) ds ) dx   (put y = x − s)
      = ∫_{−∞}^{∞} ∫_{−∞}^{z} f(x, x − s) ds dx
      = ∫_{−∞}^{z} ( ∫_{−∞}^{∞} f(x, x − s) dx ) ds

Therefore the pdf of X − Y is

f_{X−Y}(z) = ∫_{−∞}^{∞} f(x, x − z) dx
Next, for Z := |Y − X|, set

Az = {(x, y) ∈ R² | |y − x| ≤ z}

If z < 0 then {(X, Y) ∈ Az} = ∅, so FZ(z) = 0 for z < 0. Now for z ≥ 0,

FZ(z) = P{|Y − X| ≤ z}
      = P{(X, Y) ∈ Az}
      = ∫∫_{Az} f(x, y) dx dy = ∫∫_{{y−x≥0}∩Az} f(x, y) dx dy + ∫∫_{{y−x<0}∩Az} f(x, y) dx dy
      = ∫_{−∞}^{∞} ( ∫_{x}^{x+z} f(x, y) dy ) dx + ∫_{−∞}^{∞} ( ∫_{x−z}^{x} f(x, y) dy ) dx
      = ∫_{−∞}^{∞} ( ∫_{0}^{z} f(x, x + s) ds ) dx   (put y = x + s)
        + ∫_{−∞}^{∞} ( ∫_{0}^{z} f(x, x − s) ds ) dx   (put y = x − s)
      = ∫_{−∞}^{∞} ∫_{0}^{z} [f(x, x + s) + f(x, x − s)] ds dx
      = ∫_{0}^{z} ∫_{−∞}^{∞} [f(x, x + s) + f(x, x − s)] dx ds

Therefore

f_{|Y−X|}(z) = { ∫_{−∞}^{∞} [f(x, x + z) + f(x, x − z)] dx if z ≥ 0;  0 otherwise }
5. Suppose X and Y are random variables that assume the four values 1, 2, 3, 4. Their joint probabilities are given by the following table. Find the pmf of X + Y.

            Y = 1   Y = 2   Y = 3   Y = 4
    X = 1    1/20    1/20    1/20     0
    X = 2    1/20    2/20    3/20    1/20
    X = 3    1/20    2/20    3/20    1/20
    X = 4     0      1/20    1/20    1/20
Solution: First of all, the values taken by the random variable Z := X + Y are {2, 3, 4, 5, 6, 7, 8}. Hence the pmf of Z is

fZ(2) = P(X + Y = 2) = P(X = 1, Y = 1) = 1/20
fZ(3) = P(X + Y = 3) = P(X = 1, Y = 2) + P(X = 2, Y = 1) = 2/20
fZ(4) = f(2, 2) + f(1, 3) + f(3, 1) = 4/20
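The same bookkeeping for all values of Z can be done mechanically by summing the table along its anti-diagonals (a short sketch using exact fractions):

```python
from fractions import Fraction
from collections import defaultdict

# joint pmf from the table, recorded in twentieths
table = {
    (1, 1): 1, (1, 2): 1, (1, 3): 1, (1, 4): 0,
    (2, 1): 1, (2, 2): 2, (2, 3): 3, (2, 4): 1,
    (3, 1): 1, (3, 2): 2, (3, 3): 3, (3, 4): 1,
    (4, 1): 0, (4, 2): 1, (4, 3): 1, (4, 4): 1,
}
assert sum(table.values()) == 20  # total probability is 1

pmf_z = defaultdict(Fraction)
for (i, j), w in table.items():
    pmf_z[i + j] += Fraction(w, 20)

# matches the hand computation above
assert pmf_z[2] == Fraction(1, 20)
assert pmf_z[3] == Fraction(2, 20)
assert pmf_z[4] == Fraction(4, 20)
```

Running the loop also yields the remaining values fZ(5) = fZ(6) = 5/20, fZ(7) = 2/20, fZ(8) = 1/20.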
6.
where

FX(x) = { 0 if x ≤ 0;  x if 0 ≤ x ≤ 1;  1 if x ≥ 1 },
FY(y) = { 0 if y < 0;  1/2 if 0 ≤ y < 1;  1 if y ≥ 1 }

Hence

F(x, y) = { 0 if x ≤ 0 or y < 0;
            x/2 if 0 < x ≤ 1 and 0 ≤ y < 1;
            1/2 if x ≥ 1 and 0 ≤ y < 1;
            x if 0 ≤ x ≤ 1 and y ≥ 1;
            1 if x ≥ 1 and y ≥ 1 }
P(X = m | X + Y = n) = [e^{−λ1} λ1^m/m! × e^{−λ2} λ2^{n−m}/(n − m)!] / [Σ_{k=0}^{n} e^{−λ1} λ1^k/k! × e^{−λ2} λ2^{n−k}/(n − k)!]

 = [λ1^m/m! × λ2^{n−m}/(n − m)!] / [Σ_{k=0}^{n} λ1^k/k! × λ2^{n−k}/(n − k)!]

 = [n! λ1^m λ2^{n−m} / (m!(n − m)!)] / [Σ_{k=0}^{n} n! λ1^k λ2^{n−k} / (k!(n − k)!)]

 = (n choose m) λ1^m λ2^{n−m} / (λ1 + λ2)^n

using the binomial theorem Σ_{k=0}^{n} (n choose k) λ1^k λ2^{n−k} = (λ1 + λ2)^n in the last step.
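So, conditioned on X + Y = n, X is Binomial(n, λ1/(λ1 + λ2)). A direct numerical check of the identity (the parameter values are my choices):

```python
import math

def pois(lam, k):
    # Poisson(lam) pmf at k
    return math.exp(-lam) * lam ** k / math.factorial(k)

lam1, lam2, n = 1.3, 2.1, 6
for m in range(n + 1):
    # conditional pmf from the definition ...
    lhs = pois(lam1, m) * pois(lam2, n - m) / pois(lam1 + lam2, n)
    # ... versus the binomial closed form derived above
    rhs = math.comb(n, m) * lam1 ** m * lam2 ** (n - m) / (lam1 + lam2) ** n
    assert abs(lhs - rhs) < 1e-12
```

Note the check also uses that X + Y is Poisson(λ1 + λ2), which is exactly the convolution identity computed in the denominator above.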
9. Let X and Y be independent and identically distributed random variables such that P(X = 0) = P(X = 1) = 1/2. Show that X and |X − Y| are independent.

Solution: Set W = |X − Y|. Then W takes values {0, 1} with pmf P(W = 0) = 1/2, P(W = 1) = 1/2. Note that X and W are both Bernoulli(0.5). Since X and Y are independent,

P(X = 0, W = 0) = P(X = 0, |X − Y| = 0) = P(X = 0, Y = 0) = 1/4 = P(X = 0)P(W = 0)
P(X = 0, W = 1) = P(X = 0, |X − Y| = 1) = P(X = 0, Y = 1) = 1/4 = P(X = 0)P(W = 1)

and likewise P(X = 1, W = w) = 1/4 = P(X = 1)P(W = w) for w = 0, 1. Hence X and W are independent.
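The four-point factorization check is small enough to enumerate exactly:

```python
from fractions import Fraction
from itertools import product

half = Fraction(1, 2)
joint = {}
for a, b in product((0, 1), repeat=2):   # X = a, Y = b, each w.p. 1/2
    w = abs(a - b)                       # W = |X - Y|
    joint[(a, w)] = joint.get((a, w), Fraction(0)) + half * half

# marginals of X and W
px = {a: sum(p for (i, w), p in joint.items() if i == a) for a in (0, 1)}
pw = {w: sum(p for (i, ww), p in joint.items() if ww == w) for w in (0, 1)}

# independence: the joint pmf factorizes at every point
for (a, w), p in joint.items():
    assert p == px[a] * pw[w]
```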
10. Let X and Y be two random variables. Suppose that var(X) = 4, and var(Y) = 9. If we know that the two random variables Z = 2X − Y and W = X + Y are independent, find ρ(X, Y).

Solution: Since independent random variables are uncorrelated, we have

0 = cov(Z, W) = cov(2X − Y, X + Y) = cov(2X − Y, X) + cov(2X − Y, Y)
  = cov(2X, X) + cov(−Y, X) + cov(2X, Y) + cov(−Y, Y)
  = 2cov(X, X) − cov(Y, X) + 2cov(X, Y) − cov(Y, Y)
  = 2var(X) + cov(X, Y) − var(Y)
  = 8 + cov(X, Y) − 9  =⇒  cov(X, Y) = 1

Therefore

ρ(X, Y) = cov(X, Y)/√(var(X) var(Y)) = 1/√(4 × 9) = 1/6
11. Suppose X and Y are two independent random variables such that E[X⁴] = 2, E[Y²] = 1, E[X²] = 1, and E[Y] = 0. Compute var(X²Y).

Solution:

var(X²Y) = E[(X²Y)²] − (E[X²Y])²
         = E[X⁴Y²] − (E[X²Y])²
         = E[X⁴]E[Y²] − (E[X²]E[Y])²
         = 2 × 1 − (1 × 0)² = 2
12. The speed of a typical vehicle that drives past a police radar is modeled as an
exponentially distributed random variable X with mean 50 miles per hour. The
police radar’s measurement Y of the vehicle’s speed has an error which is modeled
as a normal random variable with zero mean and standard deviation equal to one
tenth of the vehicle’s speed. What is the joint PDF of X and Y ?
Solution: Recall that if X ∼ exp(λ) then E[X] = 1/λ. So we have X ∼ exp(1/50). Therefore the pdf of X is

fX(x) = { (1/50) e^{−x/50} if x ≥ 0;  0 if x < 0 }

Also, conditioned on X = x, the measurement Y has normal density with mean 0 and variance (x/10)². Therefore,

fY|X(y|x) = (1/((x/10)√(2π))) e^{−y²/(2(x/10)²)} = (10/(x√(2π))) e^{−50y²/x²}   for all x > 0, y ∈ R

Hence the joint pdf of X and Y is

f(x, y) = fY|X(y|x) fX(x) = { (10/(x√(2π))) × (1/50) e^{−x/50} e^{−50y²/x²} if x > 0;  0 if x ≤ 0 }
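As a check on the normal-density algebra, the conditional density written as 10/(x√(2π)) · e^{−50y²/x²} should integrate to 1 in y for every fixed x > 0 (the test speeds and grid are my choices):

```python
import math

def f_cond(y, x):
    # N(0, (x/10)^2) density written as 10/(x sqrt(2*pi)) * exp(-50 y^2 / x^2)
    return 10.0 / (x * math.sqrt(2 * math.pi)) * math.exp(-50.0 * y * y / (x * x))

def integrate_y(x, n=100000):
    lim = x  # +/- x is +/- 10 standard deviations: essentially all the mass
    h = 2.0 * lim / n
    return sum(f_cond(-lim + (k + 0.5) * h, x) for k in range(n)) * h

for x in (10.0, 50.0, 80.0):
    assert abs(integrate_y(x) - 1.0) < 1e-6
```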
13. Let X and Y have the joint density f given by f(x, y) = (√3/(4π)) e^{−(x² − xy + y²)/2}, x, y ∈ R. Find the conditional density of Y given X.

Solution: Completing the square, x² − xy + y² = (y − x/2)² + 3x²/4, so

fX(x) = ∫_{−∞}^{∞} f(x, y) dy = (√3/(4π)) e^{−3x²/8} ∫_{−∞}^{∞} e^{−(y − x/2)²/2} dy
      = (√3/(4π)) e^{−3x²/8} √(2π)
      = (1/(√(4/3) √(2π))) e^{−x²/(2 · (4/3))}

that is, X ∼ N(0, 4/3). Now

fY|X(y|x) = f(x, y)/fX(x) = [(√3/(4π)) e^{−(x² − xy + y²)/2}] / [(√3/(4π)) √(2π) e^{−3x²/8}]
          = (1/√(2π)) exp(−(x² − xy + y²)/2 + 3x²/8)
          = (1/√(2π)) exp(−(1/2)(y² − xy + x²/4))
          = (1/√(2π)) e^{−(1/2)(y − x/2)²}

In other words, the conditional density of Y given X = x is the normal density with mean x/2 and variance 1.
14. Let X and Y be continuous random variables having a joint density f . Suppose that
Y and φ(X)Y have finite expectation. Show that
E[φ(X)Y] = ∫_{−∞}^{∞} φ(x) E[Y|X = x] fX(x) dx

Solution:

E[φ(X)Y] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} φ(x) y f(x, y) dx dy = ∫_{−∞}^{∞} ∫_{−∞}^{∞} φ(x) y fY|X(y|x) fX(x) dx dy
         = ∫_{−∞}^{∞} φ(x) fX(x) ( ∫_{−∞}^{∞} y fY|X(y|x) dy ) dx = ∫_{−∞}^{∞} φ(x) fX(x) E[Y|X = x] dx
15. Let X and Y be two independent Poisson distributed random variables having pa-
rameters λ1 and λ2 respectively. Compute E[Y |X + Y = z] where z is a nonnegative
integer.
Solution:

fY|X+Y(m|z) = P(Y = m | X + Y = z) = P(Y = m, X + Y = z)/P(X + Y = z) = P(Y = m, X = z − m)/P(X + Y = z)

 = P(Y = m)P(X = z − m)/P(X + Y = z)
 = [e^{−λ1} λ1^{z−m}/(z − m)! × e^{−λ2} λ2^m/m!] / [e^{−(λ1+λ2)} (λ1 + λ2)^z/z!]
 = (z choose m) λ1^{z−m} λ2^m / (λ1 + λ2)^z

Therefore

E[Y | X + Y = z] = Σ_{m=0}^{z} m fY|X+Y(m|z) = Σ_{m=0}^{z} m (z choose m) λ1^{z−m} λ2^m / (λ1 + λ2)^z
 = (1/(λ1 + λ2)^z) Σ_{m=0}^{z} m (z!/(m!(z − m)!)) λ1^{z−m} λ2^m
 = (1/(λ1 + λ2)^z) Σ_{m=1}^{z} m (z!/(m!(z − m)!)) λ1^{z−m} λ2^m
 = (1/(λ1 + λ2)^z) Σ_{m=1}^{z} (z!/((m − 1)!(z − m)!)) λ1^{z−m} λ2^m
 = (z/(λ1 + λ2)^z) Σ_{m=1}^{z} ((z − 1)!/((m − 1)!(z − m)!)) λ1^{z−m} λ2^m
 = (z/(λ1 + λ2)^z) Σ_{m=1}^{z} (z − 1 choose m − 1) λ1^{z−m} λ2^m
 = (z/(λ1 + λ2)^z) Σ_{k=0}^{z−1} (z − 1 choose k) λ1^{z−k−1} λ2^{k+1}   (put m = k + 1)
 = (zλ2/(λ1 + λ2)^z) Σ_{k=0}^{z−1} (z − 1 choose k) λ2^k λ1^{(z−1)−k}
 = zλ2 (λ2 + λ1)^{z−1} / (λ1 + λ2)^z
 = zλ2 / (λ1 + λ2)
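The binomial conditional pmf and the closed-form conditional mean can both be verified directly (parameter values are my choices):

```python
import math

lam1, lam2, z = 0.7, 1.9, 8
# conditional pmf of Y given X + Y = z: Binomial(z, lam2 / (lam1 + lam2))
cond = [math.comb(z, m) * lam1 ** (z - m) * lam2 ** m / (lam1 + lam2) ** z
        for m in range(z + 1)]
assert abs(sum(cond) - 1.0) < 1e-12           # a genuine pmf

mean = sum(m * p for m, p in enumerate(cond))
assert abs(mean - z * lam2 / (lam1 + lam2)) < 1e-10
```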
16. Let X and Y have a joint density f that is uniform over the interior of the triangle
with vertices at (0, 0), (2, 0), and (1, 2). Find the conditional expectation of Y given
X.
(Figure: the triangle with vertices (0, 0), (2, 0), (1, 2); its left edge lies on the line y = 2x and its right edge on the line y + 2x = 4.)
Solution:
The area of the triangle with vertices at (0, 0), (2, 0), and (1, 2) is (1/2) × 2 × 2 = 2. Therefore the joint density is 1/2 over the triangle, zero elsewhere. First we need to compute E[Y|X = x], so we need the conditional density of Y given X. For that, if x ≤ 0 or x ≥ 2 then f(x, y) = 0, therefore fX(x) = 0. For x ∈ (0, 2),

fX(x) = ∫_{−∞}^{∞} f(x, y) dy = { ∫_{0}^{2x} (1/2) dy = x if 0 < x ≤ 1;  ∫_{0}^{−2x+4} (1/2) dy = 2 − x if 1 < x ≤ 2 }

Therefore

fY|X(y|x) = f(x, y)/fX(x) = { 1/(2x) if 0 < x ≤ 1, 0 < y < 2x;  1/(2(2 − x)) if 1 < x ≤ 2, 0 < y < 4 − 2x }

In either case, given X = x, Y is uniform on its range, so

E[Y|X = x] = { x if 0 < x ≤ 1;  (4 − 2x)/2 = 2 − x if 1 < x ≤ 2 }
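Since Y given X = x is uniform on (0, 2x) for x ≤ 1 and on (0, 4 − 2x) for x > 1, the conditional mean is the midpoint: x, respectively 2 − x. A Monte Carlo check by rejection sampling on the bounding box (seed, sample size, and bin width are my choices):

```python
import random

random.seed(0)
pts = []
while len(pts) < 200000:
    # uniform on the bounding box [0,2] x [0,2]; keep points inside the triangle
    x, y = random.uniform(0, 2), random.uniform(0, 2)
    if y < 2 * x and y < 4 - 2 * x:
        pts.append((x, y))

def cond_mean_y(x0, width=0.05):
    # empirical mean of Y over the thin vertical strip |x - x0| < width
    ys = [y for x, y in pts if abs(x - x0) < width]
    return sum(ys) / len(ys)

# E[Y | X = x] = x on (0, 1] and 2 - x on (1, 2)
assert abs(cond_mean_y(0.5) - 0.5) < 0.05
assert abs(cond_mean_y(1.5) - 0.5) < 0.05
```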
17. Let X and Y be iid exponential random variables with parameter λ, and set Z =
X + Y . Find the conditional expectation of X given Z.
Solution: We need to compute the conditional pdf of X given Z, which is by definition

fX|Z(x|z) = { f(x, z)/fZ(z) if fZ(z) > 0;  0 if fZ(z) = 0 }

Now

P(Z ≤ z | X = x) = P(X + Y ≤ z | X = x)
 = P(x + Y ≤ z | X = x)
 = P(x + Y ≤ z)   (∵ X, Y are independent)
 = P(Y ≤ z − x)
 = ∫_{−∞}^{z−x} fY(y) dy
 = ∫_{−∞}^{z} fY(t − x) dt   (put y = t − x)

Hence

fX|Z(x|z) = { fX(x) fY(z − x)/fZ(z) if fZ(z) > 0;  0 if fZ(z) = 0 }

Since Z = X + Y is Gamma(2, λ) with fZ(z) = λ²z e^{−λz} for z > 0, this gives

fX|Z(x|z) = λe^{−λx} λe^{−λ(z−x)} / (λ²z e^{−λz}) = { 1/z if z > 0 and 0 ≤ x ≤ z;  0 otherwise }

That is, given Z = z > 0, X is uniform on [0, z]. Now

E[X|Z = z] = ∫_{−∞}^{∞} x fX|Z(x|z) dx = ∫_{0}^{z} x (1/z) dx = z/2
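The conclusion that X is (conditionally) uniform on [0, z], hence E[X|Z = z] = z/2, can be checked by simulation, binning samples by their value of Z (the rate λ = 2, seed, and bin width are my choices):

```python
import random

random.seed(1)
lam = 2.0
pairs = [(random.expovariate(lam), random.expovariate(lam)) for _ in range(400000)]

# condition on Z = X + Y landing in a thin bin around z0
z0, width = 1.0, 0.05
xs = [x for x, y in pairs if abs(x + y - z0) < width]

mean_x = sum(xs) / len(xs)
assert len(xs) > 5000
assert abs(mean_x - z0 / 2) < 0.02   # E[X | Z = z0] = z0 / 2
```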
18. Suppose that X and Y are random variables with the same variance. Show that
X − Y and X + Y are uncorrelated.
Solution: Using bilinearity of covariance,

cov(X − Y, X + Y) = var(X) + cov(X, Y) − cov(Y, X) − var(Y) = var(X) − var(Y) = 0

since X and Y have the same variance. Hence X − Y and X + Y are uncorrelated.
(b) geometric(p):

φX(t) = E[e^{itX}] = Σ_{k=1}^{∞} e^{itk} P(X = k) = Σ_{k=1}^{∞} e^{itk} (1 − p)^{k−1} p = pe^{it} Σ_{k=1}^{∞} (e^{it}(1 − p))^{k−1}
      = pe^{it} × 1/(1 − e^{it}(1 − p))
20. Let X be a random variable such that X and −X have the same distribution. Show that φX(t) ∈ R for all t ∈ R.

Solution: Since X and −X have the same distribution,

φX(t) = E[e^{itX}] = E[e^{it(−X)}] = φX(−t) = conj(φX(t))

so φX(t) equals its own complex conjugate and is therefore real.

For X with pmf P(X = x) = 1/2^x, x = 1, 2, · · · :

φX(t) = E[e^{itX}] = Σ_{x=1}^{∞} e^{itx} P(X = x) = Σ_{x=1}^{∞} e^{itx}/2^x = (e^{it}/2)/(1 − e^{it}/2) = e^{it}/(2 − e^{it})
22. Find the characteristic function of the random variable X with pdf given by

f(x) = { 1 − |x| if |x| ≤ 1;  0 otherwise }

Solution:

φX(t) = E[e^{itX}] = ∫_{−1}^{1} e^{itx}(1 − |x|) dx = ∫_{−1}^{1} (1 − |x|) cos tx dx + i ∫_{−1}^{1} (1 − |x|) sin tx dx
      = 2 ∫_{0}^{1} (1 − |x|) cos tx dx + 0 = 2 ∫_{0}^{1} (1 − x) cos tx dx
      = 2 ( [sin tx/t]_{0}^{1} − [x sin tx/t]_{0}^{1} + (1/t) ∫_{0}^{1} sin tx dx )
      = 2 ( sin t/t − sin t/t + [−cos tx/t²]_{0}^{1} )
      = (2/t²)(1 − cos t)
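Since the density is symmetric, the characteristic function is real, and the closed form (2/t²)(1 − cos t) can be checked against a direct numerical evaluation of the defining integral (grid size is my choice; t = 0 is excluded because the closed form is written as a ratio):

```python
import math

def phi_numeric(t, n=200000):
    # real part of the defining integral over [-1, 1] by the midpoint rule
    # (the imaginary part vanishes by symmetry of 1 - |x|)
    h = 2.0 / n
    total = 0.0
    for k in range(n):
        x = -1.0 + (k + 0.5) * h
        total += math.cos(t * x) * (1.0 - abs(x))
    return total * h

for t in (0.5, 1.0, 3.0):
    assert abs(phi_numeric(t) - 2 * (1 - math.cos(t)) / t ** 2) < 1e-6
```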
23. Let X be an integer-valued random variable. Show that

φX(t) = φX(t + 2π), ∀t ∈ R

Solution:

φX(t + 2π) = E[e^{i(t+2π)X}]
           = E[e^{itX} e^{i2πX}]
           = E[e^{itX}{cos(2πX) + i sin(2πX)}]
           = E[e^{itX}]   (since X takes integer values, cos(2πX) = 1 and sin(2πX) = 0)
           = φX(t)
24. Let X and Y be independent, identically distributed random variables. Show that
φX−Y (t) = |φX (t)|2 .
Solution: By independence, and since Y has the same distribution as X,

φ_{X−Y}(t) = E[e^{it(X−Y)}] = E[e^{itX}] E[e^{−itY}] = φX(t) φY(−t) = φX(t) conj(φY(t)) = φX(t) conj(φX(t)) = |φX(t)|²
25. Let X and Y be independent and identically distributed continuous random variables
with density f . Find P {X 2 > Y }.
Solution: Set

B = {(x, y) ∈ R² | x² > y}

Now

P{X² > Y} = P{(X, Y) ∈ B}
          = ∫∫_B fX(x) fY(y) dx dy
          = ∫∫_B f(x) f(y) dx dy
          = ∫_{−∞}^{∞} f(x) ( ∫_{−∞}^{x²} f(y) dy ) dx
          = ∫_{−∞}^{∞} f(x) F(x²) dx,   where F is the common cdf of X and Y.
26. Let X and Y be random variables having joint density f. Find the density of Y/X.

Solution: Set Z = Y/X and Az = {(x, y) ∈ R² | y/x ≤ z}. Note that, if x < 0, then y/x ≤ z if and only if y ≥ xz. Thus

Az = {(x, y) | x < 0, y ≥ xz} ∪ {(x, y) | x > 0, y ≤ xz}

Consequently,

FZ(z) = ∫∫_{Az} f(x, y) dx dy
      = ∫_{−∞}^{0} ( ∫_{xz}^{∞} f(x, y) dy ) dx + ∫_{0}^{∞} ( ∫_{−∞}^{zx} f(x, y) dy ) dx