
Name:

Stats 217 Section 1


Midterm Exam

February 10, 2015

Write your name and sign the Honor code in the blue books provided.
Instructions: Please write your name on this question paper and hand it back together
with your answer booklet. This is an open-material exam. You have 75 minutes to solve
all questions, each worth points as marked (maximum of 50 points). Complete reasoning is
required for full credit; i.e., an outline of a solution is not sufficient to get maximum
points. Try to show your work, so that if you make a mistake you will at least get partial
credit.
You may cite texts, lecture notes and homework sets, as needed, stating precisely the
result you use, why, and how it applies. Note that the solutions to past midterms below are
only sketches, provided for the benefit of students; more detailed answers are required.
(30) 1. Suppose a Markov chain has state space S = {0, 1, 2, 3, 4} and transition matrix

                 0     1     2     3     4
            0 ( 0.5    0    0.4    0    0.1 )
            1 (  0    0.3    0     0    0.7 )
        P = 2 ( 0.4    0    0.6    0     0  )
            3 (  0     0     0     1     0  )
            4 ( 0.8    0     0    0.2    0  )
(2)

(a) Is this matrix stochastic? Why or why not?


Is this matrix doubly stochastic? Why or why not?
This matrix is indeed stochastic, since all entries are nonnegative and all row sums
are 1. However, it is not doubly stochastic, since the first column sums to
0.5 + 0.4 + 0.8 = 1.7, not 1.

(3)

(b) Compute p_2(0, 2). (It may help to draw the state diagram.)
Instead of computing the powers of P, an easier approach is to note that p_n(x, y)
is the sum, over all paths of length n from x to y, of the products of the transition
probabilities along the edges. (This was done in class.) Thus, we first draw the state
diagram, and then compute: to go from 0 to 2 in two steps, the only possible paths
are 0 -> 0 -> 2 and 0 -> 2 -> 2. Thus,
p_2(0, 2) = p(0, 0)p(0, 2) + p(0, 2)p(2, 2) = p(0, 2)[p(0, 0) + p(2, 2)]
= 0.4 · (0.5 + 0.6) = 0.4 · 1.1 = 0.44.
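As a sanity check on the path-counting argument, one can also square P numerically. A minimal pure-Python sketch (the helper mat_mul is our own, not from the course materials):

```python
# Transition matrix from the problem statement (rows/columns indexed 0..4).
P = [
    [0.5, 0.0, 0.4, 0.0, 0.1],
    [0.0, 0.3, 0.0, 0.0, 0.7],
    [0.4, 0.0, 0.6, 0.0, 0.0],
    [0.0, 0.0, 0.0, 1.0, 0.0],
    [0.8, 0.0, 0.0, 0.2, 0.0],
]

def mat_mul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P2 = mat_mul(P, P)          # P^2, whose (0, 2) entry is p_2(0, 2)
print(round(P2[0][2], 6))   # 0.44
```

This agrees with the hand computation above.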

(3)

(c) Classify each state as transient, null recurrent, or positive recurrent (with reasons).
It is clear that all states lead to 3, which is absorbing and hence positive recurrent
by definition (alternatively, use Corollary 4.13 in the notes). Since 3 does not lead
to any j ≠ 3, a result in class implies that all other states j ≠ 3 are transient.

(3)

(d) Compute all of the stationary distributions.

By results in class, there is a unique stationary distribution because there is a unique
positive recurrent class. Moreover, this class consists of the single absorbing state {3},
so the only probability distribution possible on this class is π(j) = 1{j = 3}, i.e.,
π = (0, 0, 0, 1, 0).

(3)

(e) Compute the period of each state j ≠ 1.

For all j ≠ 1, the communicating class of j contains a state with a self-loop,
which is certainly aperiodic. Since the period is a class property, it follows that all
states j ≠ 1 are aperiodic.

(6)

(f) Identify all maximal irreducible subsets (i.e., communicating classes) of S, as well
as all nonempty closed subsets of S.
The maximal irreducible sets, i.e., the communicating classes, are clearly {1},
{3}, and {0, 2, 4}. (Thus the set of irreducible subsets consists of all nonempty
subsets of these sets.) Next, we claim that the closed nonempty subsets of S are precisely:
{3},   {0, 2, 3, 4},   S.
Each of these sets is indeed closed and nonempty. Conversely, if C ⊆ S
is closed and nonempty, then there are three cases, which are easily verified by
inspection:
If 1 ∈ C, then C = S.
If 1 ∉ C but 4 ∈ C, then C = {0, 2, 3, 4}.
If 1, 4 ∉ C, then C = {3}.
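Since S has only 31 nonempty subsets, the claim about closed sets can also be verified by brute force. A small sketch (the helper is_closed is our own naming):

```python
from itertools import combinations

# Transition matrix from the problem statement; states are 0..4.
P = [
    [0.5, 0.0, 0.4, 0.0, 0.1],
    [0.0, 0.3, 0.0, 0.0, 0.7],
    [0.4, 0.0, 0.6, 0.0, 0.0],
    [0.0, 0.0, 0.0, 1.0, 0.0],
    [0.8, 0.0, 0.0, 0.2, 0.0],
]
S = range(5)

def is_closed(C):
    """C is closed iff no state in C can step outside C with positive probability."""
    return all(P[i][j] == 0 for i in C for j in S if j not in C)

closed = [set(C) for r in range(1, 6) for C in combinations(S, r) if is_closed(C)]
print(closed)   # [{3}, {0, 2, 3, 4}, {0, 1, 2, 3, 4}]
```

The enumeration returns exactly the three sets claimed above.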
(5)

(g) Find P_j(T_3 < ∞) for all j = 0, 1, 2, 3, 4.

This requires solving a system of linear equations, which are set up using first-step
analysis. Define u_j := P_j(T_3 < ∞) for j = 0, 1, . . . , 4.
Using the Markov property and first-step analysis (as in the homework), the equations
for the u_j are:
u_1 = 0.3 u_1 + 0.7 u_4
u_2 = 0.4 u_0 + 0.6 u_2
u_3 = 1
u_4 = 0.8 u_0 + 0.2 u_3
u_0 = 0.5 u_0 + 0.4 u_2 + 0.1 u_4
The first two equations imply that u_1 = u_4 and u_0 = u_2. Combining the latter with
the last equation implies that u_0 = u_4. Now substituting this and u_3 = 1 into the
penultimate equation implies that u_4 = 1. Hence u_j = 1 for all j ∈ S.
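The conclusion can be double-checked numerically by iterating the first-step equations: starting from the indicator of state 3, after n rounds the vector holds P_j(T_3 <= n), which increases to P_j(T_3 < ∞). A minimal sketch:

```python
# Transition matrix from the problem statement; state 3 is absorbing.
P = [
    [0.5, 0.0, 0.4, 0.0, 0.1],
    [0.0, 0.3, 0.0, 0.0, 0.7],
    [0.4, 0.0, 0.6, 0.0, 0.0],
    [0.0, 0.0, 0.0, 1.0, 0.0],
    [0.8, 0.0, 0.0, 0.2, 0.0],
]

# u starts as the indicator of state 3; since 3 is absorbing,
# after n iterations u[j] = P_j(T_3 <= n).
u = [0.0, 0.0, 0.0, 1.0, 0.0]
for _ in range(5000):
    u = [sum(P[j][k] * u[k] for k in range(5)) for j in range(5)]
print([round(x, 6) for x in u])   # [1.0, 1.0, 1.0, 1.0, 1.0]
```

All five absorption probabilities come out as 1, matching the algebra above.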

(5)

(h) Compute lim_{n→∞} p_n(j, k) for all j, k ∈ S.

First suppose k ≠ 3. Then k is transient, whence lim_{n→∞} p_n(j, k) = 0 by Theorem
4.1(1) in the notes. Next, if j = k = 3, then 3 is an absorbing state and hence
lim_{n→∞} p_n(3, 3) = lim_{n→∞} 1 = 1. Finally, if k = 3 and j ≠ 3, then by Theorem
4.1(2) in the notes, as well as the previous part,
lim_{n→∞} p_n(j, 3) = f_{j,3} · π(3) = P_j(T_3 < ∞) · 1 = 1.
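This limiting behavior is visible numerically: raising P to a very large power by repeated squaring, every row should approach the point mass at the absorbing state 3. A minimal sketch (mat_mul is our own helper):

```python
# Transition matrix from the problem statement.
P = [
    [0.5, 0.0, 0.4, 0.0, 0.1],
    [0.0, 0.3, 0.0, 0.0, 0.7],
    [0.4, 0.0, 0.6, 0.0, 0.0],
    [0.0, 0.0, 0.0, 1.0, 0.0],
    [0.8, 0.0, 0.0, 0.2, 0.0],
]

def mat_mul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

Q = P
for _ in range(20):     # after 20 squarings, Q = P^(2^20)
    Q = mat_mul(Q, Q)
print([[round(x, 6) for x in row] for row in Q])
# every row is (numerically) [0.0, 0.0, 0.0, 1.0, 0.0]
```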

(5) 2. Given a (time-homogeneous, discrete-time, discrete-space) Markov chain X_n, show that
the subsequence {X_0, X_3, X_6, X_9, . . . } also forms a Markov chain.
Solution 1: Let Y_n = X_{3n} for n ≥ 0. We now verify, using the property (M4) for X_n,
that the Markov property (M1) holds for Y_n:
P(Y_n = i_n | Y_0 = i_0, . . . , Y_{n-1} = i_{n-1})
= P(X_{3n} = i_n | X_0 = i_0, X_3 = i_1, X_6 = i_2, . . . , X_{3(n-1)} = i_{n-1})
= P(X_{3n} = i_n | X_{3(n-1)} = i_{n-1})   [by (M4)]
= P(Y_n = i_n | Y_{n-1} = i_{n-1}).
Solution 2: By HW1 Q11 with k = 3, Y_n is a Markov chain. (Or repeat the solution
of that question, which is essentially Solution 1 above.)
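Concretely, the one-step transition matrix of the subsampled chain Y_n = X_{3n} is P^3. As an illustration (the choice of the 5-state matrix from Question 1 is ours, purely for concreteness), one can check that P^3 is again a stochastic matrix:

```python
# Illustrative example: the subsampled chain Y_n = X_{3n} has transition
# matrix P^3. We use the matrix from Question 1 and verify P^3 is stochastic.
P = [
    [0.5, 0.0, 0.4, 0.0, 0.1],
    [0.0, 0.3, 0.0, 0.0, 0.7],
    [0.4, 0.0, 0.6, 0.0, 0.0],
    [0.0, 0.0, 0.0, 1.0, 0.0],
    [0.8, 0.0, 0.0, 0.2, 0.0],
]

def mat_mul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P3 = mat_mul(mat_mul(P, P), P)            # transition matrix of Y_n
print([round(sum(row), 6) for row in P3]) # [1.0, 1.0, 1.0, 1.0, 1.0]
```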
(15) 3. Suppose X_n denotes a symmetric simple random walk. In this question we will classify
all of its stationary vectors.
(5)

(a) Suppose π is stationary with π(0) = 0. Show that π(n) = nπ(1) for all n > 0. (One
can similarly show that π(n) = nπ(1) for all n < 0.)
Solution 1:
First solve the base case: π(1) = (πP)(1) = π(0)p(0, 1) + π(2)p(2, 1) = π(2) · (1/2),
whence π(2) = 2π(1). Now by induction, if π(k) = kπ(1) for all 1 ≤ k ≤ n, then
π(n) = (πP)(n) = π(n - 1) · (1/2) + π(n + 1) · (1/2),
which yields:
nπ(1) = (1/2)(n - 1)π(1) + (1/2)π(n + 1), so π(n + 1) = [2n - (n - 1)]π(1) = (n + 1)π(1).
This shows the result by induction on n > 0.


Solution 2: The equations πP = π yield π(n) = p(n - 1, n)π(n - 1) + π(n + 1)p(n + 1, n),
which simplifies to:
π(n + 1) - π(n) = π(n) - π(n - 1),
for all integers n. In particular, each of these differences equals π(1) - π(0). Now
adding these differences yields a telescoping sum for n > 0:
π(n) - π(0) = Σ_{i=0}^{n-1} [π(i + 1) - π(i)] = Σ_{i=0}^{n-1} [π(1) - π(0)] = n[π(1) - π(0)].
This in fact proves part (c), since π(n) = π(0) + n[π(1) - π(0)] for n > 0 (and a
similar proof shows the same statement for n < 0). Moreover, part (a) is the special
case with π(0) = 0, which yields π(n) = nπ(1), as desired.
(2)

(b) Show that the vector π = 1 = (. . . , 1, 1, 1, . . . ) (alternatively, π(n) = 1 for all
integers n) is a stationary vector.
Solution 1:
(πP)(n) = π(n - 1)p(n - 1, n) + π(n + 1)p(n + 1, n) = 1 · (1/2) + 1 · (1/2) = 1 = π(n).
Solution 2: Use the Detailed Balance Conditions: π(x)p(x, n) and π(n)p(n, x) are
both zero unless x = n ± 1, in which case p(x, n) = p(n, x) = 1/2 and π(n) = π(x) =
1. Thus π = 1 satisfies the Detailed Balance Conditions, hence is a stationary vector
(from the discussion in class).
(4)

(c) Now show that every stationary vector is of the form π(n) = π(0) + n[π(1) - π(0)]
for all n. (You are allowed to assume the previous part, i.e., if π(0) = 0 then
π(n) = nπ(1) for all integers n.)
Hint: First show that π - π(0)·1 is a stationary vector which equals zero at 0.
Solution 1: If π is stationary, define a new vector σ via σ(n) := π(n) - π(0), i.e.,
σ = π - π(0)·1. Then from the above,
σP = πP - π(0)(1P) = π - π(0)·1 = σ,
whence σ is a stationary vector with
σ(0) = 0,   σ(1) = π(1) - π(0).
It follows from the previous part that σ(n) = nσ(1) = n[π(1) - π(0)] for all integers
n. Therefore,
π(n) = σ(n) + π(0) = π(0) + n[π(1) - π(0)].
Solution 2: See Solution 2 for part (a).
Solution 3: Use reasoning similar to Solution 1 for part (a), i.e., by induction
on n > 0. (The proof is similar for n < 0.) For n = 0, 1, we have:
π(0) + 0·[π(1) - π(0)] = π(0),   π(0) + 1·[π(1) - π(0)] = π(1).
Now suppose the result holds for n - 1 and n, for some n ≥ 1. The equations πP = π
yield π(n) = p(n - 1, n)π(n - 1) + π(n + 1)p(n + 1, n), which simplifies to:
π(n + 1) = 2π(n) - π(n - 1).
Now using the induction hypothesis,
π(n + 1) = 2[π(0) + n(π(1) - π(0))] - [π(0) + (n - 1)(π(1) - π(0))]
= π(0) + [2n - (n - 1)][π(1) - π(0)] = π(0) + (n + 1)[π(1) - π(0)].
This proves the result by induction.
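Conversely, any affine vector of this form does satisfy the balance equation π(n) = (1/2)π(n - 1) + (1/2)π(n + 1) of the symmetric walk. A small numerical sketch (the values of a and b, playing the roles of π(0) and π(1) - π(0), are arbitrary illustrative choices):

```python
# Check that pi(n) = a + n*b satisfies the symmetric-walk balance equation
# pi(n) = 0.5*pi(n-1) + 0.5*pi(n+1) on a range of integers.
a, b = 2.0, 0.7   # hypothetical values of pi(0) and pi(1) - pi(0)

def pi(n):
    return a + n * b

for n in range(-100, 101):
    assert abs(pi(n) - 0.5 * (pi(n - 1) + pi(n + 1))) < 1e-12
print("balance equations hold for all tested n")
```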
(4)

(d) Finally, show that there are no stationary distributions for the symmetric simple
random walk. (You may use previous parts if required, without proof.)
Solution 1: Suppose π is a stationary vector. First suppose π(0) ≠ π(1). Then the
previous part implies that there exists an integer n (either positive or negative), such
that π(n) = n[π(1) - π(0)] + π(0) < 0. Thus π cannot be a stationary distribution.
If on the other hand π(0) = π(1), then π(n) = π(0) for all n, which cannot yield a
probability distribution. This is because there does not exist a uniform probability
distribution on a countably infinite set.
Solution 2: By results in class and HW4, the symmetric simple random walk in
one dimension is null recurrent, since it is recurrent and E_0[T_0] = ∞. Therefore
there does not exist a stationary distribution.
Solution 3: Suppose π is a stationary distribution, which necessarily has the form
π(n) = π(0) + n[π(1) - π(0)]. Moreover, the sequence c_N = Σ_{n=-N}^{N} π(n) must
converge, and converge to 1. (Strictly speaking, to talk about the convergence of a
doubly infinite series, one must say that Σ_{n=0}^{N} π(n) and Σ_{n=-N}^{-1} π(n)
both converge as N → ∞, and their sum equals 1.)
Now note that Σ_{n=0}^{N} π(n) = (N + 1)π(0) + [π(1) - π(0)] Σ_{n=0}^{N} n, which
cannot converge as N → ∞ unless both π(0) and π(1) - π(0) are zero. But then
Σ_n π(n) = 0, and π is not a stationary distribution.

