DEPARTMENT OF MATHEMATICS

INDIAN INSTITUTE OF TECHNOLOGY GUWAHATI


MA225 Probability Theory and Random Processes July - November 2017
Problem Sheet 10 NS

1. Consider the experiment of tossing a coin repeatedly. Let 0 < p < 1 be the probability of a head in
a single trial. Check whether the following processes are Markov processes. If they are, then find the
state transition probabilities and draw the state transition probability diagram. Also, find the initial
distribution. Find the communicating classes and classify the states as transient or recurrent.

(a) For all n ∈ N, let Xn = 1 if the nth trial gives a head, Xn = 0 otherwise.
(b) For all n ∈ N, let Yn be the number of heads in the first n trials.
(c) For all n ∈ N, let Zn = 1 if both the (n − 1)th and the nth trials result in heads, Zn = 0 otherwise.
(d) For all n ∈ N, let Un be the number of heads minus the number of tails in the first n trials.
(e) For all n ∈ N, let Vn be the number of tails which occurred after the last head in the first n trials.
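A quick simulation (not part of the required solution) can make the five processes concrete. Python is used here; the value p = 0.5 and the run length are illustrative choices only, since the sheet leaves 0 < p < 1 general:

```python
import random

random.seed(0)
p = 0.5            # assumed head probability; the sheet leaves 0 < p < 1 general
n_steps = 10
tosses = [1 if random.random() < p else 0 for _ in range(n_steps)]  # 1 = head

X = tosses[:]                     # (a) indicator of a head at trial n
Y, U, V = [], [], []              # (b) head count, (d) heads minus tails, (e) tails since last head
Z = [0] * n_steps                 # (c) indicator of heads at trials n-1 and n
heads, diff, tails_since = 0, 0, 0
for n, t in enumerate(tosses):
    heads += t
    diff += 1 if t else -1
    tails_since = 0 if t else tails_since + 1
    Y.append(heads)
    U.append(diff)
    V.append(tails_since)
    if n >= 1 and tosses[n] == 1 and tosses[n - 1] == 1:
        Z[n] = 1

print(X, Y, Z, U, V, sep="\n")
```

Watching a few sample paths already suggests why (a), (b), (d), (e) are Markov while the pair structure in (c) needs a closer look.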

2. Suppose that {Xn , n ≥ 0} is a stochastic process on S (state space) which is of the form Xn =
f (Xn−1 , Yn ), n ≥ 1, where f : S × S′ → S and Y1 , Y2 , . . . are IID random variables with values in
S′ that are independent of X0 . Then show that {Xn } is a Markov chain with transition probabilities
pij = P {f (i, Y1 ) = j}.
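The construction above can be sketched with an illustrative choice of f, S and S′ (a ±1 step on the cycle {0, . . . , 4} — an assumption of this sketch, not something specified in the problem); the transition probability p2j = P{f(2, Y1) = j} is then estimated empirically:

```python
import random
from collections import Counter

random.seed(1)

# Illustrative choice: S = {0,...,4}, S' = {-1, +1},
# f(x, y) = (x + y) mod 5  -> a random walk on a 5-cycle.
def f(x, y):
    return (x + y) % 5

def sample_y():
    return random.choice([-1, 1])    # Y_n uniform on S'

# Empirical estimate of p_{2j} = P{f(2, Y_1) = j}
N = 10_000
counts = Counter(f(2, sample_y()) for _ in range(N))
print({j: counts[j] / N for j in sorted(counts)})
```

From state 2 the chain can only move to 1 or 3, each with probability about 1/2, matching pij = P{f(i, Y1) = j}.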

3. A commodity is stocked in a warehouse to satisfy continuing demands, and the demands D1 , D2 , . . . in
time periods 1, 2, . . . are IID non-negative integer-valued random variables. The inventory is controlled
by the following (s, S) inventory-control policy, where s < S are pre-determined integers. At the end
of period n − 1, the inventory level Xn−1 is observed and one of the following actions is taken: (a)
replenish the stock (instantaneously) up to the level S if Xn−1 ≤ s; (b) do not replenish the stock
if Xn−1 > s. Assume X0 ≤ S for simplicity and that X0 is independent of the Dn ’s. Write down
the recursion relation for the Xn ’s, show that {Xn } is a Markov chain, and determine the transition
probabilities if the Dn ’s follow a Poisson distribution with parameter λ.
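The policy can be simulated directly from its description: replenish to S when the observed level is at most s, then subtract the period's demand. The sketch below assumes the values s = 2, S = 10, λ = 3 purely for illustration (the sheet leaves them general):

```python
import math
import random

random.seed(2)

def poisson(lam):
    # Knuth's multiplication method for Poisson sampling (stdlib-only sketch)
    L, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= L:
            return k
        k += 1

s, S, lam = 2, 10, 3.0     # assumed values; the sheet only requires s < S
X = [S]                    # start fully stocked (consistent with X0 <= S)
for n in range(20):
    prev = X[-1]
    # (s, S) policy: first replenish to S if the level has dropped to s or below
    level = S if prev <= s else prev
    X.append(level - poisson(lam))
print(X)
```

The loop body is exactly the recursion the problem asks for: Xn = S − Dn if Xn−1 ≤ s, and Xn = Xn−1 − Dn otherwise, which depends on the past only through Xn−1.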
4. Consider the simple random walk and assume that the process starts at the origin. Determine the
n-step transition probabilities p_ij^(n).
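A path of the walk from 0 to j in n steps must contain (n + j)/2 up-steps and (n − j)/2 down-steps, which gives a binomial closed form for the n-step probabilities. The sketch below (an illustrative check, not the required derivation; the default p = 0.5 is an assumption) evaluates it:

```python
from math import comb

def p0j(n, j, p=0.5):
    """P{S_n = j | S_0 = 0} for the simple random walk with up-probability p.
    The probability is 0 unless n + j is even and |j| <= n."""
    if abs(j) > n or (n + j) % 2:
        return 0.0
    k = (n + j) // 2               # number of +1 steps
    return comb(n, k) * p**k * (1 - p)**(n - k)

print(p0j(4, 0))   # C(4,2) / 2^4 = 0.375
```

Probabilities for a general start i follow by spatial homogeneity: p_ij^(n) = p0j(n, j − i).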

5. Consider the Markov chain {Xn } with state space S = {0, 1, 2, 3} and transition probability matrix

       [ 0.8   0    0.2   0  ]
   P = [  0    0     1    0  ]
       [  1    0     0    0  ]
       [ 0.3  0.4    0   0.3 ]

   Find the communicating classes and classify the states according to transient and recurrent states.
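A numerical hint at the class structure (a sanity check, not a proof) can be obtained by raising P to a high power and watching where the probability mass settles:

```python
# Transition matrix from problem 5
P = [[0.8, 0.0, 0.2, 0.0],
     [0.0, 0.0, 1.0, 0.0],
     [1.0, 0.0, 0.0, 0.0],
     [0.3, 0.4, 0.0, 0.3]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

Pn = P
for _ in range(19):            # compute P^20
    Pn = matmul(Pn, P)
# Row of state 3: after many steps the mass has leaked into {0, 2}
print([round(x, 4) for x in Pn[3]])
```

The vanishing columns for states 1 and 3 suggest those states are transient, while the mass trapped in {0, 2} points to a closed (recurrent) class; the actual classification should still be argued from the arrows in the diagram.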

6. Let {Xn , n ≥ 0} be a Markov chain with three states 0, 1, 2 and with transition probability matrix

       [ 1/2  1/2   0  ]
   P = [ 1/3  1/3  1/3 ]
       [  0   1/2  1/2 ]

   and the initial distribution P (X0 = i) = 1/3, i = 0, 1, 2. Compute P {X2 = 2, X0 = 1}, P {X1 ≠ X0 },
   P {X3 = 1 | X1 = 0}, P {X2 = 2, X1 = 1 | X0 = 2}, P {X2 = 1} and P {X0 = 1 | X1 = 1}. Now draw
   the state transition diagram and find the communicating classes. Is the chain reducible, recurrent
   (null or non-null)? What are the periods of the states? Determine the stationary distribution. Is this
   the same as the limiting distribution? If so, why?
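Three of the requested probabilities can be checked in exact arithmetic; the sketch below is a verification aid for hand computations, covering P{X2 = 2, X0 = 1}, P{X1 ≠ X0} and P{X2 = 1}:

```python
from fractions import Fraction as F

# Transition matrix and uniform initial law from problem 6, in exact rationals
P = [[F(1, 2), F(1, 2), F(0)],
     [F(1, 3), F(1, 3), F(1, 3)],
     [F(0),    F(1, 2), F(1, 2)]]
mu0 = [F(1, 3)] * 3

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

P2 = matmul(P, P)              # two-step transition probabilities

# P{X2 = 2, X0 = 1} = mu0[1] * p^(2)_{12}
print(mu0[1] * P2[1][2])
# P{X1 != X0} = sum_i mu0[i] * (1 - p_{ii})
print(sum(mu0[i] * (1 - P[i][i]) for i in range(3)))
# P{X2 = 1} = (mu0 P^2)[1]
print(sum(mu0[i] * P2[i][1] for i in range(3)))
```

The conditional quantities, the class structure, the periods and the stationary distribution are then short hand computations from the same matrix.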