
SIE 321 - Probabilistic Models in Operations Research

Ruiwei Jiang
Spring 2015
Homework 2 Solution

Problem #1 (15pt)
A shop has two identical machines that are operated continuously except when they break down.
Suppose that if the machine is up on day n, it is up on the (n + 1)st day with probability 0.9,
independent of the past. On the other hand, if it is down on the nth day, it stays down on the
(n + 1)st day with probability 0.2, also independent of the past. Define Xn as the number of
machines that are up on day n.
1. (2pt) What is the state space S of the stochastic process {Xn, n ≥ 0}?
2. (5pt) Explain why {Xn, n ≥ 0} is a Markov chain.
3. (4pt+4pt) Write down the transition matrix of {Xn, n ≥ 0}, and draw the transition diagram of {Xn, n ≥ 0}.
Solution:
1. The state space S = {0, 1, 2}, since there can only be 0, 1, or 2 operating machines.
2. According to the definition, {Xn, n ≥ 0} is a Markov chain because (i) it is a discrete-time stochastic process, and (ii) it satisfies the Markovian property, i.e., the distribution of Xn+1 depends only on Xn.
3. To obtain the transition matrix, we discuss the following cases. (Note: the detailed discussion below is for reference only; you may skip it if you can obtain the transition matrix directly.)
(a) P(Xn+1 = 0|Xn = 0) = P(both machines stay down) = (0.2)(0.2) = 0.04.
(b) P(Xn+1 = 1|Xn = 0) = P(one machine starts up, one machine stays down) = (2)(0.2)(1 - 0.2) = 0.32.
(c) P(Xn+1 = 2|Xn = 0) = P(both machines start up) = (1 - 0.2)(1 - 0.2) = 0.64.
(d) P(Xn+1 = 0|Xn = 1) = P(the up machine shuts down, the down machine stays down) = (1 - 0.9)(0.2) = 0.02.
(e) P(Xn+1 = 1|Xn = 1) = P(both machines stay) + P(the up machine shuts down and the down machine starts up) = (0.9)(0.2) + (1 - 0.9)(1 - 0.2) = 0.26.
(f) P(Xn+1 = 2|Xn = 1) = P(the up machine stays up, the down machine starts up) = (0.9)(1 - 0.2) = 0.72.
(g) P(Xn+1 = 0|Xn = 2) = P(both machines shut down) = (1 - 0.9)(1 - 0.9) = 0.01.
(h) P(Xn+1 = 1|Xn = 2) = P(one machine stays up, one machine shuts down) = (2)(0.9)(1 - 0.9) = 0.18.
(i) P(Xn+1 = 2|Xn = 2) = P(both machines stay up) = (0.9)(0.9) = 0.81.

To sum up, the transition matrix is as follows:

        | 0.04  0.32  0.64 |
    P = | 0.02  0.26  0.72 |.
        | 0.01  0.18  0.81 |
Accordingly, the transition diagram is shown in Figure 1.
Figure 1: Transition diagram for problem #1
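The case analysis above can be checked mechanically: the two machines evolve independently, so each entry of P is a sum over the possible next-day outcomes of the individual machines. A minimal Python sketch (the constant and function names are mine, not from the course materials):

```python
from itertools import product

P_UP_STAYS_UP = 0.9      # P(up tomorrow | up today)
P_DOWN_STAYS_DOWN = 0.2  # P(down tomorrow | down today)

def transition_matrix():
    """Build the 3x3 matrix for Xn = number of machines up (states 0, 1, 2)."""
    P = [[0.0] * 3 for _ in range(3)]
    for i in range(3):
        # i machines are up today; True means "up".
        machines = [True] * i + [False] * (2 - i)
        for outcome in product([True, False], repeat=2):
            prob = 1.0
            for up_today, up_tomorrow in zip(machines, outcome):
                if up_today:
                    prob *= P_UP_STAYS_UP if up_tomorrow else 1 - P_UP_STAYS_UP
                else:
                    prob *= 1 - P_DOWN_STAYS_DOWN if up_tomorrow else P_DOWN_STAYS_DOWN
            P[i][sum(outcome)] += prob  # next state = number of machines up
    return P

P = transition_matrix()
```

Each row of the resulting matrix sums to one, which is a quick consistency check on the case analysis.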

Problem #2 (15pt)
Dave's Photography Store stocks a particular model camera that can be ordered weekly. The demand for this camera is random: it is 0 with probability 0.35, 1 with probability 0.40, 2 with probability 0.20, and 3 with probability 0.05. At the end of the week, Dave places an order that is delivered in time for the next opening of the store on Monday. Dave's current policy is to order two (2) cameras if inventory is less than or equal to one (1) and order nothing otherwise. Let Xn be the number of cameras in inventory at the end of week n.
1. (2pt) What is the state space S of the stochastic process {Xn, n ≥ 0}?
2. (2pt) Explain why {Xn, n ≥ 0} is a Markov chain.
3. (4pt+4pt) Write down the transition matrix of {Xn, n ≥ 0}, and draw the transition diagram of {Xn, n ≥ 0}.
4. (3pt) Dave estimates that, at the end of week 0, there is a 60% chance that his inventory will be empty, a 30% chance that his inventory will have only one (1) camera, and a 10% chance that his inventory will have two (2) cameras. What is the probability mass function (PMF) for X3?
Solution:
1. The state space S = {0, 1, 2, 3} because Xn can only be 0, 1, 2 or 3 due to the ordering policy.

2. {Xn, n ≥ 0} is a Markov chain because (i) it is a discrete-time stochastic process, and (ii) it has the Markovian property, i.e., the distribution of Xn+1 depends only on Xn.
3. Let Dn+1 represent the demand of week n + 1. To obtain the transition matrix, we discuss the following cases. (Note: the detailed discussion below is for reference only; you may skip it if you can obtain the transition matrix directly.)
(a) P(Xn+1 = 0|Xn = 0) = P(Dn+1 ≥ 2) = 0.20 + 0.05 = 0.25.
(b) P(Xn+1 = 1|Xn = 0) = P(Dn+1 = 1) = 0.40.
(c) P(Xn+1 = 2|Xn = 0) = P(Dn+1 = 0) = 0.35.
(d) P(Xn+1 = 3|Xn = 0) = 0, since the stock after ordering is only 2.
(e) P(Xn+1 = 0|Xn = 1) = P(Dn+1 = 3) = 0.05.
(f) P(Xn+1 = 1|Xn = 1) = P(Dn+1 = 2) = 0.20.
(g) P(Xn+1 = 2|Xn = 1) = P(Dn+1 = 1) = 0.40.
(h) P(Xn+1 = 3|Xn = 1) = P(Dn+1 = 0) = 0.35.
(i) P(Xn+1 = 0|Xn = 2) = P(Dn+1 ≥ 2) = 0.20 + 0.05 = 0.25.
(j) P(Xn+1 = 1|Xn = 2) = P(Dn+1 = 1) = 0.40.
(k) P(Xn+1 = 2|Xn = 2) = P(Dn+1 = 0) = 0.35.
(l) P(Xn+1 = 3|Xn = 2) = 0, since no order is placed when Xn = 2.
(m) P(Xn+1 = 0|Xn = 3) = P(Dn+1 = 3) = 0.05.
(n) P(Xn+1 = 1|Xn = 3) = P(Dn+1 = 2) = 0.20.
(o) P(Xn+1 = 2|Xn = 3) = P(Dn+1 = 1) = 0.40.
(p) P(Xn+1 = 3|Xn = 3) = P(Dn+1 = 0) = 0.35.
To sum up, the transition matrix is as follows:

        | 0.25  0.40  0.35  0    |
    P = | 0.05  0.20  0.40  0.35 |.
        | 0.25  0.40  0.35  0    |
        | 0.05  0.20  0.40  0.35 |

Accordingly, the transition diagram is shown in Figure 2.
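The same bookkeeping can be automated: next week's stock is the current inventory plus the ordered quantity, and unmet demand is lost. A small Python sketch that builds P directly from the demand PMF and the order policy (names are illustrative, not from the assignment):

```python
DEMAND_PMF = {0: 0.35, 1: 0.40, 2: 0.20, 3: 0.05}

def order_quantity(inventory):
    """Dave's policy: order 2 cameras when inventory <= 1, else nothing."""
    return 2 if inventory <= 1 else 0

def transition_matrix(max_state=3):
    """P[i][j] = P(Xn+1 = j | Xn = i) for inventory states 0..max_state."""
    P = [[0.0] * (max_state + 1) for _ in range(max_state + 1)]
    for i in range(max_state + 1):
        stock = i + order_quantity(i)  # cameras available next week
        for demand, prob in DEMAND_PMF.items():
            j = max(stock - demand, 0)  # excess demand is lost, not backordered
            P[i][j] += prob
    return P

P = transition_matrix()
```

This reproduces the case analysis above row by row and makes it easy to re-derive P if the policy or demand distribution changes.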


4. The PMF of X3 is

    a(3) = a(0) P^3 = [0.60  0.30  0.10  0] P^3 = [0.157  0.307  0.373  0.163].

That is, P(X3 = 0) = 0.157, P(X3 = 1) = 0.307, P(X3 = 2) = 0.373, and P(X3 = 3) = 0.163.

Figure 2: Transition diagram for problem #2
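The computation a(3) = a(0) P^3 is easy to reproduce numerically by applying a(n+1) = a(n) P three times. A minimal Python sketch (pure standard library; the helper name `step` is mine):

```python
P = [[0.25, 0.40, 0.35, 0.00],
     [0.05, 0.20, 0.40, 0.35],
     [0.25, 0.40, 0.35, 0.00],
     [0.05, 0.20, 0.40, 0.35]]

def step(dist, P):
    """One step of the chain: returns a(n+1) = a(n) P as a row vector."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

a = [0.60, 0.30, 0.10, 0.00]  # PMF of X0
for _ in range(3):
    a = step(a, P)
# a is now the PMF of X3, approximately [0.157, 0.307, 0.373, 0.163]
```

The values reported in the solution are these entries rounded to three decimals.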


Problem #3 (12pt)
An organization has N employees where N is a large number. Each employee has one of three
possible job classifications and changes classifications (independently) every week according to a
Markov chain. Specifically,
If one is in class 1, then she stays in class 1 with prob. 0.7, moves up to class 2 with prob.
0.2, and moves up to class 3 with prob. 0.1.
If one is in class 2, then she moves down to class 1 with prob. 0.2, stays in class 2 with prob.
0.6, and moves up to class 3 with prob. 0.2.
If one is in class 3, then she moves down to class 1 with prob. 0.1, moves down to class 2
with prob. 0.4, stays in class 3 with prob. 0.5.
1. (2pt+3pt+2pt) Model the change of job classifications as a Markov chain. Define the states of this Markov chain, write down the transition matrix, and draw the transition diagram.
2. (5pt) Suppose one is in class 1 in week 1. What is the probability that she is in class 3 in week 3?
Solution:
1. Let Xn represent one's class in week n. Then {Xn, n ≥ 0} is a Markov chain with transition matrix

        | 0.7  0.2  0.1 |
    P = | 0.2  0.6  0.2 |,
        | 0.1  0.4  0.5 |

and transition diagram shown in Figure 3.

Figure 3: Transition diagram for problem #3


2. (Note: you can also compute P^2 to obtain P(X3 = 3|X1 = 1).) We have

    P(X3 = 3|X1 = 1) = Σ_{i=1}^{3} P(X3 = 3, X2 = i|X1 = 1)
                     = Σ_{i=1}^{3} P(X3 = 3|X2 = i) P(X2 = i|X1 = 1)
                     = (0.7)(0.1) + (0.2)(0.2) + (0.1)(0.5)
                     = 0.16.
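As the note suggests, the same answer is the (1, 3) entry of P^2. A quick sketch in plain Python (the helper name `matmul` is mine):

```python
P = [[0.7, 0.2, 0.1],
     [0.2, 0.6, 0.2],
     [0.1, 0.4, 0.5]]

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P2 = matmul(P, P)
# Two-step probability from class 1 to class 3; with 0-indexed lists,
# the (1, 3) entry of P^2 is P2[0][2].
prob = P2[0][2]  # 0.16
```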

Problem #4 (15pt)
A machine operates continuously except when it breaks down. Suppose that the state of the machine on day n depends only on its states on days n - 1 and n - 2. In particular, the probability of being up on day n is
0.4 if it was up on days n - 1 and n - 2.
0.7 if it was up on day n - 1 and down on day n - 2.
0.2 if it was down on day n - 1 and up on day n - 2.
0.8 if it was down on days n - 1 and n - 2.
Let Xn be the state of the machine on day n, i.e., Xn = 1 if the machine is up and Xn = 0 if the
machine is down.
1. (2pt) Is {Xn, n ≥ 0} a Markov chain and why? If yes, write down its transition matrix and transition diagram.
2. (2pt+3pt+3pt) Is {(Xn, Xn+1), n ≥ 0} a Markov chain and why? If yes, write down its transition matrix and transition diagram.
3. (5pt) Suppose that X0 = X1 = 1. Compute the probability P(X1 = 1, X2 = 1, X4 = 0).
3. (5pt) Suppose that X0 = X1 = 1. Compute the probability P(X1 = 1, X2 = 1, X4 = 0).

Solution:
1. {Xn, n ≥ 0} is not a Markov chain because it does not satisfy the Markovian property: the distribution of Xn+1 depends on Xn-1 as well as Xn.
2. {(Xn, Xn+1), n ≥ 0} is a Markov chain because it is a discrete-time stochastic process satisfying the Markovian property, i.e., the distribution of (Xn, Xn+1) depends only on (Xn-1, Xn). (Note: other explanations also make sense as long as they imply that both Xn and Xn+1 depend only on (Xn-1, Xn).)
To obtain the transition matrix, we discuss the following cases. (Note: the detailed discussion below is for reference only; you may skip it if you can obtain the transition matrix directly.)
(a) P(Xn+1 = 1, Xn = 1|Xn = 1, Xn-1 = 1) = P(Xn+1 = 1|Xn = 1, Xn-1 = 1) = 0.4.
(b) P(Xn+1 = 0, Xn = 1|Xn = 1, Xn-1 = 1) = 1 - P(Xn+1 = 1|Xn = 1, Xn-1 = 1) = 0.6.
(c) P(Xn+1 = 1, Xn = 1|Xn = 1, Xn-1 = 0) = P(Xn+1 = 1|Xn = 1, Xn-1 = 0) = 0.7.
(d) P(Xn+1 = 0, Xn = 1|Xn = 1, Xn-1 = 0) = 1 - P(Xn+1 = 1|Xn = 1, Xn-1 = 0) = 0.3.
(e) P(Xn+1 = 1, Xn = 0|Xn = 0, Xn-1 = 1) = P(Xn+1 = 1|Xn = 0, Xn-1 = 1) = 0.2.
(f) P(Xn+1 = 0, Xn = 0|Xn = 0, Xn-1 = 1) = 1 - P(Xn+1 = 1|Xn = 0, Xn-1 = 1) = 0.8.
(g) P(Xn+1 = 1, Xn = 0|Xn = 0, Xn-1 = 0) = P(Xn+1 = 1|Xn = 0, Xn-1 = 0) = 0.8.
(h) P(Xn+1 = 0, Xn = 0|Xn = 0, Xn-1 = 0) = 1 - P(Xn+1 = 1|Xn = 0, Xn-1 = 0) = 0.2.
All the other conditional probabilities are zero. To sum up, with the states ordered (1,1), (0,1), (1,0), (0,0), the transition matrix is as follows:

        | 0.4  0    0.6  0   |
    P = | 0.7  0    0.3  0   |.
        | 0    0.2  0    0.8 |
        | 0    0.8  0    0.2 |

Accordingly, the transition diagram is shown in Figure 4.
Figure 4: Transition diagram for problem #4


3. Given X0 = X1 = 1, we have

    P(X1 = 1, X2 = 1, X4 = 0)
        = Σ_{i=0}^{1} P(X1 = 1, X2 = 1, X3 = i, X4 = 0)
        = Σ_{i=0}^{1} P(X2 = 1|X0 = 1, X1 = 1) P(X3 = i|X1 = 1, X2 = 1) P(X4 = 0|X2 = 1, X3 = i)
        = (0.4)(0.6)(0.8) + (0.4)(0.4)(0.6)
        = 0.288.
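The two-term sum above can be reproduced by enumerating the unobserved value of X3. A small Python sketch (the dictionary `P_UP` and helper `p_next` are my own illustrative names):

```python
# P(up on day n | states on days n-1 and n-2), keyed by (x_{n-1}, x_{n-2}).
P_UP = {(1, 1): 0.4, (1, 0): 0.7, (0, 1): 0.2, (0, 0): 0.8}

def p_next(x_next, x_prev, x_prev2):
    """P(Xn = x_next | Xn-1 = x_prev, Xn-2 = x_prev2)."""
    p = P_UP[(x_prev, x_prev2)]
    return p if x_next == 1 else 1 - p

# Given X0 = X1 = 1, sum over the unobserved X3:
total = 0.0
for x3 in (0, 1):
    total += (p_next(1, 1, 1)       # X2 = 1 given X1 = 1, X0 = 1
              * p_next(x3, 1, 1)    # X3 = x3 given X2 = 1, X1 = 1
              * p_next(0, x3, 1))   # X4 = 0 given X3 = x3, X2 = 1
# total == 0.288
```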

Problem #5 (14pt)
A Markov chain {Xn, n ≥ 0} with states 0, 1, 2 has the transition probability matrix

        | 1/2  a    1/6 |
    P = | 0    1/3  b   |.
        | 1/2  0    c   |

1. (4pt) Decide the values of a, b, and c.
2. (4pt) If P(X0 = 0) = P(X0 = 1) = 1/4, find the PMF of X3.
3. (6pt) If P(X0 = 0) = P(X0 = 1) = 1/4, find E[X3].
Solution:
1. Due to the law of total probability, we have 1/2 + a + 1/6 = 1, and hence a = 1/3. Similarly, we have b = 2/3 and c = 1/2.

2. Since P(X0 = 0) = P(X0 = 1) = 1/4, we have P(X0 = 2) = 1 - P(X0 = 0) - P(X0 = 1) = 1/2.


Hence, the PMF of X3 is

    a(3) = a(0) P^3 = [1/4  1/4  1/2] P^3 = [0.41  0.20  0.39].

That is, P(X3 = 0) = 0.41, P(X3 = 1) = 0.20, and P(X3 = 2) = 0.39.

3. By part 2, we have
E[X3 ] = (0)P(X3 = 0) + (1)P(X3 = 1) + (2)P(X3 = 2) = 0.98.

Problem #6 (14pt)
Coin 1 has probability 0.6 of coming up heads, and coin 2 has probability 0.5 of coming up heads.
Suppose that we flip a coin every minute. If the coin comes up heads, we will flip this coin again
the next minute. If the coin comes up tails, we will put this coin aside and flip the other coin the
next minute.
1. (3pt) Define a Markov chain with 2 states which will help us to describe which coin we flip in each minute, i.e., define {Xn : n ≥ 1} and what Xn represents.
2. (5pt+3pt) Write down the transition matrix and draw the transition diagram of this Markov chain.
3. (3pt) If we start the process with coin 1, what is the probability that coin 2 is used on the fifth flip?
Solution:
1. Let Xn represent which coin we use in minute n, so that Xn = i means that we use coin i, for i = 1, 2. {Xn, n ≥ 0} is a discrete-time stochastic process, and it has the Markovian property because the distribution of Xn+1 depends only on Xn. Hence, {Xn, n ≥ 0} is a Markov chain.
2. Let Yi represent the result of flipping coin i for i = 1, 2, i.e., Yi = H or Yi = T. To obtain the transition matrix, we discuss the following cases. (Note: the detailed discussion below is for reference only; you may skip it if you can obtain the transition matrix directly.)
(a) P(Xn+1 = 1|Xn = 1) = P(Y1 = H) = 0.6.
(b) P(Xn+1 = 2|Xn = 1) = P(Y1 = T ) = 0.4.
(c) P(Xn+1 = 1|Xn = 2) = P(Y2 = T ) = 0.5.
(d) P(Xn+1 = 2|Xn = 2) = P(Y2 = H) = 0.5.
To sum up, the transition matrix is as follows:

    P = | 0.6  0.4 |.
        | 0.5  0.5 |
Accordingly, the transition diagram is shown in Figure 5.

Figure 5: Transition diagram for problem #6


3. The 4-step transition matrix is

          | 0.5556  0.4444 |
    P^4 = | 0.5555  0.4445 |.

Hence, we have

    P(X5 = 2|X1 = 1) = (P^4)_{12} = 0.4444.

(Note: we did not specify which step "start the process with coin 1" refers to, and so you can also consider the 5-step transition matrix, which gives the same result.)
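The matrix power is easy to reproduce by repeated multiplication. A plain-Python sketch (the helper name `matmul` is illustrative):

```python
P = [[0.6, 0.4],
     [0.5, 0.5]]

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P4 = P
for _ in range(3):
    P4 = matmul(P4, P)  # P4 becomes P^4

# The (1,2) entry in the solution's 1-indexed notation is P4[0][1]:
prob = P4[0][1]  # P(X5 = 2 | X1 = 1) = 0.4444
```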

Problem #7 (15pt)
An individual possesses 2 umbrellas which he employs in going from his home to office, and vice
versa. If he is at home (the office) at the beginning (end) of a day and it is raining, then he will
take an umbrella with him to the office (home), provided there is one to be taken. If it is raining
and he has no umbrellas to be taken, then he gets wet. If it is not raining, then he never takes
an umbrella. Assume that, independent of the past, it rains at the beginning (end) of a day with
probability 0.5.
1. (3pt) Define a Markov chain with 3 states which will help us to analyze how many umbrellas he has at home and at the office at the beginning of each day, i.e., define {Xn : n ≥ 1} and what Xn represents.
2. (5pt+3pt) Write down the transition matrix and draw the transition diagram of this Markov chain.
3. (5pt) If he has 1 umbrella at home at the beginning of day 0, what is the probability that he has 2 umbrellas at home at the beginning of day 2?
Solution:
1. Let Xn represent the number of umbrellas the man has at home at the beginning of day n; then Xn can only be 0, 1, or 2. {Xn, n ≥ 0} is a Markov chain because (i) it is a discrete-time stochastic process, and (ii) it has the Markovian property, i.e., the distribution of Xn+1 depends only on Xn.
2. We first write down the transition matrix for {Xn, n ≥ 0}. To this end, we let RB and RE indicate whether it rains at the beginning and at the end of a day, respectively. That is, RB = 1 if it rains at the beginning of a day and RB = 0 otherwise; likewise, RE = 1 if it rains at the end of a day and RE = 0 otherwise.
We discuss the following cases. (Note: the detailed discussion below is for reference only; you may skip it if you can obtain the transition matrix directly.)
(a) P(Xn+1 = 0|Xn = 0) = P(RE = 0) = 0.5.
(b) P(Xn+1 = 1|Xn = 0) = P(RE = 1) = 0.5.
(c) P(Xn+1 = 2|Xn = 0) = 0.
(d) P(Xn+1 = 0|Xn = 1) = P(RB = 1, RE = 0) = 0.25.
(e) P(Xn+1 = 1|Xn = 1) = P(RB = 1, RE = 1) + P(RB = 0, RE = 0) = 0.5.
(f) P(Xn+1 = 2|Xn = 1) = P(RB = 0, RE = 1) = 0.25.
(g) P(Xn+1 = 0|Xn = 2) = 0.
(h) P(Xn+1 = 1|Xn = 2) = P(RB = 1, RE = 0) = 0.25.
(i) P(Xn+1 = 2|Xn = 2) = P(RB = 0) + P(RB = 1, RE = 1) = 0.75.
To sum up, the transition matrix is as follows:

        | 0.5   0.5   0    |
    P = | 0.25  0.5   0.25 |.
        | 0     0.25  0.75 |
Accordingly, the transition diagram is shown in Figure 6.
Figure 6: Transition diagram for problem #7


3. Since

          | 0.375   0.5     0.125  |
    P^2 = | 0.25    0.4375  0.3125 |,
          | 0.0625  0.3125  0.625  |

we have P(X2 = 2|X0 = 1) = 0.3125.
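The entry can be verified with a quick matrix multiplication in plain Python (the helper name `matmul` is my own):

```python
P = [[0.5,  0.5,  0.0],
     [0.25, 0.5,  0.25],
     [0.0,  0.25, 0.75]]

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P2 = matmul(P, P)
# Starting with 1 umbrella at home (row index 1), the probability of
# having 2 at home two mornings later (column index 2):
prob = P2[1][2]  # 0.3125
```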
