IE 325 - Recitation 3 Partial Solutions
Recitation 3
Partial Solutions
Spring 2020
Question 1. A miner is trapped in a mine containing three doors. The first door leads
to a tunnel that takes him to safety after two hours of travel. The second door leads to a
tunnel that returns him to the mine after three hours of travel. The third door leads to
a tunnel that returns him to the mine after five hours of travel. Assuming that the miner is at all
times equally likely to choose any one of the doors,
(a) what is the expected number of doors the miner chooses until he reaches safety?
(b) what is the expected length of time until the miner reaches safety?
Solution of Question 1. Let Xn be the number of the door the miner chooses in the n-th
trial, where n = 1, 2, · · · . Then Xn ∈ {1, 2, 3} (or {D1, D2, D3} to emphasize). If he
chooses door 1, then he reaches safety and stays in the safe zone. If he chooses door 2
or door 3, then he returns to the mine and in his next trial he again chooses any of the
three doors with equal probability. Then (Xn)n≥1 is a Markov chain with the transition
matrix given by
         D1    D2    D3
    D1    1     0     0
P = D2   1/3   1/3   1/3
    D3   1/3   1/3   1/3
Since he is initially at the mine and chooses one of the doors with equal probabilities, we
have
P(X1 = 1) = P(X1 = 2) = P(X1 = 3) = 1/3.
(a) Let N be the number of trials he needs to make till reaching safety. Let µi denote
E[N |X1 = i] for i = 1, 2, 3. Notice that µ1 = E[N |X1 = 1] = 1 is the boundary
value.
E[N] = E[N|X1 = 1]P(X1 = 1) + E[N|X1 = 2]P(X1 = 2) + E[N|X1 = 3]P(X1 = 3)
     = (1/3)µ1 + (1/3)µ2 + (1/3)µ3 = 1/3 + (1/3)µ2 + (1/3)µ3.
Now we compute µ2:
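One way to carry out this step (a sketch, since the computation is not reproduced here): given X1 = 2, one trial is used and, by the Markov property, the expected number of remaining trials is E[N] itself. Hence

µ2 = 1 + E[N] = 1 + (1/3)µ1 + (1/3)µ2 + (1/3)µ3,

and the same argument gives µ3 = 1 + E[N], so µ2 = µ3. Substituting µ1 = 1 and solving yields µ2 = µ3 = 4, and therefore E[N] = 1/3 + (1/3)(4) + (1/3)(4) = 3 trials.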
(b) Let T be the time spent till reaching safety. Let vi denote E[T |X1 = i] for i = 1, 2, 3.
Notice that v1 = E[T |X1 = 1] = 2 is the boundary value.
E[T] = E[T|X1 = 1]P(X1 = 1) + E[T|X1 = 2]P(X1 = 2) + E[T|X1 = 3]P(X1 = 3)
     = (1/3)v1 + (1/3)v2 + (1/3)v3 = 2/3 + (1/3)v2 + (1/3)v3.
Now we compute v2:
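One way to carry out this step (a sketch, conditioning on the first door chosen): given X1 = 2, three hours are spent in the second tunnel and, by the Markov property, the expected remaining travel time is E[T] itself, so

v2 = 3 + E[T] = 3 + (1/3)v1 + (1/3)v2 + (1/3)v3.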
Now we compute v3:
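Similarly (a sketch): given X1 = 3, five hours are spent in the third tunnel before the situation starts afresh, so

v3 = 5 + E[T] = 5 + (1/3)v1 + (1/3)v2 + (1/3)v3.

Writing t = E[T] and substituting v1 = 2, v2 = 3 + t, v3 = 5 + t into t = 2/3 + (1/3)v2 + (1/3)v3 gives t = (2 + (3 + t) + (5 + t))/3, hence t = 10. So the expected length of time until the miner reaches safety is E[T] = 10 hours (with v2 = 13 and v3 = 15).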
Question 2*. Consider the Markov chain whose transition probability matrix is given
by
          0     1     2     3
     0   0.25  0.25  0.25  0.25
P =  1   0.1   0.6   0.1   0.2
     2   0.2   0.3   0.4   0.1
     3   0.25  0.25  0.25  0.25
Notice that the Markov chain will not eventually be absorbed in state 0 or in state 3, yet
we are interested in the "stopped" process: let T be the first time the process is at state 0
or at state 3, that is, T = min{n : Xn = 0 or Xn = 3}.
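As a side illustration of this stopped process (not part of the original solution), the quantities P(XT = 0 | X0 = i) and E[T | X0 = i] can be estimated by simulation. The short Python sketch below is only a sanity-check aid; the function name and the sample size are arbitrary choices.

    import random

    # Transition matrix of the chain, rows and columns indexed by states 0, 1, 2, 3.
    P = [[0.25, 0.25, 0.25, 0.25],
         [0.10, 0.60, 0.10, 0.20],
         [0.20, 0.30, 0.40, 0.10],
         [0.25, 0.25, 0.25, 0.25]]

    def run_stopped_chain(start):
        # Simulate the chain from 'start' until it first visits state 0 or 3.
        # Returns (T, X_T); note that T = 0 if the chain already starts in {0, 3}.
        state, t = start, 0
        while state not in (0, 3):
            state = random.choices(range(4), weights=P[state])[0]
            t += 1
        return t, state

    # Estimate P(X_T = 0 | X_0 = i) and E[T | X_0 = i] for each starting state i.
    for i in range(4):
        samples = [run_stopped_chain(i) for _ in range(100_000)]
        p_hit_0 = sum(1 for _, x in samples if x == 0) / len(samples)
        mean_T = sum(t for t, _ in samples) / len(samples)
        print("start %d: P(X_T = 0) ~ %.3f, E[T] ~ %.3f" % (i, p_hit_0, mean_T))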
(c) Given that the Markov chain starts at state 1, what is the mean time until the first
time the process leaves state 1?
Solution of Question 2* (a, b, c). The solutions for these parts are the same as for the above
problem.
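For part (c), a brief sketch (using only the given matrix): starting from state 1, at each step the chain stays at state 1 with probability P11 = 0.6 and leaves with probability 0.4, so the number of steps until the process first leaves state 1 is geometric with success probability 0.4, and the mean time until it leaves state 1 is 1/0.4 = 2.5.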
(e) P(XT = 0|X1 = 0, X0 = 3) = 0 and P(XT = 0|X1 = 1, X0 = 3) = 0, since X0 = 3 already
gives T = 0 and XT = 3. In particular,