
MTL 106 (Introduction to Probability Theory and Stochastic Processes)

Tutorial Sheet No. 8 (DTMC)


1. Suppose that a machine can be in one of two states {0, 1}, where 0 means working and 1 means out of order on a given day. The probability that the machine is working on a particular day depends on its state during the two previous days. Specifically, assume that P(X(n+1) = 0 | X(n−1) = j, X(n) = k) = qjk, j, k = 0, 1, where X(n) denotes the state of the machine on the nth day.
(a) Show that {X(n), n = 1, 2, . . .} is not a discrete-time Markov chain.
(b) Define a new state space by taking the pairs (j, k), where j and k are 0 or 1: the machine is in state (j, k) on day n if it is in state j on day (n − 1) and in state k on day n. Show that the resulting process on this state space is a discrete-time Markov chain.
(c) Given that the machine was working on Monday and Tuesday, find the probability that it will be working on Thursday.
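Part (c) can be sanity-checked numerically with the pair chain from part (b). The sketch below uses illustrative values for the qjk (the sheet leaves them symbolic) and compares the two-step pair-chain computation against the closed form q00² + (1 − q00)q01:

```python
import numpy as np

# Illustrative q_jk values (assumptions; the sheet keeps them symbolic).
q = {(0, 0): 0.9, (0, 1): 0.7, (1, 0): 0.6, (1, 1): 0.4}

pairs = [(0, 0), (0, 1), (1, 0), (1, 1)]
P = np.zeros((4, 4))
for a, (j, k) in enumerate(pairs):
    for b, (k2, l) in enumerate(pairs):
        if k2 == k:  # the next pair must begin with today's state
            P[a, b] = q[(j, k)] if l == 0 else 1 - q[(j, k)]

# Start in pair (0, 0): working on Monday and Tuesday.
# Two steps of the pair chain reach the (Wednesday, Thursday) pair.
dist = np.zeros(4)
dist[pairs.index((0, 0))] = 1.0
dist = dist @ P @ P

# Working on Thursday means the pair's second coordinate is 0.
p_working_thursday = dist[pairs.index((0, 0))] + dist[pairs.index((1, 0))]
```

With these values both routes give 0.88, matching q00² + (1 − q00)q01.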

2. A particle starts at the origin and moves to and fro on a straight line. At each move it jumps either 1 unit to the right or 1 unit to the left, each with probability 1/3. With probability 1/3, the particle stays at the same position. Model this as a stochastic process: write down the state space and parameter space, and verify whether the process satisfies the Markov property.

3. Assume {Yn, n = 0, 1, . . .} is a sequence of i.i.d. random variables taking values in Z+, with probability distribution P[Yn = i] = ai (i = 0, 1, . . .). Let Xn+1 = (Xn − 1)+ + Yn. Prove that {Xn, n = 0, 1, . . .} is a Markov chain and write down its transition probability matrix.
4. Let X0 be an integer-valued random variable with P(X0 = 0) = 1, independent of the i.i.d. sequence Z1, Z2, . . ., where P(Zn = 1) = p, P(Zn = −1) = q and P(Zn = 0) = 1 − (p + q). Let Xn = max{0, Xn−1 + Zn}, n = 1, 2, . . .. Prove that {Xn, n = 0, 1, . . .} is a DTMC, and write its one-step TPM or draw the state transition diagram for this Markov chain.

5. The owner of the local one-chair barbershop is thinking about expanding because there appear to be too many people waiting. Observations show that, in the time required to cut one person's hair, there may be 0, 1 or 2 arrivals, with probabilities 0.3, 0.4 and 0.3 respectively. The shop can accommodate at most six customers at a time. Let X(t) be the total number of customers in the shop at time t and Xn = X(tn+) the number of customers at the moment the nth person's haircut is finished. Using the assumption of i.i.d. arrivals, show that {Xn, n = 1, 2, . . .} is a Markov chain. Find its one-step transition probability matrix.
6. Assume {Yn, n = 0, 1, . . .} is a sequence of i.i.d. random variables taking values in Z+, with probability distribution P[Yn = i] = ai (i = 0, 1, . . .). Let Xn+1 = (Xn − 1)+ + Yn. Prove that {Xn, n = 0, 1, . . .} is a Markov chain and write down its transition probability matrix.
7. Let X0 be an integer-valued random variable with P(X0 = 0) = 1, independent of the i.i.d. sequence Z1, Z2, . . ., where P(Zn = 1) = p, P(Zn = −1) = q and P(Zn = 0) = 1 − (p + q). Let Xn = max{0, Xn−1 + Zn}, n = 1, 2, . . .. Prove that {Xn, n = 0, 1, . . .} is a DTMC. Write the one-step TPM or draw the state transition diagram for this Markov chain.

8. The one-step transition probability matrix of a DTMC {Xn, n = 0, 1, . . .} with state space {1, 2, 3} is

    P = [ 0.1 0.5 0.4 ]
        [ 0.6 0.2 0.2 ]
        [ 0.3 0.4 0.3 ]

and the initial distribution is π0 = (0.7, 0.2, 0.1). Find

(a) P(X2 = 3)
(b) P(X3 = 2, X2 = 3, X1 = 3, X0 = 2)
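Both parts reduce to short matrix computations, which can be checked numerically (a sketch; states 1, 2, 3 are indexed 0, 1, 2):

```python
import numpy as np

P = np.array([[0.1, 0.5, 0.4],
              [0.6, 0.2, 0.2],
              [0.3, 0.4, 0.3]])
pi0 = np.array([0.7, 0.2, 0.1])

# (a) the distribution of X2 is pi0 @ P @ P; state 3 is index 2
p_a = (pi0 @ P @ P)[2]

# (b) chain rule along the path 2 -> 3 -> 3 -> 2
p_b = pi0[1] * P[1, 2] * P[2, 2] * P[2, 1]
```

This gives p_a = 0.279 and p_b = 0.0048.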

9. Consider the DTMC {Xn, n = 0, 1, . . .} with three states S = {1, 2, 3} that has the following transition matrix:

    P = [ 1/2 1/4 1/4 ]
        [ 1/3  0  2/3 ]
        [ 1/2 1/2  0  ]

Assume that P(X0 = 1) = P(X0 = 2) = 1/4 and P(X0 = 3) = 1/2. Find P(X0 = 3, X1 = 2, X2 = 1).
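The required probability is just the product along the path 3 → 2 → 1, which exact arithmetic confirms (a sketch; states 1, 2, 3 are indexed 0, 1, 2):

```python
from fractions import Fraction as F

P = [[F(1, 2), F(1, 4), F(1, 4)],
     [F(1, 3), F(0),    F(2, 3)],
     [F(1, 2), F(1, 2), F(0)]]
pi0 = {1: F(1, 4), 2: F(1, 4), 3: F(1, 2)}

# Multiply along the path 3 -> 2 -> 1
prob = pi0[3] * P[2][1] * P[1][0]
print(prob)  # → 1/12
```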
10. For a Markov chain {Xn, n = 0, 1, . . .} with state space E = {0, 1, 2, 3, 4} and transition probability matrix P given below, classify the states of the chain. Also determine the closed communicating classes.

    (a) P = [ 1/2  0   0  1/2  0  ]       (b) P = [  0  1/3  0  1/3 1/3 ]
            [ 1/2 1/2  0   0   0  ]               [ 1/3 1/3  0  1/3  0  ]
            [ 1/4 1/2  0   0  1/4 ]               [  0   0  2/3  0  1/3 ]
            [  0   0   0   1   0  ]               [  0  1/4 1/4 1/4 1/4 ]
            [  0   0  1/2 1/4 1/4 ]               [  0   0  1/3  0  2/3 ]
11. Consider the chain with transition matrix

    P = [ 1/6 1/3 1/2  0  ]
        [ 1/2 1/2  0   0  ]
        [ 1/6 1/3 1/2  0  ]
        [  0  1/6 1/3 1/2 ]

Identify the closed class.

12. Show that if a Markov chain is irreducible and pii > 0 for some state i, then the chain is aperiodic.
13. Consider the Markov chain {Xn, n = 0, 1, 2, . . .} with S = {0, 1, 2} and transition probability matrix given as:

    P = [ 1/4 1/2 1/4 ]
        [ 1/3  0  2/3 ]
        [ 1/2  0  1/2 ]

(a) Show that the chain is irreducible and aperiodic, and find the stationary distribution.
(b) Assume X0 = 1 and let R be the first time that the chain returns to state 1, i.e., R = min{m ≥ 1 : Xm = 1}. Find E[R|X0 = 1].
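Both parts can be checked numerically: solve the balance equations πP = π with the normalization ∑πi = 1, and use the positive-recurrence identity E[R | X0 = 1] = 1/π1 (a sketch, not a substitute for the proof the problem asks for):

```python
import numpy as np

P = np.array([[1/4, 1/2, 1/4],
              [1/3, 0,   2/3],
              [1/2, 0,   1/2]])

# pi P = pi with sum(pi) = 1: drop one (redundant) balance equation
# and replace it by the normalization condition.
A = np.vstack([(P.T - np.eye(3))[:-1], np.ones(3)])
b = np.array([0.0, 0.0, 1.0])
pi = np.linalg.solve(A, b)      # pi = (3/8, 3/16, 7/16)

# Mean return time to state 1 for a positive-recurrent chain.
mean_return = 1 / pi[1]          # 16/3
```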


 
14. Consider a DTMC with transition probability matrix

    P = [ 0.4 0.6  0  ]
        [ 0.4  0  0.6 ]
        [  0  0.4 0.6 ]

Find the stationary distribution for this Markov chain.
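The stationary distribution solves πP = π with ∑πi = 1; a quick numerical check of the hand calculation:

```python
import numpy as np

P = np.array([[0.4, 0.6, 0.0],
              [0.4, 0.0, 0.6],
              [0.0, 0.4, 0.6]])

# Replace one redundant balance equation by the normalization condition.
A = np.vstack([(P.T - np.eye(3))[:-1], np.ones(3)])
pi = np.linalg.solve(A, np.array([0.0, 0.0, 1.0]))
# The balance equations give pi = (4/19, 6/19, 9/19).
```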
15. Consider a Markov chain with state space S = {0, 1, 2, 3, 4} and transition probability matrix

    P = [  1    0    0    0    0   ]
        [  0   0.25 0.75  0    0   ]
        [  0   0.5  0.5   0    0   ]
        [ 0.25 0.25  0   0.25 0.25 ]
        [  0    0    0   0.5  0.5  ]

(a) Classify the states of the chain.
(b) Determine the stationary distribution for states 1 and 2.
(c) For the transient states, calculate µij, the expected number of visits to transient state j, given that the process started in transient state i.
(d) Find P(X5 = 2 | X3 = 1).
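For part (d), time homogeneity means P(X5 = 2 | X3 = 1) is the (1, 2) entry of P²; a quick check:

```python
import numpy as np

P = np.array([[1,    0,    0,    0,    0   ],
              [0,    0.25, 0.75, 0,    0   ],
              [0,    0.5,  0.5,  0,    0   ],
              [0.25, 0.25, 0,    0.25, 0.25],
              [0,    0,    0,    0.5,  0.5 ]])

# Two-step transition probability from state 1 to state 2.
p = np.linalg.matrix_power(P, 2)[1, 2]
# By hand: 0.25 * 0.75 + 0.75 * 0.5 = 0.5625
```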
16. A mathematics professor has 2 umbrellas. He keeps one of them at home and the other in the office. Every morning, when he leaves home, he checks the weather and takes an umbrella with him if it rains; if it rains and both umbrellas are at the office, he gets wet. The same procedure is repeated in the afternoon when he leaves the office to go home. The professor lives in a tropical region, so the chance of rain in the afternoon is higher than in the morning: it is 1/5 in the afternoon and 1/20 in the morning. Whether it rains or not is independent of whether it rained the last time he checked. On day 0, there is one umbrella at home and one in the office. Note that there are two trips each day. What is the expected number of days that will pass before the professor gets wet? What is the probability that the first time he gets wet is on his way home from the office?

17. A random walker walks on the integers. At each step, he moves one step right with probability 1/3, two steps left with probability 1/3, and stays in place with probability 1/3. What is the probability that the walker will return to his original location after three steps?
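With only 3³ = 27 equally likely step sequences, the answer can be verified by direct enumeration:

```python
from fractions import Fraction
from itertools import product

# Each step is +1, -2, or 0, each with probability 1/3.
steps = [1, -2, 0]
favorable = sum(1 for path in product(steps, repeat=3) if sum(path) == 0)
prob = Fraction(favorable, 3**3)
# Favorable paths: (0,0,0) and the 3 orderings of (+1,+1,-2), so prob = 4/27.
```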
18. Show that the DTMC with countable state space S = {. . . , −2, −1, 0, 1, 2, . . .} and transition probabilities pi,i+1 = 1 − p, pi,i−1 = p, i ∈ S, is recurrent when p = 1/2 and transient otherwise.

19. Assume that P = (pij) is a finite irreducible doubly stochastic matrix, i.e., ∑_{k=1}^m pkj = ∑_{k=1}^m pik = 1 for each i, j = 1, 2, . . . , m. Prove that the stationary distribution for this Markov chain is the uniform distribution on {1, 2, . . . , m}.
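The key observation is that the columns of a doubly stochastic matrix sum to 1, so the uniform vector satisfies πP = π directly. A small numerical illustration (the matrix entries are illustrative, not from the sheet):

```python
import numpy as np

# A 3x3 doubly stochastic matrix: every row AND every column sums to 1.
P = np.array([[0.2, 0.3, 0.5],
              [0.5, 0.2, 0.3],
              [0.3, 0.5, 0.2]])

pi = np.full(3, 1/3)             # uniform distribution
stationary = np.allclose(pi @ P, pi)  # (pi P)_j = (1/m) * (column sum) = 1/m
```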

20. Show that the birth and death chain is recurrent if and only if

    ∑_{k=0}^∞ γk = ∞,

where γ0 = 1 and

    γk = (q1 · · · qk)/(p1 · · · pk), k = 1, 2, . . . ,
    qi = pi,i−1, i = 1, 2, . . . ,
    pi = pi,i+1, i = 1, 2, . . . .

21. Show that an irreducible birth and death chain on S = {0, 1, . . .} is recurrent if and only if

    ∑_{k=0}^∞ (q1 · · · qk)/(p1 · · · pk) = ∞,

where qi = pi,i−1 and pi = pi,i+1, i = 1, 2, . . . .

22. Consider the birth and death chain on {0, 1, . . .} with

    pk = (k + 2)/(2(k + 1)) and qk = k/(2(k + 1)), k ≥ 0.

Determine whether the chain is recurrent or transient.
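Here qi/pi = i/(i + 2), so γk telescopes to 2/((k + 1)(k + 2)) and the series in Problem 20 converges. Exact arithmetic confirms the telescoping identity (a sketch using the criterion above):

```python
from fractions import Fraction as F

def gamma(k):
    """gamma_k = (q_1...q_k)/(p_1...p_k) for p_i=(i+2)/(2(i+1)), q_i=i/(2(i+1))."""
    g = F(1)
    for i in range(1, k + 1):
        g *= F(i, 2 * (i + 1)) / F(i + 2, 2 * (i + 1))
    return g

# The product telescopes: gamma_k = 2 / ((k+1)(k+2)).
checks = all(gamma(k) == F(2, (k + 1) * (k + 2)) for k in range(10))

# Partial sums stay below 2, so the series is finite and the chain is transient.
partial = sum(gamma(k) for k in range(200))
```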


23. Prove that µij = Fij/(1 − Fij), i, j ∈ S.

24. Prove that if i is a recurrent state and pij^(n) > 0 for some n, then state j is recurrent and Fij = Fji = 1.

25. Let

    fjk^(n) = P[Xn = k, Xm ̸= j, 0 < m < n | X0 = j]

be the probability that the chain, starting from state j, reaches state k at the nth step without hitting j in between. Prove that if pjk^(n) > 0, then there exists m ≤ n such that fjk^(m) > 0.

26. Let {Xn , n = 0, 1, . . .} be a DTMC. Prove that the distribution of Xn is independent of n if and only if the
initial distribution π is a stationary distribution.
27. Consider a DTMC model arising in an insurance problem. To compute insurance or pension premiums for occupational diseases such as silicosis, one needs to compute the average degree of disability at pre-assigned time periods. Suppose that m degrees of disability S1, S2, . . . , Sm are retained, and assume that an insurance policy holder can go from degree Si to degree Sj with probability pij. This strong assumption leads to a DTMC model in which P = [pij] is the one-step transition probability matrix for the degree of disability. Using real observations recorded in India, the following transition matrix P is considered:

    P = [ 0.90 0.10  0    0    0    ]
        [  0   0.95 0.05  0    0    ]
        [  0    0   0.90 0.05 0.05  ]
        [  0    0    0   0.90 0.10  ]
        [  0    0   0.05 0.05 0.90  ]

(a) Classify the states of the chain as transient, positive recurrent or null recurrent, along with their periods.
(b) Find the limiting distribution for the degree of disability.
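For part (b), the limiting mass concentrates on the closed class {S3, S4, S5} (S1 and S2 are left and never re-entered), so it suffices to find the stationary distribution of the restricted chain. A numerical check of that calculation:

```python
import numpy as np

# Transition matrix restricted to the closed class {S3, S4, S5}.
Q = np.array([[0.90, 0.05, 0.05],
              [0.0,  0.90, 0.10],
              [0.05, 0.05, 0.90]])

# Solve pi Q = pi with sum(pi) = 1.
A = np.vstack([(Q.T - np.eye(3))[:-1], np.ones(3)])
pi = np.linalg.solve(A, np.array([0.0, 0.0, 1.0]))
# The balance equations give pi = (2/9, 1/3, 4/9) on (S3, S4, S5).
```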

28. Consider the simple random walk on a circle. Assume that an odd number K of points, labeled 0, 1, . . . , K − 1, are arranged clockwise on a circle. From i, the walker moves to i + 1 (with K identified with 0) with probability p (0 < p < 1) and to i − 1 (with −1 identified with K − 1) with probability 1 − p. Find the steady-state distribution for this random walk, if it exists.
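The transition matrix here is doubly stochastic (each column receives p from one neighbor and 1 − p from the other), so by Problem 19 the uniform distribution 1/K is stationary. A quick check for illustrative K and p:

```python
import numpy as np

# Illustrative parameters (assumptions): K = 5 points, p = 0.3.
K, p = 5, 0.3
P = np.zeros((K, K))
for i in range(K):
    P[i, (i + 1) % K] = p        # clockwise step
    P[i, (i - 1) % K] = 1 - p    # counter-clockwise step

pi = np.full(K, 1 / K)           # uniform distribution
stationary = np.allclose(pi @ P, pi)
```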
29. For j = 0, 1, . . ., let pj,j+2 = vj and pj,0 = 1 − vj define the transition probability matrix of a Markov chain. Discuss the nature of the states of this chain.
30. Let {Xn, n = 0, 1, . . .} be a time-homogeneous DTMC with state space S = {0, 1, 2, 3, 4} and one-step transition probability matrix

    P = [  1    0    0    0    0  ]
        [ 0.5   0   0.5   0    0  ]
        [  0   0.5   0   0.5   0  ]
        [  0    0   0.5   0   0.5 ]
        [  0    0    0    0    1  ]

(a) Classify the states of the chain as transient, positive recurrent or null recurrent.
(b) When P(X0 = 2) = 1, find the expected number of times the Markov chain visits state 1 before being absorbed.
(c) When P(X0 = 1) = 1, find the probability that the Markov chain gets absorbed in state 0.
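Parts (b) and (c) can be checked via the standard absorbing-chain decomposition: with Q the transition matrix among the transient states {1, 2, 3} and R the transitions into the absorbing states {0, 4}, the fundamental matrix N = (I − Q)⁻¹ gives expected visit counts and B = NR gives absorption probabilities. A sketch:

```python
import numpy as np

# Transient states {1, 2, 3} (rows/cols in that order).
Q = np.array([[0,   0.5, 0  ],
              [0.5, 0,   0.5],
              [0,   0.5, 0  ]])
# Transitions from {1, 2, 3} into the absorbing states {0, 4}.
R = np.array([[0.5, 0  ],
              [0.0, 0  ],
              [0.0, 0.5]])

N = np.linalg.inv(np.eye(3) - Q)   # fundamental matrix: expected visit counts
B = N @ R                          # absorption probabilities

visits_1_from_2 = N[1, 0]          # (b): start in 2, expected visits to 1
absorb_0_from_1 = B[0, 0]          # (c): start in 1, absorbed in 0
```

This yields 1 expected visit for (b) and absorption probability 3/4 for (c).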
31. A rumor-spreading paradigm is one method of information dissemination over a network. Assume that there are 5 hosts connected to the network. One host starts off by sending a message. Every round, the message is sent by one host to another host, selected independently and uniformly at random from the other 4 hosts. When all hosts have received the message, the process ends. Create a discrete-time Markov chain model of this process with
(a) Xn the state of the host (i = 1, 2, . . . , 5) that receives the message at the end of the nth round;
(b) Yn the number of hosts having the message at the end of the nth round.
Find the one-step transition probability matrix for each DTMC. Classify the states of the chains as transient, positive recurrent or null recurrent.

