IME625A
HW
One-step transition probability matrix (states 1 to 7; row = current state, column = next state):

        1     2     3     4     5     6     7
  1   1/3   2/3    0     0     0     0     0
  2   1/4   3/4    0     0     0     0     0
  3    0     0    1/3    0    2/3    0     0
  4    0     0     1     0     0     0     0
  5    0     0     1     0     0     0     0
  6   1/6    0    1/6   1/6    0    1/4   1/4
  7    0     0     0     0     0     0     1
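As a quick cross-check of the class structure of this chain, the following sketch (illustrative only; the matrix encoding, the reachability routine, and all names are introduced here and are not part of the assignment) computes the communicating blocks by mutual reachability and, since the chain is finite, flags a block as recurrent exactly when it is closed.

from fractions import Fraction as F

# Assumed encoding of the homework matrix; indices 0..6 stand for states 1..7.
P = [
    [F(1, 3), F(2, 3), 0, 0, 0, 0, 0],
    [F(1, 4), F(3, 4), 0, 0, 0, 0, 0],
    [0, 0, F(1, 3), 0, F(2, 3), 0, 0],
    [0, 0, 1, 0, 0, 0, 0],
    [0, 0, 1, 0, 0, 0, 0],
    [F(1, 6), 0, F(1, 6), F(1, 6), 0, F(1, 4), F(1, 4)],
    [0, 0, 0, 0, 0, 0, 1],
]
n = len(P)

def reachable(i):
    # States reachable from i (including i itself) along positive-probability paths.
    seen, stack = {i}, [i]
    while stack:
        u = stack.pop()
        for v in range(n):
            if P[u][v] > 0 and v not in seen:
                seen.add(v)
                stack.append(v)
    return seen

reach = [reachable(i) for i in range(n)]

# i and j communicate iff each is reachable from the other.
blocks = []
for i in range(n):
    if not any(i in b for b in blocks):
        blocks.append({j for j in range(n) if j in reach[i] and i in reach[j]})

for b in sorted(blocks, key=min):
    closed = all(reach[i] <= b for i in b)      # closed = no escape from the block
    print(sorted(s + 1 for s in b), "recurrent" if closed else "transient")

Under these assumptions the sketch reports {1, 2}, {3, 5} and {7} as closed (recurrent) blocks, and {4} and {6} as transient singleton blocks, which can be confirmed directly from the matrix.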
C12.P1
The last part of the above argument relies on the following: let $S(i)$ denote the set of states that can be reached from state $i$. If all states in $S(i)$ lead back to $i$, then $S(i)$ is a communicating block. Verify the validity of this assertion yourself.

Let $C$ denote such a closed communicating block and let $i$ denote the state through which the Markov chain enters $C$. Then the Markov chain never leaves $C$, i.e., $\sum_{j \in C} p^{(n)}_{ij} = 1$ for all $n$. Let $j$ denote an arbitrary state in $C$. The claim is that $j$ is recurrent. Assuming otherwise, let $j$ be transient. It can be proved that if $j$ is transient, then $\lim_{n \to \infty} p^{(n)}_{ij} = 0$ for all starting states $i$. We have shown this for $i = j$; a proof for the general case appears in Appendix A. Since the series $\sum_n p^{(n)}_{jj}$ is convergent for a transient $j$, its limiting term must be zero, i.e., $\lim_{n \to \infty} p^{(n)}_{jj} = 0$. Moreover, since transience is a class property, every state of $C$ is transient along with $j$, so $\lim_{n \to \infty} p^{(n)}_{ij} = 0$ for every $j \in C$. Then, with $C$ finite,

$1 = \lim_{n \to \infty} \sum_{j \in C} p^{(n)}_{ij} = \sum_{j \in C} \lim_{n \to \infty} p^{(n)}_{ij} = 0,$

which is a contradiction. Hence $j$ is recurrent.
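As a concrete, purely illustrative check of this limiting behaviour, the sketch below (the numeric matrix and names are assumptions, reusing the 7-state matrix from the HW page) raises the transition matrix to a large power: the columns of the transient states 4 and 6 become essentially zero, yet every row of $P^n$ still sums to 1, which is exactly why a finite closed block cannot consist of transient states only.

import numpy as np

# Assumed numeric copy of the homework transition matrix (states 1..7).
P = np.array([
    [1/3, 2/3, 0,   0,   0,   0,   0  ],
    [1/4, 3/4, 0,   0,   0,   0,   0  ],
    [0,   0,   1/3, 0,   2/3, 0,   0  ],
    [0,   0,   1,   0,   0,   0,   0  ],
    [0,   0,   1,   0,   0,   0,   0  ],
    [1/6, 0,   1/6, 1/6, 0,   1/4, 1/4],
    [0,   0,   0,   0,   0,   0,   1  ],
])

Pn = np.linalg.matrix_power(P, 200)   # n-step transition probabilities for n = 200
print(np.round(Pn, 4))                # columns for states 4 and 6 are ~0 everywhere
print(Pn.sum(axis=1))                 # each row still sums to 1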
If $C$ is infinite, then the interchange of the limit and the sum is not permitted (even the convergence theorems do not help here), and we do not reach the contradiction that $1 = 0$. In fact, an infinite communicating block can be of either type, i.e., all of its states recurrent or all of them transient. We provide examples of both kinds next.
C12.P2
In this case, with time, the Markov chain may drift towards $+\infty$ if the up-step probability exceeds the down-step probability, or towards $-\infty$ if it is smaller, and never return to the starting state again.
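A small simulation makes this drift visible. The sketch below is illustrative only; it assumes the example is the simple random walk on the integers with up-step probability p (the exact setup is not in the extracted text), starts many walks at 0, and estimates the fraction that ever revisit 0 within a finite horizon.

import random

def ever_returns(p, horizon=5_000):
    # One simple random walk started at 0; True if it revisits 0 within the horizon.
    pos = 0
    for _ in range(horizon):
        pos += 1 if random.random() < p else -1
        if pos == 0:
            return True
    return False

for p in (0.5, 0.6, 0.8):
    trials = 1_000
    hits = sum(ever_returns(p) for _ in range(trials))
    print(f"p = {p}: estimated return probability ~ {hits / trials:.3f}")

For p = 1/2 the estimate should be close to 1 (the walk is recurrent; the finite horizon biases it slightly downward), while for p away from 1/2 it settles well below 1, matching the drift argument above.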
If a closed communicating block $C$ is finite, then all states in $C$ are recurrent. If some closed block $C$ is infinite, then we need to check the nature of any one of the states in $C$ to know the nature of all states in $C$.
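One standard way to carry out such a single-state check (shown here for illustration, assuming the infinite block in question is the simple random walk on the integers with up-step probability $p$) is to test state 0 through the series criterion used above:

$$
p^{(2n)}_{00} \;=\; \binom{2n}{n}\,p^{n}(1-p)^{n}
\;\sim\; \frac{\bigl(4p(1-p)\bigr)^{n}}{\sqrt{\pi n}}
\qquad \text{(Stirling's approximation)},
$$

so $\sum_{n} p^{(2n)}_{00}$ diverges (state 0, and hence the whole block, is recurrent) only when $p = 1/2$, and converges (the whole block is transient) when $p \neq 1/2$.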
C12.P3
The above result is intuitive. On average, the chain returns to state $j$ after every $\mu_j$ transitions, where $\mu_j$ is the mean recurrence time of $j$. Then the long-run fraction of visits to $j$ should be $1/\mu_j$, which becomes zero if $\mu_j = \infty$. It should be noted that this long-run fraction does not depend on $i$ (the starting state) for recurrent Markov chains, which are one kind of irreducible Markov chain. We will see that the same is true for the other kind of irreducible Markov chain, i.e., the transient chain, as well. Thus, for irreducible Markov chains, the long-run fraction of visits to $j$ is the same for all starting states $i$ and is denoted by $\pi_j$. This observation is not true for general Markov chains.
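The relation between the long-run fraction of visits and the mean recurrence time can be checked numerically. The sketch below is illustrative only; it reuses the 7-state homework matrix as an assumed example (all names are introduced here), simulates a long trajectory started in the recurrent block {1, 2}, and compares the empirical fraction of time spent in state 1 with the reciprocal of the average gap between successive visits to state 1.

import random

# Assumed numeric copy of the homework matrix; indices 0..6 stand for states 1..7.
P = [
    [1/3, 2/3, 0,   0,   0,   0,   0  ],
    [1/4, 3/4, 0,   0,   0,   0,   0  ],
    [0,   0,   1/3, 0,   2/3, 0,   0  ],
    [0,   0,   1,   0,   0,   0,   0  ],
    [0,   0,   1,   0,   0,   0,   0  ],
    [1/6, 0,   1/6, 1/6, 0,   1/4, 1/4],
    [0,   0,   0,   0,   0,   0,   1  ],
]

def step(i):
    # Draw the next state from row i of P.
    return random.choices(range(7), weights=P[i])[0]

N = 200_000
state, visits, gaps, last = 0, 0, [], 0   # start in state 1 (index 0)
for t in range(1, N + 1):
    state = step(state)
    if state == 0:                        # a visit to state 1
        visits += 1
        gaps.append(t - last)
        last = t

pi_hat = visits / N                       # long-run fraction of visits to state 1
mu_hat = sum(gaps) / len(gaps)            # average recurrence time of state 1
print(f"fraction of visits ~ {pi_hat:.4f}, 1/mean recurrence time ~ {1/mu_hat:.4f}")

Both printed numbers should land near 3/11 ≈ 0.273, the stationary probability of state 1 within the closed block {1, 2}, consistent with the relation $\pi_j = 1/\mu_j$.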