07 Markov Chains
Definition: The state of a Markov chain at time t is the value of Xt. For example, if
Xt = 6, we say the process is in state 6 at time t.
Definition: The state space of a Markov chain, S, is the set of values that each Xt
can take. For example, S = {1, 2, 3, 4, 5, 6, 7}.
Definition: A trajectory of a Markov chain is a particular set of values for X0, X1, X2,
. . .. For example, if X0 = 1, X1 = 5, and X2 = 6, then the trajectory up to time t = 2
is 1, 5, 6.
More generally, if we refer to the trajectory s0, s1, s2, s3, . . ., we mean that X0 = s0,
X1 = s1, X2 = s2, X3 = s3, . . . (‘Trajectory’ is just a word meaning ‘path’.)
Markov Property:
The basic property of a Markov chain is that only the most recent point in the
trajectory affects what happens next.
This is called the Markov Property. It means that Xt+1 depends upon Xt, but it does
not depend upon Xt−1, . . . , X1, X0.
Definition: Let {X0, X1, X2, . . .} be a sequence of discrete random variables. Then
{X0, X1, X2, . . .} is a Markov chain if it satisfies the Markov property:
P(Xt+1 = s | Xt = st, Xt−1 = st−1, . . . , X0 = s0) = P(Xt+1 = s | Xt = st)
for all t ≥ 0 and all states s, s0, s1, . . . , st in S.
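As an illustrative sketch (in Python; the function name and transition matrix argument here are hypothetical, not part of the notes), the Markov property means that simulating one step of the chain only requires the current state and the transition probabilities, never the earlier history:

```python
import numpy as np

rng = np.random.default_rng()

def step(current_state, P):
    """Draw the next state using only the current state's row of P.

    The earlier history X0, ..., X(t-1) is never consulted: this is
    exactly the Markov property.
    """
    return rng.choice(len(P), p=P[current_state])
```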
Example:
A Markov chain built on states A and B is depicted in the diagram below. What is the
probability that a process beginning on A will be on B after 2 moves?
In order to move from A to B in two moves, the process must either stay on A for the
first move and then move to B on the second move, or move to B on the first move
and then stay on B for the second move. According to the diagram, the probability of
this is 0.3 · 0.7 + 0.7 · 0.2 = 0.35.
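As a quick check, here is a minimal sketch (in Python, assuming the diagram gives P(A→A) = 0.3, P(A→B) = 0.7 and P(B→B) = 0.2, so P(B→A) = 0.8 since rows must sum to 1) that reproduces the two-step probability by squaring the transition matrix:

```python
import numpy as np

# Transition matrix for the two-state example, rows/columns ordered (A, B).
# P(B->A) = 0.8 is an assumption forced by the row summing to 1.
P = np.array([[0.3, 0.7],
              [0.8, 0.2]])

# Two-step transition probabilities are the entries of the matrix power P^2.
P2 = np.linalg.matrix_power(P, 2)

# Probability of being in state B after 2 moves, starting from A:
print(P2[0, 1])   # 0.3*0.7 + 0.7*0.2 = 0.35
```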
A transition matrix P for a Markov chain {X} is a matrix containing information on the
probability of transitioning between states. In particular, given an ordering of the
matrix's rows and columns by the state space S, the (i, j)th element of the matrix P is
given by
pij = P(Xt+1 = sj | Xt = si).
This means each row of the matrix is a probability vector, and the sum of its entries
is 1.
Notes:
1. The transition matrix P must list all possible states in the state space S.
2. P is a square matrix (N × N), because Xt+1 and Xt both take values in the
same state space S (of size N).
3. The rows of P should each sum to 1.
4. The columns of P do not in general sum to 1.
Definition: Let {X0, X1, X2, . . .} be a Markov chain with state space S, where S has
size N (possibly infinite). The transition probabilities of the Markov chain are
pij = P(Xt+1 = sj | Xt = si)   for all i, j = 1, 2, . . . , N.
We can create a transition matrix for any transition diagram. For example, the matrix
for the two-state diagram above is constructed in the sketch below.
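Below is a minimal sketch (Python/NumPy, reusing the assumed two-state A/B example from above) of building a transition matrix from a transition diagram and verifying the notes listed above:

```python
import numpy as np

# Transition matrix for the two-state (A, B) diagram, rows/columns ordered (A, B).
# P(B->A) = 0.8 is inferred from the 0.2 self-loop on B and rows summing to 1.
states = ["A", "B"]
P = np.array([[0.3, 0.7],
              [0.8, 0.2]])

# Notes 1-3: P lists every state, is square (N x N), and every row sums to 1.
assert P.shape == (len(states), len(states))
assert np.allclose(P.sum(axis=1), 1.0)

# Note 4: the columns do not, in general, sum to 1.
print(P.sum(axis=0))   # [1.1, 0.9]
```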
1. Finite Markov chains: These are Markov chains with a finite number of
states. The transition probabilities between states are fixed, and the system
will then typically settle into a steady state in which the probability of being in
each state becomes constant (see the sketch after this list). An example of a
finite-state Markov chain is a model of a traffic light, with three states (red,
yellow, green) and transitions governed by the rules of the traffic light.
2. Infinite Markov chains: These are Markov chains with an infinite number
of states. The transition probabilities between states are fixed, but the system
may not reach a steady state. An example is a model of the spread of a virus
through a population: the states of the Markov chain represent the number
of people who have been infected at any given time, and the transitions
between states are governed by the rate of infection and the rate of recovery.
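To illustrate the steady-state idea for a finite chain, here is a small sketch (Python/NumPy; the three-state transition matrix is invented purely for illustration and is not taken from the notes) showing that the rows of P^k approach a common probability vector as k grows:

```python
import numpy as np

# A made-up three-state chain, used only to illustrate the steady-state
# behaviour described in point 1 above.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.4, 0.4]])

# As k grows, every row of P^k approaches the same probability vector,
# i.e. the probability of being in each state stops depending on the
# starting state and becomes constant.
for k in (1, 5, 50):
    print(f"P^{k}:")
    print(np.linalg.matrix_power(P, k))
```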