07 Markov Chains

A Markov chain is a mathematical system that transitions between states based on probabilistic rules, with the state at time t represented as Xt. The Markov Property states that the next state depends only on the current state, not on previous states. Types of Markov chains include finite, infinite, continuous-time, and discrete-time chains, each with distinct characteristics and applications.

Markov chains

A Markov chain, named after Andrey Markov, is a mathematical system that experiences transitions from one state to another according to certain probabilistic rules.

The Markov chain is the process X0, X1, X2, . . .

Definition: The state of a Markov chain at time t is the value of Xt. For example, if Xt = 6, we say the process is in state 6 at time t.

Definition: The state space of a Markov chain, S, is the set of values that each Xt can take. For example, S = {1, 2, 3, 4, 5, 6, 7}.

Definition: A trajectory of a Markov chain is a particular set of values for X0, X1, X2, . . . For example, if X0 = 1, X1 = 5, and X2 = 6, then the trajectory up to time t = 2 is 1, 5, 6.

More generally, if we refer to the trajectory s0, s1, s2, s3, . . ., we mean that X0 = s0, X1 = s1, X2 = s2, X3 = s3, . . . 'Trajectory' is just a word meaning 'path'.

Markov Property:

The basic property of a Markov chain is that only the most recent point in the
trajectory affects what happens next.

This is called the Markov Property. It means that Xt+1 depends upon Xt, but it does not depend upon Xt−1, . . . , X1, X0.

We formulate the Markov Property in mathematical notation as follows:

P(Xt+1 = s | Xt = st, Xt−1 = st−1, . . . , X0 = s0) = P(Xt+1 = s | Xt = st),

for all t = 1, 2, 3, . . . and for all states s0, s1, . . . , st, s.

Explanation:

Definition: Let {X0, X1, X2, . . .} be a sequence of discrete random variables. Then {X0, X1, X2, . . .} is a Markov chain if it satisfies the Markov property:

P(Xt+1 = s | Xt = st, Xt−1 = st−1, . . . , X0 = s0) = P(Xt+1 = s | Xt = st),

for all t = 1, 2, 3, . . . and for all states s0, s1, . . . , st, s.
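To make the property concrete, here is a minimal Python sketch (not from the original notes; the two states and probabilities match the example below) in which the next state is sampled using only the current state:

```python
import random

# Transition probabilities for a two-state chain. The next-state
# distribution depends only on the current state (the Markov Property).
P = {
    "A": {"A": 0.3, "B": 0.7},
    "B": {"A": 0.8, "B": 0.2},
}

def step(state):
    """Sample the next state using only the current state."""
    next_states = list(P[state])
    weights = [P[state][s] for s in next_states]
    return random.choices(next_states, weights=weights)[0]

def trajectory(start, n_steps):
    """Generate a trajectory X0, X1, ..., Xn."""
    path = [start]
    for _ in range(n_steps):
        # Only path[-1] (the current state) is consulted; the earlier
        # history never influences the next step.
        path.append(step(path[-1]))
    return path

print(trajectory("A", 10))  # e.g. ['A', 'B', 'A', 'A', 'B', ...]
```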

Example:

A Markov chain is built on states A and B, with transition probabilities P(A→A) = 0.3, P(A→B) = 0.7, P(B→A) = 0.8, and P(B→B) = 0.2 (originally shown as a transition diagram). What is the probability that a process beginning on A will be on B after 2 moves?

In order to move from A to B in two moves, the process must either stay on A the first move and then move to B the second move, or move to B the first move and then stay on B the second move. The probability of that is 0.3⋅0.7 + 0.7⋅0.2 = 0.35.

Alternatively, the probability that the process will be on A after 2 moves is 0.3⋅0.3 + 0.7⋅0.8 = 0.65. Since there are only two states in the chain, the process must be on B if it is not on A, and therefore the probability that the process will be on B after 2 moves is 1 − 0.65 = 0.35.
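The same answer can be checked numerically. Below is a minimal Python sketch (illustrative, not part of the original notes) that enumerates the two-step paths and, equivalently, squares the transition matrix introduced in the next section:

```python
import numpy as np

# Transition matrix for the two-state chain, rows and columns ordered (A, B).
# Row i gives the distribution of the next state given current state i.
P = np.array([
    [0.3, 0.7],   # from A: stay on A with 0.3, move to B with 0.7
    [0.8, 0.2],   # from B: move to A with 0.8, stay on B with 0.2
])

# Path enumeration: A -> A -> B, or A -> B -> B.
p_two_step = 0.3 * 0.7 + 0.7 * 0.2
print(p_two_step)     # 0.35

# Equivalently, the (A, B) entry of P squared is P(X2 = B | X0 = A).
P2 = P @ P
print(P2[0, 1])       # 0.35
```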

The Transition Matrix

A transition matrix P for a Markov chain {X} is a matrix containing information on the probability of transitioning between states. In particular, given an ordering of the matrix's rows and columns by the state space S, the (i, j)th element of the matrix P is given by

pij = P(Xt+1 = j | Xt = i).

This means each row of the matrix is a probability vector, and the sum of its entries is 1.

Notes:

1. The transition matrix P must list all possible states in the state space S.
2. P is a square matrix (N × N), because Xt+1 and Xt both take values in the
same state space S (of size N).
3. The rows of P should each sum to 1.
4. The columns of P do not in general sum to 1.
Definition: Let {X0, X1, X2, . . .} be a Markov chain with state space S, where S has size N (possibly infinite). The transition probabilities of the Markov chain are

pij = P(Xt+1 = j | Xt = i) for i, j ∈ S.

Definition: The transition matrix of the Markov chain is P = (pij).

Example: setting up the transition matrix

We can create a transition matrix for any transition diagram. For example, the two-state chain from the example above, with rows and columns ordered (A, B), has transition matrix

P = ( 0.3  0.7 )
    ( 0.8  0.2 )

Each row sums to 1, as required.
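As a quick check, here is an illustrative Python sketch that builds this matrix with NumPy and verifies the properties listed in the notes above:

```python
import numpy as np

# Transition matrix for the two-state (A, B) chain above.
P = np.array([
    [0.3, 0.7],
    [0.8, 0.2],
])

# Note 2: P is square, since Xt+1 and Xt share the same state space.
assert P.shape[0] == P.shape[1]

# Note 3: every row is a probability vector summing to 1.
assert np.allclose(P.sum(axis=1), 1.0)

# Note 4: the columns need not sum to 1.
print(P.sum(axis=0))  # [1.1, 0.9]
```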

Types of Markov Chains:

There are several different types of Markov chains, including:

1. Finite Markov chains: These are Markov chains with a finite number of
states. The transition probabilities between states are fixed, and under
suitable conditions (an irreducible, aperiodic chain) the system eventually
reaches a steady state in which the probability of being in each state
becomes constant. An example of a finite-state Markov chain is a model of a
traffic light, with three states (red, yellow, green) and transitions
governed by the rules of the traffic light.

2. Infinite Markov chains: These are Markov chains with an infinite number
of states. The transition probabilities between states are fixed, but the
system may not reach a steady state. An example is a model of the spread of
a virus through a population: the states of the Markov chain represent the
number of people who have been infected at any given time, and the
transitions between states are governed by the rate of infection and the
rate of recovery.

3. Continuous-time Markov chains (CTMC): These are Markov chains in which
the transitions between states occur at random times rather than at
discrete time intervals. The transitions are defined by rates rather than
one-step probabilities. For example, the state of a CTMC could represent
the number of customers in a store at a given time, and the transitions
between states could represent the arrival and departure of customers. The
probability of the system being in a particular state (e.g., the
probability that there are a certain number of customers in the store) can
be calculated using the CTMC model.

4. Discrete-time Markov chains (DTMC): These are Markov chains in which the
transitions between states occur at discrete time intervals. The transition
probabilities are defined by a transition matrix. For example, the state of
a DTMC could represent the weather on a particular day (e.g., sunny,
cloudy, or rainy), and the transitions between states could represent the
change in weather from one day to the next. The probability of the system
being in a particular state (e.g., the probability of it being sunny on a
particular day) can be calculated using the DTMC model, as in the sketch
after this list.
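To illustrate item 4 (and the steady-state behaviour described in item 1), here is a minimal Python sketch; the three weather states and their transition probabilities are invented purely for illustration:

```python
import numpy as np

# Hypothetical DTMC for daily weather; states ordered (sunny, cloudy, rainy).
# The probabilities below are invented purely for illustration.
P = np.array([
    [0.6, 0.3, 0.1],   # from sunny
    [0.3, 0.4, 0.3],   # from cloudy
    [0.2, 0.4, 0.4],   # from rainy
])

# Distribution over states after n days: multiply the row vector by P once
# per day.
dist = np.array([1.0, 0.0, 0.0])   # start sunny with certainty
for day in range(50):
    dist = dist @ P

# For this finite, irreducible, aperiodic chain the distribution converges
# to a steady state (item 1), regardless of the starting state.
print(dist)  # approximately the stationary distribution pi, with pi = pi @ P
```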
