Definitions
Introduction
Noha Youssef
Table of Contents
Introduction
Definition of a Stochastic Process
Examples
Markov Property
Introduction
Examples
Below is a brief list of some important areas in which stochastic processes arise:
Economics: daily stock market quotations or monthly unemployment figures.
Social sciences: population birth rates and school enrollment series, which have been followed for many centuries in several countries.
Epidemiology: numbers of influenza cases are often monitored over long periods of time.
Medicine: blood pressure measurements are traced over time to evaluate the impact of pharmaceutical drugs used in treating hypertension.
Markov Chain Monte Carlo (MCMC): a modern statistical tool which brings together results from stochastic processes and simulation, enabling potentially computationally intractable (Bayesian) statistics to be used in practice.
Definition of a Stochastic Process
Definition 2.1
A stochastic process is a family of random variables {X(t)},
indexed by a parameter t, where t belongs to some index set T.
Definitions
Definition 2.2
The index t is often interpreted as time and, thus, X(t) is referred
to as the state of the process at time t.
Definition 2.3
The set T is called the index set (or parameter set).
Definition 2.4
If T is a countable set (e.g., N0 or Z) then we are dealing with a
discrete-time process.
Definition 2.5
If T is a continuum (e.g., R) then {X(t) : t ∈ T } is said to be a
continuous-time process.
More Definitions
Definition 2.6
For each t ∈ T , X(t) is a random variable (r.v.). Any realization of the
stochastic process {X(t) : t ∈ T } is called a sample path.
Definition 2.7
The set of all possible values that the r.v. X(t) can take, over all
t ∈ T , say S, is called the state space of the stochastic process
{X(t) : t ∈ T }.
Definition 2.8
If S is a countable set (e.g., N0 or Z), then {X(t) : t ∈ T } is a
discrete-value process.
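To illustrate Definitions 2.6–2.8, here is a minimal sketch (a hypothetical fair-coin process, assuming Python; the process itself is not from the text): each run of the simulation produces one sample path, and the state space is S = {0, 1}.

```python
import random

# Hypothetical example: a discrete-time, discrete-value process where
# X(t) is a fair coin flip at time t, for t in T = {0, 1, ..., 9}.
# State space S = {0, 1}; one run of the loop is one sample path.
rng = random.Random(42)  # fixed seed so the realization is reproducible
sample_path = [rng.randint(0, 1) for t in range(10)]
print(sample_path)  # one realization (sample path) of the process
```

A different seed gives a different realization of the same process, which is exactly the distinction between the process {X(t)} and a single sample path.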
Example: Gambler’s ruin
Player A has $k and Player B has $(a − k), where a > k > 0. They play
a series of games in which A has probability p of winning and q = 1 − p
of losing. Define the r.v. Xn as A's money after n games. Then
(x1 , x2 , · · · , xn ) is a realization of a discrete-time random process.
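The game above can be simulated directly; below is a minimal sketch (assuming Python; the function name and parameter choices are illustrative, not from the text). Each call returns one realization (x1, x2, ...) of A's fortune, stopping if A is ruined (Xn = 0) or wins everything (Xn = a).

```python
import random

def gamblers_ruin(k, a, p, n_games, seed=0):
    """Simulate A's fortune for up to n_games, starting from $k.

    A wins $1 with probability p and loses $1 with probability q = 1 - p
    each game; 0 and a are absorbing states (ruin / total win).
    """
    rng = random.Random(seed)
    x = k
    path = [x]
    for _ in range(n_games):
        if x == 0 or x == a:  # absorbed: stop playing
            break
        x += 1 if rng.random() < p else -1
        path.append(x)
    return path

# One realization of the discrete-time process, e.g. k = 5, a = 10, p = 0.5:
path = gamblers_ruin(k=5, a=10, p=0.5, n_games=20)
print(path)
```

Note that the state space here is S = {0, 1, ..., a} (countable) and the index set is T = {0, 1, 2, ...}, so this is a discrete-time, discrete-value process.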
Cont’d
Definition 2.9
If S is a continuum (e.g., R), then {X(t) : t ∈ T } is a
continuous-value process.
Definition 2.10
A discrete-time random process is observed only at specific time points.
More Examples
More Definitions
Markov Property