Elements of Probability
1.1. Sample Space and Events
Consider an experiment whose outcome is not known in advance
• Sample space S: the set of all possible outcomes
Flipping a coin: S = {H, T}
Rolling a die: S = {1, 2, 3, 4, 5, 6}
Running a race among 7 horses numbered 1 through 7:
S = {all orderings of (1, 2, 3, 4, 5, 6, 7)}
A single stock with price St at time t = 1, 2, . . . , T
Ω: the set of all possible paths of the stock price over these times
Ω = {ω : ω = (S1, S2, . . . , ST )}
If we assume that the stock price can only go up by a factor u or
down by a factor d, then the relevant information reduces to the
knowledge of the movement at each time:
Ω = {ω : ω = (a1, a2, . . . , aT )}, where at = u or d.
For example, if T = 2, then
Ω = {uu, ud, du, dd}
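As a quick illustration (a minimal Python sketch; the helper name sample_space is just an illustrative choice), the up/down paths for a given T can be enumerated directly:

from itertools import product

# Enumerate all up/down paths of length T; each path is one outcome ω in Ω.
def sample_space(T):
    return [''.join(path) for path in product('ud', repeat=T)]

print(sample_space(2))   # ['uu', 'ud', 'du', 'dd']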
• Event: any subset A of the sample space is known as an
event
Event of getting an H: A = {H}
Event of getting an even number when rolling a die:
A = {2, 4, 6}
Event that horse number 5 comes in first:
A = {all orderings in S starting with 5}
Event that the stock goes up at time t = 1:
A = {uu, ud}
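A short Python sketch (variable names are illustrative) of events as subsets, using the die and the two-period stock model above:

S = {1, 2, 3, 4, 5, 6}                       # sample space for rolling a die
A = {s for s in S if s % 2 == 0}             # event: an even number, {2, 4, 6}

Omega = {'uu', 'ud', 'du', 'dd'}             # two-period stock model
up_at_1 = {w for w in Omega if w[0] == 'u'}  # event: stock goes up at t = 1, {'uu', 'ud'}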
1.2. Axioms of Probability
Axiom 1: 0 ≤ P(A) ≤ 1
Axiom 2: P(S) = 1
Axiom 3: For any sequence of mutually exclusive events
A1, A2, . . .,
P( ∪_{i=1}^n Ai ) = Σ_{i=1}^n P(Ai),   n = 1, 2, . . . , ∞
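As a sanity check (a Python sketch assuming the equally likely model of Section 1.3 below), the counting probability on a die satisfies the axioms, here checked for finitely many mutually exclusive events:

from fractions import Fraction

S = frozenset({1, 2, 3, 4, 5, 6})
def P(A):                                    # counting probability (see Section 1.3)
    return Fraction(len(A & S), len(S))

A1, A2 = {1, 2}, {5}                         # mutually exclusive events
assert 0 <= P(A1) <= 1                       # Axiom 1
assert P(S) == 1                             # Axiom 2
assert P(A1 | A2) == P(A1) + P(A2)           # Axiom 3 (finite case)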
1.3. Usual definition
Suppose that an experiment, whose sample space is S, is repeatedly
performed under exactly the same conditions.
For each event A of the sample space S, we define n(A) to be
the number of times in the first n repetitions of the experiment
that the event A occurs. Then the probability of the event A is
defined by
P(A) = lim_{n→∞} n(A)/n
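A small Monte Carlo sketch of this frequency interpretation (simulated rolls of a fair die; the printed estimate varies from run to run and settles near 1/2 as n grows):

import random

n = 100_000
count_A = sum(1 for _ in range(n) if random.randint(1, 6) % 2 == 0)  # n(A), A = even number
print(count_A / n)                                                   # n(A)/n, close to 0.5 for large n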
For some experiments it is natural to assume that all outcomes
in the sample space are equally likely to occur. That is, consider
an experiment whose sample space is a finite set S, say S =
{1, 2, . . . , N }.
Then it is often natural to assume that
P({1}) = P({2}) = · · · = P({N})
which implies from Axioms 2 and 3 that
P({i}) = 1/N,   i = 1, 2, . . . , N.
From this, it follows from Axiom 3 that for any event E
P(E) = (number of points in E) / (number of points in S)
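For instance (a Python sketch of the counting formula, applied to the horse race example above under the equally likely assumption):

from itertools import permutations
from fractions import Fraction

S = list(permutations(range(1, 8)))          # all orderings of the 7 horses
E = [s for s in S if s[0] == 5]              # horse number 5 comes in first
print(Fraction(len(E), len(S)))              # 720/5040 = 1/7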
1.4. Some simple propositions
• P(A^c) = 1 − P(A)
• If E ⊂ F, then P(E) ≤ P(F)
• P(E ∪ F) = P(E) + P(F) − P(EF)
• The inclusion-exclusion identity:
P(E1 ∪ E2 ∪ · · · ∪ En) = Σ_{i=1}^n P(Ei) − Σ_{i1<i2} P(Ei1 Ei2) + · · ·
    + (−1)^{r+1} Σ_{i1<i2<···<ir} P(Ei1 Ei2 · · · Eir)
    + · · · + (−1)^{n+1} P(E1 E2 · · · En)
where the sum Σ_{i1<i2<···<ir} P(Ei1 Ei2 · · · Eir) is taken over all of
the (n choose r) possible subsets of size r of the set {1, 2, . . . , n}.
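The Python sketch below (counting probability on a die; the three events are arbitrary choices) checks the identity numerically for n = 3:

from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
P = lambda A: Fraction(len(A), len(S))       # counting probability on a fair die

E1, E2, E3 = {1, 2, 3}, {2, 4, 6}, {3, 4, 5}
lhs = P(E1 | E2 | E3)
rhs = (P(E1) + P(E2) + P(E3)
       - P(E1 & E2) - P(E1 & E3) - P(E2 & E3)
       + P(E1 & E2 & E3))
assert lhs == rhs                            # inclusion-exclusion for n = 3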
1.5. Conditional Probability and Independence
Consider flipping a coin twice, and suppose we are interested in
the probability that two heads are obtained given that a head
landed on the first flip.
We call this the conditional probability that A occurs given that
B has occurred, where A is the event of two heads and B is the
event of a head on the first flip, and we denote it by P(A|B).
For P(B) > 0 it is defined by
P(A|B) = P(AB)/P(B)
Note that for a fixed event F with P(F) > 0, P(·|F) is itself a
probability, i.e., it satisfies Axioms 1-3.
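A short Python sketch of the two-coin example above (counting probability on the four equally likely outcomes, with P(A|B) computed as P(AB)/P(B)):

from fractions import Fraction

S = {'HH', 'HT', 'TH', 'TT'}                 # two flips of a fair coin
P = lambda E: Fraction(len(E), len(S))

A = {'HH'}                                   # two heads
B = {'HH', 'HT'}                             # head on the first flip
print(P(A & B) / P(B))                       # P(A|B) = (1/4)/(1/2) = 1/2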