EPM944 Lecture2 v2 Copy
▶ The best way to start thinking about risky events is to use the
tools of probability theory.
▶ Let A denote the event that "the loss of an investment is
greater than 1 million pounds", and suppose the probability of
this event is 10%.
▶ In practice, we often say the "risk" of event A is 10%.
▶ This risk must be distinguished from the value
1,000,000 × 10% = 100,000, which is the statistical average
(expected value) of the loss above 1 million!
Probability theory: Sample Space and Events
▶ Commutativity: A ∪ B = B ∪ A, A ∩ B = B ∩ A.
▶ Associativity: A ∪ (B ∪ C) = (A ∪ B) ∪ C,
A ∩ (B ∩ C) = (A ∩ B) ∩ C.
▶ Distributivity: A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C),
A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C).
▶ De Morgan laws: (A ∪ B)^c = A^c ∩ B^c, (A ∩ B)^c = A^c ∪ B^c.
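These identities can be checked mechanically on any small finite sample space; a minimal sketch in Python, where the sample space and the events A, B are arbitrary illustrative choices:

```python
# Verify De Morgan's laws on a small finite sample space.
# Omega, A, B are arbitrary illustrative choices.
Omega = set(range(10))
A = {1, 2, 3, 4}
B = {3, 4, 5, 6}

def complement(E):
    """Complement of event E relative to the sample space Omega."""
    return Omega - E

# (A ∪ B)^c = A^c ∩ B^c
law1 = complement(A | B) == complement(A) & complement(B)
# (A ∩ B)^c = A^c ∪ B^c
law2 = complement(A & B) == complement(A) | complement(B)
```

Python's built-in set operators (`|`, `&`, `-`) map directly onto union, intersection and set difference, so each law becomes a one-line equality test.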
Every subset of the sample space Ω is an event:
▶ Ω = the certain event, ∅ = the impossible event.
▶ A^c = the event that A did not occur.
▶ A ∪ B = the event that either A or B occurred.
▶ A ∩ B = the event that both A and B occurred.
▶ ⋃_{i=1}^∞ A_i = the event that at least one A_i occurred.

For a sequence of pairwise disjoint events A_1, A_2, … (countable additivity):

P(⋃_{i=1}^∞ A_i) = Σ_{i=1}^∞ P(A_i).
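Additivity for disjoint events specializes to finitely many events; a quick check with a fair die, where the disjoint events below are arbitrary illustrative choices:

```python
from fractions import Fraction

# Fair die: each outcome has probability 1/6.
Omega = {1, 2, 3, 4, 5, 6}
P = lambda E: Fraction(len(E), len(Omega))

# Pairwise disjoint events (illustrative choices).
A1, A2, A3 = {1}, {2, 3}, {5, 6}

lhs = P(A1 | A2 | A3)          # probability of the union
rhs = P(A1) + P(A2) + P(A3)    # sum of the individual probabilities
```

Using exact fractions avoids floating-point noise when comparing the two sides.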
Elementary properties of probability
Figure: Venn diagram of events A, B and C.
Probability of a union of events
Figure: Venn diagram of events A, B and C.
Comments
For n events A_1, A_2, …, A_n:

P(⋃_{i=1}^n A_i) = Σ_{i≤n} P(A_i) − Σ_{i1<i2} P(A_{i1} ∩ A_{i2}) + …
  + (−1)^n Σ_{i1<i2<…<i_{n−1}} P(A_{i1} ∩ A_{i2} ∩ … ∩ A_{i_{n−1}})
  + (−1)^{n+1} P(A_1 ∩ A_2 ∩ … ∩ A_n)
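The alternating sum can be verified against a direct computation of the union on a finite sample space; a minimal sketch, where the uniform sample space and the three events are arbitrary illustrative choices:

```python
from itertools import combinations
from fractions import Fraction

# Uniform sample space; events are arbitrary illustrative subsets.
Omega = set(range(12))
events = [{0, 1, 2, 3}, {2, 3, 4, 5}, {5, 6, 7}]
P = lambda E: Fraction(len(E), len(Omega))

def union_prob_incl_excl(events):
    """Inclusion-exclusion: alternating sum over all non-empty subcollections."""
    n = len(events)
    total = Fraction(0)
    for k in range(1, n + 1):
        for combo in combinations(events, k):
            total += (-1) ** (k + 1) * P(set.intersection(*combo))
    return total

direct = P(set.union(*events))      # P(union) computed directly
via_ie = union_prob_incl_excl(events)
```

The sign (−1)^{k+1} gives + for single events, − for pairs, + for triples, matching the formula above.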
Probability of intersection risk
If an event has probability 0.000001 on each of 365 independent trials, the probability that it never occurs is

(1 − 0.000001)^365 = 0.999635066,

so the probability that it occurs at least once is

1 − 0.999635066 = 0.000364934.
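The two numbers above follow from the complement rule applied across independent trials; a quick numeric check:

```python
# Probability of the event on a single trial, and the number of trials.
p_single = 0.000001
n = 365

p_none = (1 - p_single) ** n      # no occurrence in any of the n trials
p_at_least_one = 1 - p_none       # at least one occurrence
```

For small p the result is close to n·p = 0.000365, which is a useful sanity check.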
How to avoid inclusion–exclusion in computing union
risk?
P(A ∪ B ∪ C) = 1 − P((A ∪ B ∪ C)^c)
             = 1 − 0.995804
             = 0.004196
A generalization
Theorem 2.1
Let A_i, i = 1, …, n, be events. Then

P(⋃_{i=1}^n A_i) = 1 − P(⋂_{i=1}^n A_i^c).
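When the events are independent, Theorem 2.1 makes the union risk a one-liner: P(⋂ A_i^c) = ∏(1 − P(A_i)). A minimal sketch, where the individual risks are hypothetical figures chosen only for illustration (not the lecture's numbers):

```python
import math

# Hypothetical individual risks, for illustration only.
risks = [0.001, 0.002, 0.0008]

# Under independence: P(union) = 1 - P(no event occurs)
#                              = 1 - prod(1 - p_i).
p_union = 1 - math.prod(1 - p for p in risks)
```

Note the independence assumption: without it, P(⋂ A_i^c) does not factor into the product, and one must fall back on inclusion–exclusion.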
Random variables (revision)
x    {X ≤ x}                   F_X(x)
−1   ∅                         0
0    {TTT}                     1/8
1    {TTT, TTH, THT, HTT}      1/2
2    {HHH}^c                   7/8
3    Ω                         1
4    Ω                         1
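The table above (X = number of heads in three fair coin tosses) can be reproduced by enumerating the eight equally likely outcomes and counting, with F_X(x) = P(X ≤ x):

```python
from fractions import Fraction
from itertools import product

# All 8 outcomes of three fair coin tosses; X = number of heads.
outcomes = list(product("HT", repeat=3))

def F(x):
    """CDF of X: P(X <= x) under the uniform distribution on outcomes."""
    favourable = [o for o in outcomes if o.count("H") <= x]
    return Fraction(len(favourable), len(outcomes))

cdf_values = {x: F(x) for x in [-1, 0, 1, 2, 3, 4]}
```

Each row of the table corresponds to one entry of `cdf_values`; for example F(2) counts every outcome except HHH, giving 7/8.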
Example (cont.)
▶ The variance of X:

σ_X^2 = Var(X) = E{[X − E(X)]^2} = Σ_k (x_k − µ_X)^2 p_X(x_k).
▶ The Binomial random variable describes the total number of
successes in a random experiment consisting of n independent
Bernoulli trials, each trial having probability of success p and
probability of failure q = 1 − p.
▶ Using the binomial expansion theorem it follows that
µ_X = E(X) = np and σ_X^2 = np(1 − p).
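The identities µ_X = np and σ_X^2 = np(1 − p) can be checked by summing directly over the pmf; a sketch with n = 10, p = 0.3, which are arbitrary illustrative parameter values:

```python
from math import comb

n, p = 10, 0.3   # illustrative parameter choices

# Binomial pmf: P(X = k) = C(n, k) p^k (1 - p)^(n - k).
pmf = [comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)]

# Mean and variance computed from the definition.
mean = sum(k * pmf[k] for k in range(n + 1))
var = sum((k - mean) ** 2 * pmf[k] for k in range(n + 1))
```

Both sums agree with the closed forms np = 3 and np(1 − p) = 2.1 up to floating-point error.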
Binomial distribution
Figure: pmf of the Binomial distribution, plotted against k = 0, 1, …, 10.
U = max_{i=1,…,N} X_i.

Then:
Definition 1
The events {An } form a partition of the sample space Ω if the
following holds:
▶ ⋃_n A_n = Ω;
▶ A_i ∩ A_j = ∅ for i ≠ j.
Theorem 2.2
(Law of total probability) If the events {An } form a partition, then
P(B) = Σ_n P(B|A_n) P(A_n).
Total Probability
Figure: Total probability P(B) = Σ_n P(B|A_n) P(A_n)
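A numeric sanity check of the law of total probability, using a two-event partition with hypothetical probabilities chosen only for illustration (not the figures from the example that follows):

```python
# Hypothetical partition {A1, A2} with P(A1) + P(A2) = 1.
P_A = {"A1": 0.4, "A2": 0.6}

# Hypothetical conditional probabilities P(B | An).
P_B_given_A = {"A1": 0.2, "A2": 0.7}

# Law of total probability: P(B) = sum_n P(B | An) P(An).
P_B = sum(P_B_given_A[a] * P_A[a] for a in P_A)
```

The partition requirement matters: the formula weights each conditional probability by how likely its cell of the partition is, and the weights must sum to 1.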
Example 2.3
An engineering firm needs to decide whether to
develop a new product. Profit depends on whether there is
competition at the time the product is released to the
market. The probability of having competition (s1) or no
competition (s2) depends on market demand. Let L = demand
low and H = demand high. The following table gives the
probabilities of H and L and the conditional probabilities of
having competition.
Calculate P(s1 ).
Example (cont.)
P(A|B) = P(B|A)P(A) / [P(B|A)P(A) + P(B|A^c)P(A^c)]
       = (0.99)(0.001) / [(0.99)(0.001) + (0.005)(0.999)]
       = 0.165
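The Bayes computation above can be verified directly from the slide's figures:

```python
# Figures from the slide: P(A) = 0.001, P(B|A) = 0.99, P(B|A^c) = 0.005.
P_A = 0.001
P_B_given_A = 0.99
P_B_given_Ac = 0.005

# Bayes' rule, with the denominator expanded by total probability.
numerator = P_B_given_A * P_A
denominator = numerator + P_B_given_Ac * (1 - P_A)
P_A_given_B = numerator / denominator
```

Despite the test being 99% sensitive, the posterior is only about 0.165 because the prior P(A) = 0.001 is so small relative to the false-positive rate.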
Using sampling to reduce uncertainty (cont.)