Lecture 2: Review of Probability Theory
A. W. Umrani
Outline
WHAT IS PROBABILITY
ORIGINS OF PROBABILITY
WHY STUDY PROBABILITY?
PROBABILITY IN COMMUNICATIONS
SOME BASICS
WHAT IS PROBABILITY
WHY STUDY PROBABILITY?
Random Experiment
– The concept underlying probability is the random experiment.
– A random experiment is an experiment that can be repeated over and over, possibly giving different results each time.
– Equivalently, a random experiment is a process whose outcome is uncertain. For example:
– Tossing a coin once or several times
– Picking a card or cards from a deck
– Measuring the temperatures of patients
SOME BASICS
Example
Check the weather each day for 10 days. Each time we check, it is a random experiment.
Axioms of Probability
– For any event A, 0 ≤ P(A) ≤ 1.
– P(Ω) = 1.
– If A1, A2, …, An are mutually exclusive events whose union is A (a partition of A), then
P(A) = P(A1) + P(A2) + … + P(An)
SOME BASICS
Properties of Probability
– For any event A, P(Ac) = 1 - P(A).
– If A ⊂ B, then P(A) ≤ P(B).
– For any two events A and B,
P(A ∪ B) = P(A) + P(B) - P(A ∩ B).
– For any three events A, B, and C,
P(A∪B∪C) = P(A) + P(B) + P(C) - P(A∩B) - P(A∩C) - P(B∩C) + P(A∩B∩C).
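The inclusion-exclusion property above can be checked directly by counting. A minimal sketch (not from the lecture), assuming a sample space of 20 equally likely outcomes with two illustrative events A and B:

```python
# Sketch: verify P(A ∪ B) = P(A) + P(B) - P(A ∩ B) on a finite sample
# space where each of the 20 outcomes is equally likely.
omega = set(range(20))
A = {x for x in omega if x % 2 == 0}   # even outcomes
B = {x for x in omega if x < 8}        # outcomes below 8

def prob(event):
    """Probability of an event under equally likely outcomes."""
    return len(event) / len(omega)

lhs = prob(A | B)                       # P(A ∪ B)
rhs = prob(A) + prob(B) - prob(A & B)   # inclusion-exclusion
print(lhs, rhs)  # the two sides agree (up to float rounding)
```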
SOME BASICS
The probability of an event A satisfies 0 ≤ P(A) ≤ 1.
The probability of tossing a fair coin and getting heads is 0.5.
So if I toss a coin 10 times, will I then get exactly 5 heads? Not necessarily: 0.5 is a long-run relative frequency, not a guarantee for any particular 10 tosses.
The probability of getting 10 heads in ten throws is (0.5)^10, or 1 in 1024 (each throw is an independent event).
If every outcome of an experiment falls in exactly one of the events A, B, or C, then P(A) + P(B) + P(C) = 1 (exhaustive and mutually exclusive events).
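The coin-toss claims above can be checked by simulation. A sketch (not part of the lecture; the trial count and seed are arbitrary choices):

```python
import random

# Simulate 10 fair coin tosses many times: exactly 5 heads is common but
# far from guaranteed, while 10 heads appears at roughly 1/1024.
random.seed(1)

trials = 100_000
five_heads = 0
ten_heads = 0
for _ in range(trials):
    heads = sum(random.random() < 0.5 for _ in range(10))
    five_heads += (heads == 5)
    ten_heads += (heads == 10)

print(five_heads / trials)  # near C(10,5)/2^10 ≈ 0.246, not 1
print(ten_heads / trials)   # near 1/1024 ≈ 0.000977
```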
Sample Space and Events
[Venn diagram: events A and B within the sample space]
Conditional Probability
P(A | B) = (number of elements in A ∩ B) / (number of elements in B)
P(A | B) = (number of ways A and B can occur) / (number of ways B can occur)
Equivalently, in terms of probabilities, P(A | B) = P(A ∩ B) / P(B), provided P(B) > 0.
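The counting definition can be illustrated with a small assumed example (not from the lecture): roll one fair die, with A = "outcome is even" and B = "outcome is greater than 3".

```python
# Conditional probability by counting on a fair die.
omega = {1, 2, 3, 4, 5, 6}
A = {x for x in omega if x % 2 == 0}   # {2, 4, 6}
B = {x for x in omega if x > 3}        # {4, 5, 6}

# P(A | B) = |A ∩ B| / |B|
p_A_given_B = len(A & B) / len(B)
# Same value via probabilities: P(A ∩ B) / P(B)
p_A_given_B_alt = (len(A & B) / len(omega)) / (len(B) / len(omega))
print(p_A_given_B)  # = 2/3
```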
Solution of Example I
[Venn diagram: a partition A1, A2, A3 of the sample space, with event B intersecting each part]
Bayes’ Rule (named after the Reverend Thomas Bayes)
P(Ai | B) = P(Ai ∩ B) / P(B)
P(B | Ai) = P(Ai ∩ B) / P(Ai)
Bayes’ Rule (continued)
P(Ai | B) = P(Ai ∩ B) / P(B) = P(B | Ai) P(Ai) / P(B)
          = P(B | Ai) P(Ai) / [P(B | A1) P(A1) + P(B | A2) P(A2) + … + P(B | An) P(An)]
[Venn diagram: partition A1, A2, A3 of the sample space, with event B intersecting each part]
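Bayes' rule over a partition can be written as a short function. A sketch with hypothetical numbers (the priors and likelihoods below are illustrative, not from the lecture):

```python
# Bayes' rule over a partition A1, ..., An: the denominator is the
# total probability of B.
def bayes(priors, likelihoods, i):
    """P(Ai | B) given P(Aj) and P(B | Aj) for every j in the partition."""
    p_B = sum(p * l for p, l in zip(priors, likelihoods))
    return priors[i] * likelihoods[i] / p_B

# Hypothetical numbers: three equally likely hypotheses.
priors = [1/3, 1/3, 1/3]
likelihoods = [0.9, 0.5, 0.1]   # P(B | Ai) for each hypothesis
posterior = bayes(priors, likelihoods, 0)
print(posterior)  # ≈ 0.9 / (0.9 + 0.5 + 0.1) = 0.6
```

With equal priors the posterior is just the normalized likelihood, which is why the answer reduces to 0.9 divided by the likelihood sum.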
Example I
Solution
a) Let D denote the event that the selected bit is in error. Let A, B and C denote the events that the selected bit is from transmitters A, B and C respectively.
P(D) = P(D ∩ [A ∪ B ∪ C]) = P(D ∩ A) + P(D ∩ B) + P(D ∩ C)
     = P(D|A) P(A) + P(D|B) P(B) + P(D|C) P(C)
     = (1/10 × 100/600) + (1/20 × 200/600) + (1/30 × 300/600) = 1/20
Example I (continued)
[Diagram: the transmitted bits partitioned among transmitters A, B, C, with error event D]
P(A|D) = P(A ∩ D) / P(D) = P(D|A) P(A) / [P(D|A) P(A) + P(D|B) P(B) + P(D|C) P(C)] = 1/3
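The arithmetic of Example I can be checked with exact fractions. The bit counts (100, 200, 300 out of 600) and error rates (1/10, 1/20, 1/30) are taken from the worked solution above:

```python
from fractions import Fraction as F

# Exact check of the Example I numbers.
p_src = {'A': F(100, 600), 'B': F(200, 600), 'C': F(300, 600)}
p_err = {'A': F(1, 10),    'B': F(1, 20),    'C': F(1, 30)}

# Total probability of an erroneous bit
p_D = sum(p_err[t] * p_src[t] for t in 'ABC')
# Bayes: probability the erroneous bit came from transmitter A
p_A_given_D = p_err['A'] * p_src['A'] / p_D
print(p_D, p_A_given_D)  # 1/20 1/3
```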
Example II
A bit stream (e.g. 1 0 1 1 0 1 0 1 1 1) is sent over a comms channel with additive noise.
[Channel diagram: input 1 (event A) with probability P(A), input 0 (event Ac) with probability P(Ac); output 1 (event B) with probability P(B), output 0 (event Bc); transition probabilities P(B|A), P(Bc|A), P(B|Ac), P(Bc|Ac)]
Example II (continued)
P(B) = P(B|A) P(A) + P(B|Ac) P(Ac) = (0.9 × 0.6) + (0.05 × 0.4) = 0.56
Comments on Example II
[Channel diagram with numbers: P(A) = 0.6 (send 1), P(Ac) = 0.4 (send 0); P(B|A) = 0.9, P(Bc|A) = 0.1, P(B|Ac) = 0.05, P(Bc|Ac) = 0.95; hence P(B) = 0.56 and P(Bc) = 0.44]
A channel whose two crossover probabilities are equal is called a Binary Symmetric Channel (BSC); the channel here is not symmetric, since P(Bc|A) = 0.1 ≠ P(B|Ac) = 0.05.
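The channel computation generalizes: total probability gives P(B), and Bayes' rule then recovers the probability that a 1 was actually sent given that a 1 was received. A sketch using the Example II numbers (P(A) = 0.6, P(B|A) = 0.9, P(B|Ac) = 0.05):

```python
# Check of the Example II numbers.
p_A, p_Ac = 0.6, 0.4                 # P(send 1), P(send 0)
p_B_given_A, p_B_given_Ac = 0.9, 0.05

# Total probability that a 1 is received
p_B = p_B_given_A * p_A + p_B_given_Ac * p_Ac
# Bayes: probability a 1 was actually sent, given a 1 was received
p_A_given_B = p_B_given_A * p_A / p_B
print(p_B)          # ≈ 0.56
print(p_A_given_B)  # ≈ 0.54 / 0.56 ≈ 0.964
```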
[Figure: a noise waveform V(t) versus time t]
CONTINUOUS UNIFORMLY DISTRIBUTED R.V.
[Figure: sample waveform of a continuous uniformly distributed random variable, V(t) versus time t]
A discrete random variable X(n)
UNIFORMLY DISTRIBUTED DISCRETE R.V.
[Figure: sample sequence of a uniformly distributed discrete random variable, X(n) versus sample number n]
GAUSSIAN/NORMALLY DISTRIBUTED DISCRETE R.V.
[Figure: sample sequence of a Gaussian (normally) distributed random variable, X(n) versus sample number n]
Definition of a Random Variable (Discrete or Continuous)
[Figure: outcomes X(n) of repeated throws of a fair die versus throw number n; each face value xi = 1, …, 6 occurs with probability 1/6]
Examples of mass functions in
communications
Binomial: P(X = k) = nCk p^k (1 − p)^(n−k), k = 0, 1, 2, …, n
where µX = np and σX² = np(1 − p).
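The binomial mass function above is easy to evaluate directly. A sketch, with n = 10 and p = 0.5 assumed for illustration:

```python
from math import comb

# Binomial PMF: P(X = k) = nCk p^k (1-p)^(n-k).
def binomial_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 10, 0.5
pmf = [binomial_pmf(k, n, p) for k in range(n + 1)]
mean = sum(k * pk for k, pk in enumerate(pmf))
var = sum((k - mean)**2 * pk for k, pk in enumerate(pmf))
print(sum(pmf))   # ≈ 1 (probabilities sum to one)
print(mean, var)  # ≈ n*p = 5 and n*p*(1-p) = 2.5
```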
Poisson: P(X = k) = e^(−λ) λ^k / k!, k = 0, 1, 2, …
where µX = λ and σX² = λ (here λ = 5).
[Figure: Poisson PMF P(X = k) versus k, λ = 5]
Probability Mass Function for a Poisson Random Variable (2)
P(X = k) = e^(−λ) λ^k / k!, k = 0, 1, 2, …
where µX = λ and σX² = λ (here λ = 0.75).
[Figure: Poisson PMF P(X = k) versus k, λ = 0.75]
Probability Mass Function for a Poisson Random Variable (3)
P(X = k) = e^(−λ) λ^k / k!, k = 0, 1, 2, …
where µX = λ and σX² = λ (here λ = 20).
[Figure: Poisson PMF P(X = k) versus k, λ = 20]
Properties of Probability Mass Functions
P(X = xi) > 0, i = 1, 2, …, n
∑(i = 1 to n) P(X = xi) = 1
P(X ≤ x) = FX(x) = ∑(all xi ≤ x) P(X = xi)
MEAN(X): µX = E{X} = ∑(i = 1 to n) xi P(X = xi)
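These properties can be verified on a concrete PMF. A sketch using the fair die (xi = 1, …, 6, each with probability 1/6), with exact fractions to avoid rounding:

```python
from fractions import Fraction as F

# PMF of a fair die: P(X = xi) = 1/6 for xi = 1..6.
pmf = {x: F(1, 6) for x in range(1, 7)}

assert all(p > 0 for p in pmf.values())   # P(X = xi) > 0
assert sum(pmf.values()) == 1             # probabilities sum to 1

def cdf(x):
    """F_X(x) = P(X <= x) = sum of P(X = xi) over all xi <= x."""
    return sum(p for xi, p in pmf.items() if xi <= x)

mean = sum(xi * p for xi, p in pmf.items())
print(cdf(3), mean)  # 1/2 7/2
```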