Engineering Mathematics - IV
Module 3: Probability Distributions
Lecture: Probability, Conditional Probability and Bayes’ Theorem
Module 3: Probability:
3.1 Bayes’ Theorem. Discrete and continuous random variables - probability distribution and
probability density function.
3.2 Expectation, Variance
3.3 Probability Distributions - Poisson and Normal Distributions
Learning Objectives
At the end of this session, you will be able to
• Calculate the conditional probability of an event given that another event has occurred
• Obtain probabilities and unknown values using discrete and continuous probability distributions
Probability
Origins of the theory of Probability
Probability theory has its origins in gambling. The gambler Antoine Gombaud posed the ‘problem of points’ to the mathematician Blaise Pascal (1623 − 1662): two players roll dice, and the first to win a certain number of points takes the stake; how should the money be divided if the game is stopped midway? Pascal collaborated with Pierre de Fermat (1601 − 1665) and arrived at the ‘laws of chance’.
• James Bernoulli (1654 − 1705) wrote the first book on the theory of probability.
• Karl Pearson (1857 − 1936) worked on correlation analysis and introduced the chi-square
test.
• W.S. Gosset introduced the Student’s t-distribution, used in exact (small-sample) inference.
• Ronald A. Fisher (1890 − 1962) is called the Father of Statistics. He worked on Analysis
of Variance and Design of Experiments.
In probability theory, we deal with experiments whose results, known as outcomes, cannot be predicted with certainty (unlike the experiments that one conducts in the laboratory, which have a fixed outcome for given inputs). Such experiments are known as random experiments.
Examples: tossing a coin, throwing a die, counting the number of telephone calls received at a telephone exchange in a given time interval.
Definitions:
3. The set of all possible outcomes is known as the sample space of a given random experiment. It is denoted by S or Ω. In the above examples, S = {H, T} for the toss of a coin and S = {1, 2, 3, 4, 5, 6} for the throw of a die.
4. A subset of the sample space S is called an event.
5. Subsets of the sample space S having a single element are called elementary events.
6. Two events A and B are said to be equal if they consist of the same elements.
7. Events that cannot occur together are called mutually exclusive events. That is, the
occurrence of one of the events excludes the occurrence of the others. Mutually exclusive
events do not have elements in common.
Let a random experiment be performed. Note that the sample space can be finite, countably
infinite or uncountably infinite. The following examples will make it clear:
1. Finite:
(a) Random Experiment: Tossing a coin: S = {H, T}
(b) Random Experiment: Throwing a die: Here S = {1, 2, 3, 4, 5, 6}
Let A = {1, 3, 4, 5} and B = {2, 6}
Since A and B are subsets of the sample space S, they are events. Also A and B have no
elements in common. That is, if one of them happens, the other does not happen. Thus A and
B are mutually exclusive events.
2. Countably Infinite:
(a) Random Experiment: Registering the number of telephone calls received at a telephone
exchange in two hours, say between 10 am and 12 noon
S = {0, 1, 2, ...}
(b) Random Experiment: Counting the number of printing errors in a book of 150 pages
S = {0, 1, 2, ...}
3. Uncountable:
(a) Random Experiment: A record of the height of III year engineering students in DBIT (in
cm):
S = [0, 300]
(b) Random Experiment: Calculation of the total distance travelled by a II year Engineering
student in DBIT from home to college (in km) everyday:
S = [0, 500]
Shortcomings of this definition: the limit of the relative frequency used in the definition may not exist.
Conditional Probability
The conditional probability of an event A given that another event M has occurred is denoted by P(A|M) (read as ‘probability of A given M’) and is defined as
P(A|M) = P(A ∩ M)/P(M),   P(M) ≠ 0
Remark: The above definition gives
P(A ∩ M) = P(A|M) × P(M)
Similarly, P(A ∩ M) = P(M|A) × P(A)
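For a finite sample space with equally likely outcomes, the definition reduces to counting: P(A|M) = |A ∩ M|/|M|. The short Python sketch below illustrates this (the function name cond_prob and the use of exact fractions are our own choices, not part of the lecture):

from fractions import Fraction

def cond_prob(A, M):
    """P(A|M) when all outcomes of the finite sample space are equally likely."""
    A, M = set(A), set(M)
    if not M:
        raise ValueError("P(M) must be non-zero")
    # P(A|M) = |A ∩ M| / |M| for equally likely outcomes
    return Fraction(len(A & M), len(M))

# Die throw: A = even outcome, M = prime outcome (see the example below)
print(cond_prob({2, 4, 6}, {2, 3, 5}))   # 1/3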
Example:
1. Suppose a fair die is tossed. Let events A, B and C be respectively defined as follows:
A: The outcome is even;
B: The outcome is a prime number and
C: The outcome is greater than 2.
Find P (A|B), P (B|C), P (A|C). Is P (A|C) = P (A)?
Solution: We have
A = {2, 4, 6} ⇒ P(A) = 3/6 = 1/2,
B = {2, 3, 5} ⇒ P(B) = 3/6,
C = {3, 4, 5, 6} ⇒ P(C) = 4/6.
Also A ∩ B = {2} ⇒ P(A ∩ B) = 1/6,
B ∩ C = {3, 5} ⇒ P(B ∩ C) = 2/6,
and A ∩ C = {4, 6} ⇒ P(A ∩ C) = 2/6.
Now, P(A|B) = P(A ∩ B)/P(B) = (1/6)/(3/6) = 1/3,
P(B|C) = P(B ∩ C)/P(C) = (2/6)/(4/6) = 1/2,
and P(A|C) = P(A ∩ C)/P(C) = (2/6)/(4/6) = 1/2.
Since P(A|C) = 1/2 = P(A), we indeed have P(A|C) = P(A): knowing that C occurred does not change the probability of A.
Independence
Two events A and B are said to be independent if the occurrence of one does not affect the
probability of occurrence of the other. That is, if A and B are independent, then we should
have,
P (A|B) = P (A)
From the definition of conditional probability, this means
P (A ∩ B) = P (A) · P (B)
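As a quick check of this product rule on the die example above, the sketch below (assuming equally likely outcomes; our own code, not from the lecture) confirms that the events A (even) and C (greater than 2) are independent:

from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}        # outcome is even
C = {3, 4, 5, 6}     # outcome is greater than 2

def prob(E):
    # equally likely outcomes: P(E) = |E| / |S|
    return Fraction(len(E), len(S))

print(prob(A & C) == prob(A) * prob(C))   # True, so A and C are independent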
Bayes’ Theorem
Let A1, A2, ..., An be mutually exclusive and exhaustive events with P(Ai) ≠ 0 for each i, and let B be an event with P(B) ≠ 0. Then, for each k,
P(Ak|B) = P(B|Ak)P(Ak) / [P(B|A1)P(A1) + P(B|A2)P(A2) + ... + P(B|An)P(An)]
Examples:
1. It is known that 60% of the mails received are spam, that 90% of spam mails carry a forged header, and that only 20% of genuine (non-spam) mails do. A mail is found to have a forged header. What is the probability that it is spam?
Solution:
Let A1 : Mail is spam and
A2 : Mail is not spam
Let B : Mail has a forged header
Then from the data
P (A1 ) = 0.6, P (A2 ) = 0.4, P (B|A1 ) = 0.9, and P (B|A2 ) = 0.2
Required to find P (A1 |B)
By Bayes’ theorem
P(A1|B) = P(B|A1)P(A1) / [P(B|A1)P(A1) + P(B|A2)P(A2)]
        = (0.9)(0.6) / [(0.9)(0.6) + (0.2)(0.4)]
        = 0.54/0.62 ≈ 0.8710
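The same computation can be packaged as a small helper; the sketch below (our own function name bayes_posterior, not from the notes) computes P(Ak|B) from the priors P(Ai) and likelihoods P(B|Ai) and reproduces the value above:

def bayes_posterior(priors, likelihoods, k):
    """Posterior P(A_k | B) from priors P(A_i) and likelihoods P(B | A_i)."""
    total = sum(p * l for p, l in zip(priors, likelihoods))   # total probability P(B)
    return priors[k] * likelihoods[k] / total

# Spam example: priors [P(A1), P(A2)], likelihoods [P(B|A1), P(B|A2)]
print(round(bayes_posterior([0.6, 0.4], [0.9, 0.2], 0), 4))   # 0.871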
2. There are 6 true coins and 1 false coin with ‘head’ on both sides. A coin is chosen at
random and tossed 4 times. If ‘head’ occurs all the 4 times, what is the probability that
the false coin has been chosen and used?
Solution:
Let A1 : true coin is chosen and
A2 : false coin is chosen
Let B : ‘head’ occurs in all the 4 tosses
Then from the data
P(A1) = 6/7, P(A2) = 1/7, P(B|A1) = (1/2)^4 = 1/16, and P(B|A2) = 1
Required to find P (A2 |B)
By Bayes’ theorem
P(A2|B) = P(B|A2)P(A2) / [P(B|A1)P(A1) + P(B|A2)P(A2)]
        = (1)(1/7) / [(1/16)(6/7) + (1)(1/7)]
        = (1/7)/(11/56) = 8/11 ≈ 0.7273
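A short Monte Carlo simulation (our own sketch, not part of the notes) gives an independent check of this answer by repeatedly choosing a coin at random, tossing it four times, and keeping only the trials in which all four tosses show heads:

import random

def estimate_false_given_four_heads(trials=200_000):
    four_heads = false_and_four_heads = 0
    for _ in range(trials):
        false_coin = (random.randrange(7) == 0)   # 1 false coin among the 7 coins
        if false_coin:
            all_heads = True                       # the two-headed coin always shows heads
        else:
            all_heads = all(random.random() < 0.5 for _ in range(4))
        if all_heads:
            four_heads += 1
            false_and_four_heads += false_coin
    return false_and_four_heads / four_heads

print(estimate_false_given_four_heads())   # close to 0.727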
3. A man speaks the truth 3 times out of 5. When a die is thrown, he states that it showed an ace. What is the probability that an ace actually appeared?
Solution:
Let A1 : ’Ace’ appears in the die throw and
A2 : ’Ace’ does not appear in the die throw
Let B : Man says ‘ace’(i.e 1) is the outcome
Then from the data
P(A1) = 1/6, P(A2) = 5/6, P(B|A1) = 3/5, and P(B|A2) = 2/5
Required to find P (A1 |B)
By Bayes’ theorem
P(A1|B) = P(B|A1)P(A1) / [P(B|A1)P(A1) + P(B|A2)P(A2)]
        = (3/5)(1/6) / [(3/5)(1/6) + (2/5)(5/6)]
        = (3/30)/(13/30) = 3/13 ≈ 0.2308
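An exact-fraction check of this value (a minimal sketch, not from the notes):

from fractions import Fraction

numerator = Fraction(3, 5) * Fraction(1, 6)                 # P(B|A1) P(A1)
denominator = numerator + Fraction(2, 5) * Fraction(5, 6)   # total probability P(B)
print(numerator / denominator, float(numerator / denominator))   # 3/13 ≈ 0.2308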
(ii) P(a one was transmitted, given that a one was received) = P(A1|B1)
By Bayes’ theorem,
P(A1|B1) = P(B1|A1)P(A1) / [P(B1|A0)P(A0) + P(B1|A1)P(A1)]
         = (0.85)(0.7) / [(0.1)(0.3) + (0.85)(0.7)]
         = 0.595/0.625 = 0.952
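A one-line numerical check of the channel result (our own sketch):

print(round(0.85 * 0.7 / (0.1 * 0.3 + 0.85 * 0.7), 3))   # 0.952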
5. In a factory, four machines A1, A2, A3 and A4 produce 10%, 25%, 35% and 30% of the
items respectively. The percentage of defective items produced by them is 5%, 4%, 3%
and 2% respectively. An item is selected at random. What is the probability that it is
defective?
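By the theorem of total probability, the required probability is 0.10(0.05) + 0.25(0.04) + 0.35(0.03) + 0.30(0.02) = 0.0315. The short Python check below (our own sketch) confirms the arithmetic:

# Total probability of a defective item: sum of P(machine) * P(defective | machine)
shares = [0.10, 0.25, 0.35, 0.30]         # production shares of A1, A2, A3, A4
defect_rates = [0.05, 0.04, 0.03, 0.02]   # fraction of defective items per machine
p_defective = sum(s * d for s, d in zip(shares, defect_rates))
print(round(p_defective, 4))   # 0.0315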
Practice Problems
1. A manufacturing plant makes radios that each contain an integrated circuit (IC), supplied
by three sources A, B and C. The probability that the IC in a radio came from one of
the sources is 1/3, the same for all three sources. ICs are known to be defective with probabilities
0.001, 0.003 and 0.002 for sources A, B and C respectively. (a) What is the probability
that any given radio will contain a defective IC? (b) If a radio contains a defective IC,
find the probability that it came from source A. Repeat for sources B and C.
2. A mechanism consists of three parts A, B and C, whose probabilities of failure are p, q and r respectively. The mechanism works only if none of these parts fails. Find the probability that (i) the mechanism works and (ii) the mechanism does not work.