
Chapter 2

Conditional probability

In this chapter we develop the technique of conditional probability to deal with


cases where events are not independent.

2.1 What is conditional probability?


Alice and Bob are going out to dinner. They toss a fair coin ‘best of three’ to
decide who pays: if there are more heads than tails in the three tosses then Alice
pays, otherwise Bob pays.
Clearly each has a 50% chance of paying. The sample space is

S = {HHH, HHT, HT H, HT T, T HH, T HT, T T H, T T T },


and the events ‘Alice pays’ and ‘Bob pays’ are respectively

A = {HHH, HHT, HT H, T HH},


B = {HT T, T HT, T T H, T T T }.

They toss the coin once and the result is heads; call this event E. How should
we now reassess their chances? We have

E = {HHH, HHT, HT H, HT T },

and if we are given the information that the result of the first toss is heads, then E
now becomes the sample space of the experiment, since the outcomes not in E are
no longer possible. In the new experiment, the outcomes ‘Alice pays’ and ‘Bob
pays’ are

A ∩ E = {HHH, HHT, HT H},


B ∩ E = {HT T }.


Thus the new probabilities that Alice and Bob pay for dinner are 3/4 and 1/4
respectively.
In general, suppose that we are given that an event E has occurred, and we
want to compute the probability that another event A occurs. In general, we can no
longer count, since the outcomes may not be equally likely. The correct definition
is as follows.
Let E be an event with non-zero probability, and let A be any event. The
conditional probability of A given E is defined as

P(A | E) = P(A ∩ E) / P(E).

Again I emphasise that this is the definition. If you are asked for the definition
of conditional probability, it is not enough to say “the probability of A given that
E has occurred”, although this is the best way to understand it. There is no reason
why event E should occur before event A!
Note the vertical bar in the notation. This is P(A | E), not P(A/E) or P(A \ E).
Note also that the definition only applies in the case where P(E) is not equal
to zero, since we have to divide by it, and this would make no sense if P(E) = 0.
To check the formula in our example:

P(A | E) = P(A ∩ E)/P(E) = (3/8)/(1/2) = 3/4,
P(B | E) = P(B ∩ E)/P(E) = (1/8)/(1/2) = 1/4.
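These counting calculations are easy to check mechanically. Here is a minimal Python sketch (not part of the original text; the names follow the events above), which enumerates the eight outcomes and counts:

```python
from fractions import Fraction
from itertools import product

# Enumerate the eight equally likely outcomes of three coin tosses.
S = {''.join(t) for t in product('HT', repeat=3)}

A = {s for s in S if s.count('H') > s.count('T')}   # Alice pays
B = {s for s in S if s.count('T') > s.count('H')}   # Bob pays
E = {s for s in S if s[0] == 'H'}                   # first toss is heads

def cond_prob(X, E):
    """P(X | E) by counting, valid when all outcomes are equally likely."""
    return Fraction(len(X & E), len(E))

p_A_given_E = cond_prob(A, E)   # should be 3/4
p_B_given_E = cond_prob(B, E)   # should be 1/4
```

Counting only works because the outcomes are equally likely; the general definition below does not need that assumption.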

It may seem like a small matter, but you should be familiar enough with this
formula that you can write it down without stopping to think about the names of
the events. Thus, for example,

P(A | B) = P(A ∩ B)/P(B)

if P(B) ≠ 0.

Example A random car is chosen among all those passing through Trafalgar
Square on a certain day. The probability that the car is yellow is 3/100; the
probability that the driver is blonde is 1/5; and the probability that the car is
yellow and the driver is blonde is 1/50.
Find the conditional probability that the driver is blonde given that the car is
yellow.

Solution: If Y is the event ‘the car is yellow’ and B the event ‘the driver is blonde’,
then we are given that P(Y ) = 0.03, P(B) = 0.2, and P(Y ∩ B) = 0.02. So
P(B | Y) = P(B ∩ Y)/P(Y) = 0.02/0.03 = 0.667
to 3 d.p. Note that we haven’t used all the information given.
There is a connection between conditional probability and independence:
Proposition 2.1 Let A and B be events with P(B) ≠ 0. Then A and B are indepen-
dent if and only if P(A | B) = P(A).

Proof The words ‘if and only if’ tell us that we have two jobs to do: we have to
show that if A and B are independent, then P(A | B) = P(A); and that if P(A | B) =
P(A), then A and B are independent.
So first suppose that A and B are independent. Remember that this means that
P(A ∩ B) = P(A) · P(B). Then
P(A | B) = P(A ∩ B)/P(B) = P(A) · P(B)/P(B) = P(A),
that is, P(A | B) = P(A), as we had to prove.
Now suppose that P(A | B) = P(A). In other words,
P(A ∩ B)/P(B) = P(A),
using the definition of conditional probability. Now clearing fractions gives
P(A ∩ B) = P(A) · P(B),
which is just what the statement ‘A and B are independent’ means.
This proposition is most likely what people have in mind when they say ‘A
and B are independent means that B has no effect on A’.

2.2 Genetics
Here is a simplified version of how genes code eye colour, assuming only two
colours of eyes.
Each person has two genes for eye colour. Each gene is either B or b. A child
receives one gene from each of its parents. The gene it receives from its father
is one of its father’s two genes, each with probability 1/2; and similarly for its
mother. The genes received from father and mother are independent.
If your genes are BB or Bb or bB, you have brown eyes; if your genes are bb,
you have blue eyes.

Example Suppose that John has brown eyes. So do both of John’s parents. His
sister has blue eyes. What is the probability that John’s genes are BB?

Solution John’s sister has genes bb, so one b must have come from each parent.
Thus each of John’s parents is Bb or bB; we may assume Bb. So the possibilities
for John are (writing the gene from his father first)

BB, Bb, bB, bb

each with probability 1/4. (For example, John gets his father’s B gene with prob-
ability 1/2 and his mother’s B gene with probability 1/2, and these are indepen-
dent, so the probability that he gets BB is 1/4. Similarly for the other combina-
tions.)
Let X be the event ‘John has BB genes’ and Y the event ‘John has brown
eyes’. Then X = {BB} and Y = {BB, Bb, bB}. The question asks us to calculate
P(X | Y ). This is given by
P(X | Y) = P(X ∩ Y)/P(Y) = (1/4)/(3/4) = 1/3.
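Again the counting argument can be checked mechanically; here is a small Python sketch (the set names X and Y follow the text):

```python
from fractions import Fraction

# The four equally likely gene pairs for a child of two Bb parents
# (father's gene written first), as in the text.
outcomes = {'BB', 'Bb', 'bB', 'bb'}

X = {'BB'}                # John has genes BB
Y = {'BB', 'Bb', 'bB'}    # John has brown eyes

# With equally likely outcomes, P(X | Y) = |X ∩ Y| / |Y|.
p_X_given_Y = Fraction(len(X & Y), len(Y))
```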

2.3 The Theorem of Total Probability


Sometimes we are faced with a situation where we do not know the probability of
an event B, but we know what its probability would be if we were sure that some
other event had occurred.

Example An ice-cream seller has to decide whether to order more stock for the
Bank Holiday weekend. He estimates that, if the weather is sunny, he has a 90%
chance of selling all his stock; if it is cloudy, his chance is 60%; and if it rains, his
chance is only 20%. According to the weather forecast, the probability of sunshine
is 30%, the probability of cloud is 45%, and the probability of rain is 25%. (We
assume that these are all the possible outcomes, so that their probabilities must
add up to 100%.) What is the overall probability that the salesman will sell all his
stock?
This problem is answered by the Theorem of Total Probability, which we now
state. First we need a definition. The events A1 , A2 , . . . , An form a partition of the
sample space if the following two conditions hold:
(a) the events are pairwise disjoint, that is, Ai ∩ Aj = ∅ for any pair of events Ai
and Aj;
(b) A1 ∪ A2 ∪ · · · ∪ An = S.

Another way of saying the same thing is that every outcome in the sample space
lies in exactly one of the events A1 , A2 , . . . , An . The picture shows the idea of a
partition.

[Figure: the sample space divided into disjoint regions A1, A2, . . . , An]

Now we state and prove the Theorem of Total Probability.

Theorem 2.2 Let A1, A2, . . . , An form a partition of the sample space with P(Ai) ≠ 0
for all i, and let B be any event. Then

P(B) = ∑_{i=1}^n P(B | Ai) · P(Ai).

Proof By definition, P(B | Ai ) = P(B ∩ Ai )/P(Ai ). Multiplying up, we find that

P(B ∩ Ai ) = P(B | Ai ) · P(Ai ).

Now consider the events B ∩ A1 , B ∩ A2 , . . . , B ∩ An . These events are pairwise


disjoint; for any outcome lying in both B ∩ Ai and B ∩ A j would lie in both Ai and
A j , and by assumption there are no such outcomes. Moreover, the union of all
these events is B, since every outcome lies in one of the Ai . So, by Axiom 3, we
conclude that
∑_{i=1}^n P(B ∩ Ai) = P(B).
Substituting our expression for P(B ∩ Ai ) gives the result.

 
[Figure: the event B cut by the partition into the pieces B ∩ A1, B ∩ A2, . . . , B ∩ An]

Consider the ice-cream salesman at the start of this section. Let A1 be the
event ‘it is sunny’, A2 the event ‘it is cloudy’, and A3 the event ‘it is rainy’. Then
A1 , A2 and A3 form a partition of the sample space, and we are given that

P(A1 ) = 0.3, P(A2 ) = 0.45, P(A3 ) = 0.25.



Let B be the event ‘the salesman sells all his stock’. The other information we are
given is that

P(B | A1 ) = 0.9, P(B | A2 ) = 0.6, P(B | A3 ) = 0.2.

By the Theorem of Total Probability,

P(B) = (0.9 × 0.3) + (0.6 × 0.45) + (0.2 × 0.25) = 0.59.
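The same weighted sum is easy to compute in code. A minimal Python sketch (the dictionary names are illustrative, not from the text):

```python
# Theorem of Total Probability: P(B) is the sum over the partition of
# P(B | Ai) * P(Ai), applied to the ice-cream example.
p_weather = {'sunny': 0.30, 'cloudy': 0.45, 'rain': 0.25}     # P(Ai)
p_sell_given = {'sunny': 0.90, 'cloudy': 0.60, 'rain': 0.20}  # P(B | Ai)

p_sell = sum(p_sell_given[w] * p_weather[w] for w in p_weather)
```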

You will now realise that the Theorem of Total Probability is really being used
when you calculate probabilities by tree diagrams. It is better to get into the habit
of using it directly, since it avoids any accidental assumptions of independence.
One special case of the Theorem of Total Probability is very commonly used,
and is worth stating in its own right. For any event A, the events A and A′ form a
partition of S. To say that both A and A′ have non-zero probability is just to say
that P(A) ≠ 0, 1. Thus we have the following corollary:

Corollary 2.3 Let A and B be events, and suppose that P(A) ≠ 0, 1. Then

P(B) = P(B | A) · P(A) + P(B | A′) · P(A′).

2.4 Sampling revisited


We can use the notion of conditional probability to treat sampling problems in-
volving ordered samples.

Example I have two red pens, one green pen, and one blue pen. I select two
pens without replacement.
(a) What is the probability that the first pen chosen is red?

(b) What is the probability that the second pen chosen is red?
For the first pen, there are four pens of which two are red, so the chance of
selecting a red pen is 2/4 = 1/2.
For the second pen, we must separate cases. Let A1 be the event ‘first pen red’,
A2 the event ‘first pen green’ and A3 the event ‘first pen blue’. Then P(A1 ) = 1/2,
P(A2 ) = P(A3 ) = 1/4 (arguing as above). Let B be the event ‘second pen red’.
If the first pen is red, then only one of the three remaining pens is red, so that
P(B | A1 ) = 1/3. On the other hand, if the first pen is green or blue, then two of
the remaining pens are red, so P(B | A2 ) = P(B | A3 ) = 2/3.

By the Theorem of Total Probability,

P(B) = P(B | A1 )P(A1 ) + P(B | A2 )P(A2 ) + P(B | A3 )P(A3 )


= (1/3) × (1/2) + (2/3) × (1/4) + (2/3) × (1/4)
= 1/2.

We have reached by a roundabout argument a conclusion which you might


think to be obvious. If we have no information about the first pen, then the second
pen is equally likely to be any one of the four, and the probability should be 1/2,
just as for the first pen. This argument happens to be correct. But, until your
ability to distinguish between correct arguments and plausible-looking false ones
is very well developed, you may be safer to stick to the calculation that we did.
Beware of obvious-looking arguments in probability! Many clever people have
been caught out.
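If you prefer to check the pen calculation by brute force, the following Python sketch enumerates all ordered draws without replacement (the pen labels are mine):

```python
from fractions import Fraction
from itertools import permutations

# Four distinguishable pens, two of them red; draw two without replacement.
pens = ['red1', 'red2', 'green', 'blue']
draws = list(permutations(pens, 2))   # 12 equally likely ordered pairs

def prob_red(position):
    """Probability that the pen in the given position (0 or 1) is red."""
    hits = sum(1 for d in draws if d[position].startswith('red'))
    return Fraction(hits, len(draws))

p_first_red = prob_red(0)    # 1/2
p_second_red = prob_red(1)   # 1/2, agreeing with the text
```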

2.5 Bayes’ Theorem


There is a very big difference between P(A | B) and P(B | A).
Suppose that a new test is developed to identify people who are liable to suffer
from some genetic disease in later life. Of course, no test is perfect; there will be
some carriers of the defective gene who test negative, and some non-carriers who
test positive. So, for example, let A be the event ‘the patient is a carrier’, and B
the event ‘the test result is positive’.
The scientists who develop the test are concerned with the probabilities that
the test result is wrong, that is, with P(B | A′) and P(B′ | A). However, a patient
who has taken the test has different concerns. If I tested positive, what is the
chance that I have the disease? If I tested negative, how sure can I be that I am not
a carrier? In other words, P(A | B) and P(A′ | B′).
These conditional probabilities are related by Bayes’ Theorem:

Theorem 2.4 Let A and B be events with non-zero probability. Then

P(A | B) = P(B | A) · P(A) / P(B).
The proof is not hard. We have

P(A | B) · P(B) = P(A ∩ B) = P(B | A) · P(A),

using the definition of conditional probability twice. (Note that we need both A
and B to have non-zero probability here.) Now divide this equation by P(B) to get
the result.

If P(A) ≠ 0, 1 and P(B) ≠ 0, then we can use Corollary 2.3 to write this as

P(A | B) = P(B | A) · P(A) / (P(B | A) · P(A) + P(B | A′) · P(A′)).

Bayes’ Theorem is often stated in this form.

Example Consider the ice-cream salesman from Section 2.3. Given that he sold
all his stock of ice-cream, what is the probability that the weather was sunny?
(This question might be asked by the warehouse manager who doesn’t know what
the weather was actually like.) Using the same notation that we used before, A1
is the event ‘it is sunny’ and B the event ‘the salesman sells all his stock’. We are
asked for P(A1 | B). We were given that P(B | A1 ) = 0.9 and that P(A1 ) = 0.3, and
we calculated that P(B) = 0.59. So by Bayes’ Theorem,
P(A1 | B) = P(B | A1)P(A1)/P(B) = (0.9 × 0.3)/0.59 = 0.46
to 2 d.p.
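The inversion from P(B | A1) to P(A1 | B) is one line of arithmetic; a small Python sketch (variable names mine):

```python
# Bayes' Theorem for the ice-cream example: P(A1 | B) = P(B | A1) P(A1) / P(B).
p_sunny = 0.3                  # P(A1)
p_sell_given_sunny = 0.9       # P(B | A1)
p_sell = 0.59                  # P(B), from the Theorem of Total Probability

p_sunny_given_sell = p_sell_given_sunny * p_sunny / p_sell
```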

Example Consider the clinical test described at the start of this section. Suppose
that 1 in 1000 of the population is a carrier of the disease. Suppose also that the
probability that a carrier tests negative is 1%, while the probability that a non-
carrier tests positive is 5%. (A test achieving these values would be regarded as
very successful.) Let A be the event ‘the patient is a carrier’, and B the event ‘the
test result is positive’. We are given that P(A) = 0.001 (so that P(A′) = 0.999),
and that
P(B | A) = 0.99,  P(B | A′) = 0.05.
(a) A patient has just had a positive test result. What is the probability that the
patient is a carrier? The answer is
P(A | B) = P(B | A)P(A) / (P(B | A)P(A) + P(B | A′)P(A′))
= (0.99 × 0.001) / ((0.99 × 0.001) + (0.05 × 0.999))
= 0.00099/0.05094 = 0.0194.
(b) A patient has just had a negative test result. What is the probability that the
patient is a carrier? The answer is
P(A | B′) = P(B′ | A)P(A) / (P(B′ | A)P(A) + P(B′ | A′)P(A′))
= (0.01 × 0.001) / ((0.01 × 0.001) + (0.95 × 0.999))
= 0.00001/0.94906 = 0.00001.
So a patient with a negative test result can be reassured; but a patient with a posi-
tive test result still has less than 2% chance of being a carrier, so is likely to worry
unnecessarily.
Of course, these calculations assume that the patient has been selected at ran-
dom from the population. If the patient has a family history of the disease, the
calculations would be quite different.
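Both posterior probabilities can be computed with the expanded form of Bayes’ Theorem; a Python sketch of the calculation (variable names mine):

```python
# Clinical test: prior P(A) = 0.001, error rates as in the text.
p_carrier = 0.001
p_pos_given_carrier = 0.99      # P(B | A)
p_pos_given_noncarrier = 0.05   # P(B | A')

# Expanded Bayes for a positive result.
p_pos = (p_pos_given_carrier * p_carrier
         + p_pos_given_noncarrier * (1 - p_carrier))
p_carrier_given_pos = p_pos_given_carrier * p_carrier / p_pos

# Expanded Bayes for a negative result.
p_neg = 1 - p_pos
p_carrier_given_neg = (1 - p_pos_given_carrier) * p_carrier / p_neg
```

The striking conclusion of the text is visible in the numbers: the positive-test posterior is under 2%, because carriers are so rare that most positive results are false positives.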

Example 2% of the population have a certain blood disease in a serious form;


10% have it in a mild form; and 88% don’t have it at all. A new blood test is
developed; the probability of testing positive is 9/10 if the subject has the serious
form, 6/10 if the subject has the mild form, and 1/10 if the subject doesn’t have
the disease.
I have just tested positive. What is the probability that I have the serious form
of the disease?
Let A1 be ‘has disease in serious form’, A2 be ‘has disease in mild form’, and
A3 be ‘doesn’t have disease’. Let B be ‘test positive’. Then we are given that A1 ,
A2 , A3 form a partition and
P(A1) = 0.02,  P(A2) = 0.1,  P(A3) = 0.88,
P(B | A1) = 0.9,  P(B | A2) = 0.6,  P(B | A3) = 0.1.
Thus, by the Theorem of Total Probability,
P(B) = 0.9 × 0.02 + 0.6 × 0.1 + 0.1 × 0.88 = 0.166,
and then by Bayes’ Theorem,
P(A1 | B) = P(B | A1)P(A1)/P(B) = (0.9 × 0.02)/0.166 = 0.108
to 3 d.p.
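The three-way partition works exactly like the two-way one; a Python sketch combining total probability with Bayes (dictionary keys are mine):

```python
# Blood-disease example: three-way partition of the population.
priors = {'serious': 0.02, 'mild': 0.10, 'none': 0.88}     # P(Ai)
p_pos_given = {'serious': 0.9, 'mild': 0.6, 'none': 0.1}   # P(B | Ai)

# Theorem of Total Probability, then Bayes' Theorem.
p_pos = sum(p_pos_given[k] * priors[k] for k in priors)
p_serious_given_pos = p_pos_given['serious'] * priors['serious'] / p_pos
```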

2.6 Iterated conditional probability


The conditional probability of C, given that both A and B have occurred, is just
P(C | A ∩ B). Sometimes instead we just write P(C | A, B). It is given by
P(C | A, B) = P(C ∩ A ∩ B) / P(A ∩ B),

so
P(A ∩ B ∩C) = P(C | A, B)P(A ∩ B).
Now we also have
P(A ∩ B) = P(B | A)P(A),
so finally (assuming that P(A ∩ B) ≠ 0), we have

P(A ∩ B ∩C) = P(C | A, B)P(B | A)P(A).

This generalises to any number of events:

Proposition 2.5 Let A1, . . . , An be events. Suppose that P(A1 ∩ · · · ∩ An−1) ≠ 0.


Then

P(A1 ∩ A2 ∩ · · · ∩ An ) = P(An | A1 , . . . , An−1 ) · · · P(A2 | A1 )P(A1 ).

We apply this to the birthday paradox.


The birthday paradox is the following statement:
If there are 23 or more people in a room, then the chances are better
than even that two of them have the same birthday.
To simplify the analysis, we ignore 29 February, and assume that the other 365
days are all equally likely as birthdays of a random person. (This is not quite true,
but the deviation is too small to have much effect on the conclusion.) Suppose that
we have n people p1, p2, . . . , pn. Let A2 be the event ‘p2 has a different birthday
from p1’. Then P(A2) = 1 − 1/365, since whatever p1’s birthday is, there is a 1 in
365 chance that p2 will have the same birthday.
Let A3 be the event ‘p3 has a different birthday from p1 and p2 ’. It is not
straightforward to evaluate P(A3 ), since we have to consider whether p1 and p2
have the same birthday or not. (See below.) But we can calculate that
P(A3 | A2) = 1 − 2/365, since if A2 occurs then p1 and p2 have birthdays on different days,
and A3 will occur only if p3’s birthday is on neither of these days. So

P(A2 ∩ A3) = P(A2)P(A3 | A2) = (1 − 1/365)(1 − 2/365).

What is A2 ∩ A3 ? It is simply the event that all three people have birthdays on
different days.
Now this process extends. If Ai denotes the event ‘pi ’s birthday is not on the
same day as any of p1 , . . . , pi−1 ’, then

P(Ai | A1, . . . , Ai−1) = 1 − (i − 1)/365,

and so by Proposition 2.5,


P(A1 ∩ · · · ∩ Ai) = (1 − 1/365)(1 − 2/365) · · · (1 − (i − 1)/365).

Call this number qi ; it is the probability that all of the people p1 , . . . , pi have
their birthdays on different days.
The numbers qi decrease, since at each step we multiply by a factor less than 1.
So there will be some value of n such that

qn−1 > 0.5, qn ≤ 0.5,

that is, n is the smallest number of people for which the probability that they all
have different birthdays is less than 1/2, that is, the probability of at least one
coincidence is greater than 1/2.
By calculation, we find that q22 = 0.5243, q23 = 0.4927 (to 4 d.p.); so 23
people are enough for the probability of coincidence to be greater than 1/2.
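The numbers qi are easy to compute; this Python sketch reproduces the calculation (the function name is mine):

```python
# q(i): probability that i people all have different birthdays, i.e. the
# telescoping product (1 - 1/365)(1 - 2/365)...(1 - (i-1)/365).
def q(i, days=365):
    prob = 1.0
    for k in range(1, i):
        prob *= 1 - k / days
    return prob

# Smallest n with q(n) <= 0.5: the probability of a coincidence exceeds 1/2.
n = next(i for i in range(2, 400) if q(i) <= 0.5)
```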
Now return to a question we left open before. What is the probability of the
event A3 ? (This is the event that p3 has a different birthday from both p1 and p2 .)
If p1 and p2 have different birthdays, the probability is 1 − 2/365: this is the
calculation we already did. On the other hand, if p1 and p2 have the same birthday,
then the probability is 1 − 1/365. These two numbers are P(A3 | A2) and P(A3 | A′2)
respectively. So, by the Theorem of Total Probability,

P(A3) = P(A3 | A2)P(A2) + P(A3 | A′2)P(A′2)
= (1 − 2/365)(1 − 1/365) + (1 − 1/365)(1/365)
= 0.9945

to 4 d.p.

Problem How many people would you need to pick at random to ensure that
the chance of two of them being born in the same month is better than even?
Assuming all months equally likely, if Bi is the event that pi is born in a dif-
ferent month from any of p1 , . . . , pi−1 , then as before we find that

P(Bi | B1, . . . , Bi−1) = 1 − (i − 1)/12,

so
P(B1 ∩ · · · ∩ Bi) = (1 − 1/12)(1 − 2/12) · · · (1 − (i − 1)/12).
We calculate that this probability is

(11/12) × (10/12) × (9/12) = 0.5729



for i = 4 and

(11/12) × (10/12) × (9/12) × (8/12) = 0.3819

for i = 5. So, with five people, it is more likely that two will have the same birth
month.
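The same telescoping product with 12 months in place of 365 days settles the month version; a Python sketch:

```python
# Probability that i people are all born in different months,
# assuming all twelve months equally likely.
def q_month(i):
    prob = 1.0
    for k in range(1, i):
        prob *= 1 - k / 12
    return prob

# Smallest group size for which a shared birth month is more likely than not.
n_month = next(i for i in range(2, 14) if q_month(i) <= 0.5)
```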

A true story. Some years ago, in a probability class with only ten students, the
lecturer started discussing the Birthday Paradox. He said to the class, “I bet that
no two people in the room have the same birthday”. He should have been on safe
ground, since q11 = 0.859. (Remember that there are eleven people in the room!)
However, a student in the back said “I’ll take the bet”, and after a moment all the
other students realised that the lecturer would certainly lose his wager. Why?
(Answer in the next chapter.)

2.7 Worked examples


Question Each person has two genes for cystic fibrosis. Each gene is either N
or C. Each child receives one gene from each parent. If your genes are NN or NC
or CN then you are normal; if they are CC then you have cystic fibrosis.

(a) Neither of Sally’s parents has cystic fibrosis. Nor does she. However, Sally’s
sister Hannah does have cystic fibrosis. Find the probability that Sally has
at least one C gene (given that she does not have cystic fibrosis).

(b) In the general population the ratio of N genes to C genes is about 49 to 1.


You can assume that the two genes in a person are independent. Harry does
not have cystic fibrosis. Find the probability that he has at least one C gene
(given that he does not have cystic fibrosis).

(c) Harry and Sally plan to have a child. Find the probability that the child will
have cystic fibrosis (given that neither Harry nor Sally has it).

Solution During this solution, we will use a number of times the following prin-
ciple. Let A and B be events with A ⊆ B. Then A ∩ B = A, and so

P(A | B) = P(A ∩ B)/P(B) = P(A)/P(B).

(a) This is the same as the eye colour example discussed earlier. We are given
that Sally’s sister has genes CC, and one gene must come from each parent. But

neither parent is CC, so each parent is CN or NC. Now by the basic rules of
genetics, all four combinations of genes for a child of these parents, namely
CC, CN, NC, NN, have probability 1/4.
If S1 is the event ‘Sally has at least one C gene’, then S1 = {CN, NC, CC}; and
if S2 is the event ‘Sally does not have cystic fibrosis’, then S2 = {CN, NC, NN}.
Then

P(S1 | S2) = P(S1 ∩ S2)/P(S2) = (2/4)/(3/4) = 2/3.
(b) We know nothing specific about Harry, so we assume that his genes are
randomly and independently selected from the population. We are given that the
probability of a random gene being C or N is 1/50 and 49/50 respectively. Then
the probabilities of Harry having genes CC, CN, NC, NN are (1/50)²,
(1/50) · (49/50), (49/50) · (1/50), and (49/50)² respectively. So, if H1 is the
event ‘Harry has at least one C gene’, and H2 is the event ‘Harry does not have
cystic fibrosis’, then

P(H1 | H2) = P(H1 ∩ H2)/P(H2)
= ((49/2500) + (49/2500)) / ((49/2500) + (49/2500) + (2401/2500))
= 98/2499 = 2/51.

(c) Let X be the event that Harry’s and Sally’s child has cystic fibrosis. As in
(a), this can only occur if Harry and Sally both have CN or NC genes. That is,
X ⊆ S3 ∩ H3 , where S3 = S1 ∩ S2 and H3 = H1 ∩ H2 . Now if Harry and Sally are
both CN or NC, these genes pass independently to the baby, and so

P(X | S3 ∩ H3) = P(X)/P(S3 ∩ H3) = 1/4.

(Remember the principle that we started with!)


We are asked to find P(X | S2 ∩ H2), in other words (since X ⊆ S3 ∩ H3 ⊆
S2 ∩ H2),

P(X)/P(S2 ∩ H2).
Now Harry’s and Sally’s genes are independent, so

P(S3 ∩ H3 ) = P(S3 ) · P(H3 ),


P(S2 ∩ H2 ) = P(S2 ) · P(H2 ).

Thus,

P(X)/P(S2 ∩ H2) = (P(X)/P(S3 ∩ H3)) · (P(S3 ∩ H3)/P(S2 ∩ H2))
= (1/4) · (P(S1 ∩ S2)/P(S2)) · (P(H1 ∩ H2)/P(H2))
= (1/4) · P(S1 | S2) · P(H1 | H2)
= (1/4) · (2/3) · (2/51)
= 1/153.
I thank Eduardo Mendes for pointing out a mistake in my previous solution to
this problem.
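The whole worked example can be verified with exact fractions; a Python sketch (variable names follow the solution):

```python
from fractions import Fraction

# Gene frequencies: P(C) = 1/50, P(N) = 49/50, the two genes independent.
pC, pN = Fraction(1, 50), Fraction(49, 50)

# Sally (healthy child of two carrier parents): P(S1 | S2) = 2/3 from part (a).
p_S1_given_S2 = Fraction(2, 3)

# Harry (random member of the population), part (b).
p_H1_and_H2 = 2 * pC * pN               # Harry is CN or NC
p_H2 = p_H1_and_H2 + pN ** 2            # Harry does not have CF
p_H1_given_H2 = p_H1_and_H2 / p_H2      # should be 2/51

# Part (c): the child has CF only if both parents carry a C gene,
# and then with probability 1/4.
p_child_cf = Fraction(1, 4) * p_S1_given_S2 * p_H1_given_H2
```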

Question The Land of Nod lies in the monsoon zone, and has just two seasons,
Wet and Dry. The Wet season lasts for 1/3 of the year, and the Dry season for 2/3
of the year. During the Wet season, the probability that it is raining is 3/4; during
the Dry season, the probability that it is raining is 1/6.

(a) I visit the capital city, Oneirabad, on a random day of the year. What is the
probability that it is raining when I arrive?

(b) I visit Oneirabad on a random day, and it is raining when I arrive. Given this
information, what is the probability that my visit is during the Wet season?

(c) I visit Oneirabad on a random day, and it is raining when I arrive. Given this
information, what is the probability that it will be raining when I return to
Oneirabad in a year’s time?

(You may assume that in a year’s time the season will be the same as today but,
given the season, whether or not it is raining is independent of today’s weather.)

Solution (a) Let W be the event ‘it is the wet season’, D the event ‘it is the dry
season’, and R the event ‘it is raining when I arrive’. We are given that P(W ) =
1/3, P(D) = 2/3, P(R | W) = 3/4, P(R | D) = 1/6. By the Theorem of Total Probability,

P(R) = P(R | W)P(W) + P(R | D)P(D) = (3/4) · (1/3) + (1/6) · (2/3) = 13/36.

(b) By Bayes’ Theorem,

P(W | R) = P(R | W)P(W)/P(R) = ((3/4) · (1/3))/(13/36) = 9/13.

(c) Let R′ be the event ‘it is raining in a year’s time’. The information we are
given is that P(R ∩ R′ | W) = P(R | W)P(R′ | W) and similarly for D. Thus

P(R ∩ R′) = P(R ∩ R′ | W)P(W) + P(R ∩ R′ | D)P(D)
= (3/4)² · (1/3) + (1/6)² · (2/3) = 89/432,

and so

P(R′ | R) = P(R ∩ R′)/P(R) = (89/432)/(13/36) = 89/156.
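All three parts can be checked with exact fractions; a Python sketch (variable names mine):

```python
from fractions import Fraction

# Land of Nod: season probabilities and rain probabilities given the season.
pW, pD = Fraction(1, 3), Fraction(2, 3)
pR_given_W, pR_given_D = Fraction(3, 4), Fraction(1, 6)

# (a) Theorem of Total Probability.
pR = pR_given_W * pW + pR_given_D * pD

# (b) Bayes' Theorem.
pW_given_R = pR_given_W * pW / pR

# (c) Rain on both visits: given the season, the two days are independent.
pRR = pR_given_W ** 2 * pW + pR_given_D ** 2 * pD
pR2_given_R = pRR / pR
```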
