Solutions To Quizzes and The Test, Chapter 01 2
CHAPTER 1
FUNDAMENTALS
OF PROBABILITY
WITH STOCHASTIC PROCESSES
FOURTH EDITION
SAEED GHAHRAMANI
Western New England University
Springfield, Massachusetts, USA
Section 1.2
1. Suppose that jabk represents the order in which Jody is standing on the far left, Ann
next to Jody, Bill next to Ann, and Karl next to Bill. Define all other sample points
(possible outcomes) similarly. The event that, on the line, males and females alternate
is
E = {jbak, jkab, abjk, akjb, bjka, bakj, kjba, kabj}.
2. (a) E1E2 · · · E100.  (b) ∪_{i=1}^{100} E1E2 · · · E_{i−1} E_i^c E_{i+1} · · · E100.
3.
(E ∪ F )(F ∪ G)(EG ∪ F c ) = (F ∪ E)(F ∪ G)(EG ∪ F c )
= (F ∪ EG)(EG ∪ F c )
= (EG ∪ F )(EG ∪ F c )
= EG ∪ F F c = EG ∪ ∅ = EG.
4. The event that a signal fed to the input is transmitted to the output is
F = E1 E2 E4 E7 ∪ E1 E2 E5 E6 E7 ∪ E1 E3 E5 E6 E7 ∪ E1 E3 E4 E7 .
Section 1.4
2. Let A be the event that Zack reads the book assigned on the given period. Let B
be the event that he completes his philosophy paper on or before the deadline. The
probability that he ends up meeting the deadline only for one assignment is
P((A ∪ B) − AB) = P(A ∪ B) − P(AB) = 0.95 − 0.6 = 0.35.
Note that this equality holds since AB ⊆ A ∪ B.
3. Let F, E, and I be the events that a randomly selected American traveler to Europe travels to France, England, and Italy, respectively. We have
P(F ∪ E ∪ I) = P(F) + P(E) + P(I) − P(FE) − P(FI) − P(EI) + P(EFI).
Therefore,
2/3 = 1/3 + 1/2 + 1/4 − 1/5 − 1/5 − 1/5 + P(EFI).
This gives that P (EF I) = 11/60.
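The inclusion–exclusion bookkeeping above can be verified with exact rational arithmetic; a minimal sketch (the variable names are ours):

```python
from fractions import Fraction as F

# 2/3 = P(F) + P(E) + P(I) - P(FE) - P(FI) - P(EI) + P(EFI), solved for P(EFI)
p_union = F(2, 3)
p_F, p_E, p_I = F(1, 3), F(1, 2), F(1, 4)
p_FE = p_FI = p_EI = F(1, 5)
p_EFI = p_union - (p_F + p_E + p_I - p_FE - p_FI - p_EI)
print(p_EFI)  # 11/60
```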
Chapter 1 Solutions to Self-Quiz Problems 3
Section 1.7
1. No, it does not! In the experiment of choosing a random point from the interval
(0, 1), let A be the event that the outcome is either 1/3 or 1/2, and B be the event that
the outcome is either 1/5 or 1/3. Then AB = {1/3}, so A and B are not mutually exclusive. However, P(AB) = P({1/3}) = 0.
2. For the experiment of choosing a point at random from the interval [0, 1], let
E_n = (1/3 − 1/(n + 2), 1/3 + 2/(n + 2)), n ≥ 1.
Applying the Continuity of Probability Function to the E_n's, show that P(1/3 is selected) = 0.
Chapter 1 Solutions to Self-Test Problems 4
1. (a) By symmetry, the probability that Crispin ends up paying for dinner is 1/3.
The probability that no more than one round of flips is necessary is equal to the
probability that HHH and TTT do not occur in the first round, which is 6/8 = 3/4.
2. Let x1, x2, x3, x4, and x5 be the production levels, in hundreds, for baseballs, tennis balls, softballs, basketballs, and soccer balls, respectively. A sample space for the production levels of these balls by this manufacturer is
S = {(x1, x2, . . . , x5) : 8 ≤ xi ≤ 13, 1 ≤ i ≤ 5}.
The event that, on this day, the difference between the numbers of baseballs and soccer balls produced will not exceed 100 is
E = {(x1, x2, . . . , x5) ∈ S : |x1 − x5| ≤ 1}
 = {(x1, x2, . . . , x5) ∈ S : x5 − 1 ≤ x1 ≤ x5 + 1}.
3. (a) A sample space is
S = {A1A2A3, A1^c A2A3, A1A2^c A3, A1A2A3^c, A1^c A2^c A3, A1^c A2A3^c, A1A2^c A3^c, A1^c A2^c A3^c}.
(b) The event that the system is not operative at the random time is
E = {Ac1 A2 A3 , A1 Ac2 A3 , A1 A2 Ac3 , Ac1 Ac2 A3 , Ac1 A2 Ac3 , A1 Ac2 Ac3 , Ac1 Ac2 Ac3 }.
4. Let A be the event that a randomly selected person from the given age group in this geographic area suffers from Parkinson's disease. Let B be the event that he or she suffers from Alzheimer's disease. The desired probability is
P(A^c B^c) = 1 − P(A ∪ B) = 1 − [P(A) + P(B) − P(AB)]
 = 1 − (0.134 + 0.113 − 0.0226) = 0.7756.
This shows that 77.56% of the people in the given age group in the given geographic
area have neither of the two diseases.
" #
5. We have that E cF = E ∪ F − E and E ⊆ E ∪ F . So
" # " # * +
P E cF = P E ∪F −P (E) = 1−P (E c F c ) −P (E) = (1−0.35)−0.4 = 0.25.
" #c
Note that by DeMorgan’s first law, E ∪ F = E cF c .
6. Drawing a Venn diagram, it can be easily seen that the answer is 0.02. Here is a
formal solution: Let B , C, and E be the events that the student received an A in
biology, calculus, and English, respectively. We are interested in P (CB c E c ). Now
CB c E c = C − C(B ∪ E) and C(B ∪ E) ⊆ C. So by Theorem 1.5,
" #
P (CB c E c ) = P (C) − P C(B ∪ E) = P (C) − P (CB ∪ CE)
= P (C) − P (CB) − P (CE) + P (CBE)
= 0.18 − 0.10 − 0.13 + 0.07 = 0.02.
So the probability is 0.02 that a randomly selected freshman from this college, last semester, received an A in calculus, but not in biology and not in English.
7. Note that E1 ⊇ E2 ⊇ · · · ⊇ E_n ⊇ E_{n+1} ⊇ · · · and lim_{n→∞} E_n = ∩_{i=1}^∞ E_i = {1}. So by the Continuity of the Probability Function (Theorem 1.8), we have
P({1}) = P(lim_{n→∞} E_n) = lim_{n→∞} P(E_n) = lim_{n→∞} (2n + 1)/(3n) = 2/3.
8. For 0 ≤ i ≤ 5, let Ai be the event that exactly i of these customers buy an Android smartphone. We have that
Σ_{i=0}^{5} P(Ai) = Σ_{i=0}^{2} P(Ai) + Σ_{i=3}^{5} P(Ai) = 1.
Now Σ_{i=3}^{5} P(Ai) = 0.54, since it is the probability that at least three customers purchase an Android smartphone. So Σ_{i=0}^{2} P(Ai), the probability that at most two of the customers purchase such a phone, is 1 − 0.54 = 0.46.
9. Let L be the event that a randomly selected town's drinking water is contaminated with lead. Let A be the event that it is contaminated with asbestos fibers. P(L^c A^c) = 0.13 implies that
P(L ∪ A) = 1 − P(L^c A^c) = 1 − 0.13 = 0.87.
Thus,
P(L) + P(A) − P(LA) = 0.87,
or 0.32 + 0.43 − P(LA) = 0.87. This gives P(LA) = 0.12. The probability that, in a randomly selected town, the drinking water supply is contaminated with exactly one of these two impurities is
P(LA^c) + P(L^c A) = [P(L) − P(LA)] + [P(A) − P(LA)] = (0.32 − 0.12) + (0.43 − 0.12) = 0.51.
Thus in 51% of the towns of this country, the drinking water supplies are contami-
nated with exactly one of the two impurities.
10. Let E be the event that a randomly selected traveler visiting Paris on that certain week
took a trip to the Eiffel Tower. Let L and N be the events that he or she visited the
Louvre Museum and Notre Dame Cathedral, respectively. The desired probability is
"
P (E c Lc N c ) = 1 − P E ∪ L ∪ N )
* +
= 1 − P (E) + P (L) + P (N ) − P (EL) − P (EN ) − P (LN ) + P (ELN )
$1 1 1 1 1 1 % 7
=1− + + − − − +0 = .
3 2 3 4 4 4 12
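The complement form used here also checks out in exact fractions; a quick sketch (names are ours):

```python
from fractions import Fraction as F

p_E, p_L, p_N = F(1, 3), F(1, 2), F(1, 3)
p_EL = p_EN = p_LN = F(1, 4)
p_ELN = F(0)
# P(E^c L^c N^c) = 1 - P(E ∪ L ∪ N), by De Morgan and inclusion-exclusion
p_none = 1 - (p_E + p_L + p_N - p_EL - p_EN - p_LN + p_ELN)
print(p_none)  # 7/12
```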
SOLUTIONS TO SELF-QUIZZES AND SELF-TESTS
CHAPTER 2
Section 2.2
1. There are 365 × 365 × 365 possibilities for the birth dates of three randomly selected
people. Of all these possibilities only in 365 cases all three have the same birthday.
So the desired probability is
365/(365 × 365 × 365) = 1/133,225 ≈ 0.0000075.
That is, the odds are 1 in 133,225 that these three randomly selected professors have the same birthday.
2. For each member of the board, there are three possibilities: he or she will not attend, will attend alone, or will attend with his or her spouse. So the answer is 3^12 = 531,441.
3. Clearly, the birthdays of the first five people in line do not affect the probability we are interested in. The problem is equivalent to finding the probability that, of three randomly selected people, at least two have the same birthday. The solution is
1 − P(no two have the same birthday) = 1 − (365 × 364 × 363)/(365 × 365 × 365) ≈ 0.0082.
4. The answer is 1 − 27^8/28^8 ≈ 0.252.
Section 2.3
1. There are 10^4 choices for the last four digits of the faculty member's phone number. We are interested in the event that these four digits are distinct and drawn from {0, 1, 3, 4, 5, 6, 9}; so the desired probability is
7P4/10^4 = (7!/3!)/10^4 = 21/250 = 0.084.
4. (a) 7P4 = 7!/(7 − 4)! = 840.
(b) (4P2 · 5P2)/7P4 = 2/7 ≈ 0.286.
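Both counts can be confirmed with `math.perm`; a quick sketch:

```python
from math import perm

# Section 2.3, problem 1: four distinct digits from the 7-element set, out of 10^4 strings
p1 = perm(7, 4) / 10**4
# Problem 4(b): arrangements of 2 of each kind over the permutations of 4 out of 7
p4b = perm(4, 2) * perm(5, 2) / perm(7, 4)
print(p1, round(p4b, 3))  # 0.084 0.286
```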
Section 2.4
! "5
12
1. = 311, 620, 419, 551, 232.
5
2. The answer to (a) is 1/\binom{7}{4} = 1/35 ≈ 0.029.
The answer to (b) is \binom{3}{3}\binom{3}{1}/\binom{7}{4} = 3/35 ≈ 0.086.
3. Suppose that passengers occupied the seats randomly. Let O stand for an occupied chair and − for an empty chair. Since only in the two cases O−O−O−O− and −O−O−O−O could the four passengers take seats next to empty ones, the probability of what the sociologist observed is 2/\binom{8}{4} ≈ 0.029. Based on this very small probability, the sociologist can conclude that passengers avoid taking seats next to occupied ones when possible.
where the division by 8! is necessary since the order in which the individuals are divided into 8 groups is immaterial.
Chapter 2 Solutions to Self-Test Problems 4
! "
23
1. There are choices for the two students we want to share a birthday, and 365
2
possibilities for they shared birthday. For the remaining 21 students, since no two
364!
have the same birthday, the number pf possible birthdays is 364 P21 = =
(364 − 21)!
364!
. So the answer is
343!
! "
23 $23%
· 365 · 364 P21
2 · 365 · 364!
= 2 23 ≈ 0.363.
365 23 365 · 343!
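This count is easy to confirm with `math.comb` and `math.perm`; a sketch:

```python
from math import comb, perm

# one designated pair shares a birthday; the other 21 birthdays are distinct
# from each other and from the pair's
p = comb(23, 2) * 365 * perm(364, 21) / 365**23
print(round(p, 3))  # 0.363
```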
! "
6
· 23 · 1
3 160
2. The solution is = ≈ 0.073.
3 7 2187
! "! " ! "! "
52 26 4 48
3. In ways the cards can be dealt among the two teams. In of
26 26 ! "! 3 " 23
4 48
these possibilities, three aces are in the hands of one team and in of them
3 23
three aces are in the hands of the other team. So the desired probability is
! "! "
4 48
2·
3 23 416
! "! " = ≈ 0.499.
52 26 833
26 26
(b) In only 3 ways can the three visually impaired students end up in the same class. So the desired probability is
3 · [45!/(15! 15! 15!)] / [48!/(16! 16! 16!)] = (3 · 16 · 16 · 16)/(48 · 47 · 46) = 128/1081 ≈ 0.118.
! " ! "
n n−k
9. Clearly, the answer to (a) is . The answer to (b) is , and the answer
m m−k
'k ! "! "
k n−k
to (c) is , where ? must be determined. Note that i must be at
i m−i
i=?
least !, and we must have m − i ≤ n − k or, equivalently, i ≥ m + k − n. So
i ≥ max(!, m + k − n), and hence the answer is
k
' ! "! "
k n−k
.
i m−i
i = max(!, m + k − n)
Note that ! < k is given and m+k−n = k+(m−n) < k. So max(!, m+k−n) < k.
10. Let A1 , A2 , A3, and A4 be the events that there is no professor, no associate professor,
no assistant professor, and no instructor in the committee, respectively. The desired
probability is
CHAPTER 3
Section 3.1
1. Given that the first 4 cards the player is dealt are hearts, reducing the sample space,
we have that the answer is
\binom{39}{9}/\binom{48}{9} ≈ 0.126.
2. Let A be the event that the outcome is 1. Let O be the event that it is an odd number.
The desired probability is
P(A | O) = P(AO)/P(O) = P(A)/P(O) = 0.27/(0.27 + 0.17 + 0.05) ≈ 0.55.
3. Reduce the sample space: Marlon chooses two movies at random from six dramas and seven comedies. What is the probability that they are both comedies? The answer is
\binom{7}{2}/\binom{13}{2} = 21/78 ≈ 0.269.
4. To find P(F | E) = P(FE)/P(E), first we need to calculate P(E). Now P(E | F) = P(EF)/P(F) gives 0.46 = 0.23/P(F). So P(F) = 0.23/0.46 = 0.5. On the other hand, from
P(E ∪ F) = P(E) + P(F) − P(EF),
we have that
0.67 = P(E) + 0.5 − 0.23,
which gives P(E) = 0.4. Therefore,
P(F | E) = P(FE)/P(E) = 0.23/0.4 = 0.575.
Section 3.2
1. Let R1B2 R3 be the event that the first ball drawn is red, the second one blue, and the
third one red again. Define B1 R2 B3 similarly. The desired probability is
P(R1B2R3 ∪ B1R2B3) = P(R1B2R3) + P(B1R2B3)
 = P(R1)P(B2 | R1)P(R3 | R1B2) + P(B1)P(R2 | B1)P(B3 | B1R2)
 = (4/10)(6/9)(3/8) + (6/10)(4/9)(5/8) = 4/15 ≈ 0.27.
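With the urn contents implied by the factors above (4 red and 6 blue balls, drawn without replacement — our reading), the value checks out exactly:

```python
from fractions import Fraction as F

p_RBR = F(4, 10) * F(6, 9) * F(3, 8)   # red, blue, red
p_BRB = F(6, 10) * F(4, 9) * F(5, 8)   # blue, red, blue
p = p_RBR + p_BRB
print(p)  # 4/15
```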
Chapter 3 Solutions to Self-Quiz Problems 3
2. For 1 ≤ i ≤ 4, let Ni be the event that, on the day selected, the ith item produced by
the manufacturer is non-defective. The desired probability is
Section 3.3
1. No, she does not. Let A be the event that Nicole draws a red ball. Let R be the event
that Natalie draws a red ball, and let B be the event that she draws a blue ball. Then
2. For i = 1, 2, 3, let Ai be the event that the ith die is selected. Let B be the event that
the outcome is 6. We have
3. Let M be the event that a randomly selected student of the college is male, F be the
event that the student is female, and A be the event that he or she participates in a
study abroad program. We are interested in P (A | F ). By the law of total probability,
P (A) = P (A | M )P (M ) + P (A | F )P (F ).
So
0.70 = (0.55)(0.60) + P (A | F )(0.40).
Solving this equation for P (A | F ), we obtain P (A | F ) = 0.925. Therefore 92.5%
of female students participate in a study abroad program.
Section 3.4
1. Let A be the event that the student scored 1200 or higher. Let B be the event that he
or she attended the prep course. The desired probability is
P(B | A) = P(A | B)P(B) / [P(A | B)P(B) + P(A | B^c)P(B^c)]
 = (0.35)(17/60) / [(0.35)(17/60) + (0.2)(43/60)] ≈ 0.41.
2. Let E be the event that Eileen’s coin is gold. Let B be the event that Bernice’s coin
is gold. The desired probability is
P(E | B) = P(B | E)P(E) / [P(B | E)P(E) + P(B | E^c)P(E^c)]
 = [(2/9)(3/10)] / [(2/9)(3/10) + (3/9)(7/10)] = 6/27 = 2/9 ≈ 0.22.
3. Let F be the event that the customer filled his tank. Let A 1 , A2 , and A3 be the events
that he used 87 octane, 91 octane, and 93 octane gasoline, respectively. The desired
probability is
P(A3 | F) = P(F | A3)P(A3) / [P(F | A1)P(A1) + P(F | A2)P(A2) + P(F | A3)P(A3)]
 = (0.95)(0.10) / [(0.70)(0.85) + (0.85)(0.05) + (0.95)(0.10)] ≈ 0.13.
4. Let H be the event that Harris got a passing grade on his last exam. Let T be the
event that his exam was graded by a TA. The desired probability is
P(T | H) = P(H | T)P(T) / [P(H | T)P(T) + P(H | T^c)P(T^c)]
 = (0.86)(0.75) / [(0.86)(0.75) + (0.78)(0.25)] ≈ 0.77.
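The last two solutions follow one Bayes pattern, which can be sketched generically (the helper `bayes` is ours, not the book's):

```python
def bayes(priors, likelihoods):
    """Posterior probabilities P(H_i | data) from priors and P(data | H_i)."""
    joint = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joint)
    return [j / total for j in joint]

# gas-station problem: P(A1), P(A2), P(A3) and the fill-up likelihoods P(F | Ai)
post_gas = bayes([0.85, 0.05, 0.10], [0.70, 0.85, 0.95])
# TA-grading problem: P(T), P(T^c) and the passing likelihoods
post_ta = bayes([0.75, 0.25], [0.86, 0.78])
print(round(post_gas[2], 2), round(post_ta[0], 2))  # 0.13 0.77
```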
Section 3.5
1. For 1 ≤ i ≤ 5, let Ri be the event that during the next 5 trading days, on the ith
trading day, the DJIA rises. Let F i be the event that on that day it falls. The desired
probability is
P(R1)P(F2)P(R3)P(F4)P(R5) + P(F1)P(R2)P(F3)P(R4)P(F5)
 = (0.52)^3 (0.48)^2 + (0.48)^3 (0.52)^2 ≈ 0.062.
2. We need to find the smallest n for which 1 − (0.4)^n ≥ 0.95 or, equivalently, (0.4)^n ≤ 0.05. This gives n ≥ ln(0.05)/ln(0.4) ≈ 3.27. Therefore, the minimum number of such missiles to be fired to have a probability of at least 0.95 of hitting the target is 4.
3. Let A be the event that the first inspector finds the defect and B be the event that the
second inspector finds the defect. The desired probability is
P (AB c ∪ Ac B) = P (AB c ) + P (Ac B) = P (A)P (B c ) + P (Ac )P (B)
= (0.93)(0.07) + (0.07)(0.93) ≈ 0.13.
4. Let O, E, and R be the events that the patient needed orthodontic, extraction, and
root canal services, respectively. The desired probability is
P(O ∪ E ∪ R) = 1 − P(O^c E^c R^c) = 1 − P(O^c)P(E^c)P(R^c)
 = 1 − (1 − 0.18)(1 − 0.12)(1 − 0.10) ≈ 0.35.
Chapter 3 Solutions to Self-Test Problems 6
1. For i ≥ 1, let Ai be the event that all border issues between Chernarus and Carpathia
will be resolved in the ith summit. The event that more than 4 summits are nec-
essary to successfully resolve all the border disputes between the two countries is
Ac1 Ac2 Ac3 Ac4 . So the desired probability is
1 − P(A1^c A2^c A3^c A4^c) = 1 − P(A1^c)P(A2^c | A1^c)P(A3^c | A1^c A2^c)P(A4^c | A1^c A2^c A3^c)
 = 1 − (3/4)(1/2)(1/3)(1/4) = 1 − 1/32 = 31/32 ≈ 0.97.
2. Let E1, E2 , and E3 be the events that A 1 , A2 , and A3 are operative, respectively. Let
F1 and F2 be the events that B 1 and B2 are operative, respectively. By independence,
the system functions with probability
P(F1F2)P(E1 ∪ E2 ∪ E3) = P(F1)P(F2)[1 − P(E1^c E2^c E3^c)]
 = (0.85)^2 [1 − (1 − 0.85)^3] ≈ 0.72.
3. For 1 ≤ i ≤ 3, let Ai be the event that the ith person in line does not choose Hunter’s
favorite seat. The desired probability is
4. P(B | A) = P(A | B)P(B) / [P(A | B)P(B) + P(A | B^c)P(B^c)]
 = [(1/6)^3 · (1/2)] / [(1/6)^3 · (1/2) + (2/7)^3 · (1/2)] ≈ 0.166.
5. For i = 0, 1, 2, let Ei be the event that the organism gives birth to i new individuals.
Let F be the event that the second generation offspring of the living organism consists
of at least one individual. Note that
7. For 1 ≤ i ≤ 4, let Ai be the event that the ith circle scratched has the dollar sign
underneath. Let L be the event that Harlan loses. The desired probability is
8. Let A be the event that Keith has mesothelioma and B be the event that the test result
comes back positive. We are given that P (A) = 0.0033, P (B | A) = 0.92, and
P (B c | Ac ) = 0.92. Therefore, P (B | Ac ) = 1 − 0.92 = 0.08. We are interested in
P (A | B). By Bayes’ formula,
P(A | B) = P(B | A)P(A) / [P(B | A)P(A) + P(B | A^c)P(A^c)]
 = (0.92)(0.0033) / [(0.92)(0.0033) + (0.08)(1 − 0.0033)] ≈ 0.0367.
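This is the classic base-rate computation (a rare disease swamps even an accurate test); a quick numeric sketch:

```python
prior = 0.0033          # P(A): prevalence of the disease
sens = 0.92             # P(B | A): test sensitivity
spec = 0.92             # P(B^c | A^c): test specificity

p_pos = sens * prior + (1 - spec) * (1 - prior)   # P(B), total probability
posterior = sens * prior / p_pos                  # P(A | B), Bayes' formula
print(round(posterior, 4))  # 0.0367
```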
9. (a) For 1 ≤ i ≤ n, let Ai be the event that the ith person responds positively to Vincent's request; let Bi be the event that the ith person is a close bone marrow match. We are given that P(Ai) = p1 and P(Bi | Ai) = p2, 1 ≤ i ≤ n. Thus
Note that Ai^c ∪ Bi^c is the event that the ith person either responds negatively to Vincent or is not a close bone marrow match. So 1 − p1p2 is the probability that the ith
person will not be a donor. Now whether or not a person responds positively to
Vincent’s request and whether or not he or she is a close bone marrow match are
independent of other people’s responses and their chances to be a close bone marrow
match. So the probability is (1 − p 1 p2 )n that none of the n people will end up being
a donor. This implies that the probability is 1 − (1 − p 1 p2 )n that at least one donor is
found among this group of n people.
(b) The answer is 1 − [1 − (1/10,000) · 0.70]^1000 ≈ 0.0676. If he asks 100,000
people, the probability increases to almost one. In reality, patients in need of bone
marrow transplants should search in a bone marrow registry of several million genet-
ically tested potential donors, who have already agreed to donate if asked. The actual
probability that two unrelated persons have a close bone marrow match can be less
than 1 in 10,000. For example, for some white Americans, it is 1 in a million and for
some African-Americans it is less than 1 in 100,000.
10. Clearly,
P(H) = Σ_{i=1}^∞ P(Ai) = Σ_{i=1}^∞ (5/6)^{2i−2}(1/6) = 1/6 + Σ_{i=2}^∞ (5/6)^{2i−2}(1/6)
 = 1/6 + (1/6)(5/6)^{−2} Σ_{i=2}^∞ (5/6)^{2i} = 1/6 + (1/6)(5/6)^{−2} Σ_{i=2}^∞ (25/36)^i
 = 1/6 + (1/6)(5/6)^{−2} · (25/36)^2/(1 − 25/36) = 6/11.
Note that if the game does not start with Hillary rolling the die first, and the first
person to roll the die is selected at random, then, by symmetry, the answer would be
1/2.
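The whole series is a single geometric sum with ratio (5/6)^2, which is quick to confirm exactly (a sketch):

```python
from fractions import Fraction as F

# Hillary rolls on turns 1, 3, 5, ...; she wins on turn 2i-1 with
# probability (5/6)^(2i-2) * (1/6): a geometric series with ratio (5/6)^2.
r = F(5, 6) ** 2
p_H = F(1, 6) / (1 - r)
print(p_H)  # 6/11
```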
CHAPTER 4
Section 4.2
1. We must have Σ_{n=1}^∞ k/n = 1. However, Σ_{n=1}^∞ 1/n, the harmonic series, is divergent. So for no value of k can Σ_{n=1}^∞ k/n be 1 (or any other finite number).
3. The set of possible values of X is E = {2, 3, . . ., 50}. Let p be the probability mass
function of X. We have
p(i) = P(X = i) = [365 × 364 × · · · × (365 − (i − 2))] × (i − 1) / 365^i, i ∈ E.
Section 4.3
1. (a) Let X be the number of claims that will be filled next week. The desired prob-
ability is
P(X ≥ 1) = 1 − P(X = 0) = 1 − p(0) = 1 − 23/63 = 40/63 ≈ 0.63.
(b) Given that there were no more than 5 claims filed two weeks ago, the probabil-
ity that there were at least 4 claims filed is
Section 4.4
3. Let X be the amount of money a random person with one ticket will win. Clearly,
E(X) = 5 · (300/15,000) + 50 · (50/15,000) + 1000 · (1/15,000) ≈ 0.33.
Therefore, the fair price for each ticket is approximately 33 cents.
x 2 3 5 7
p(x) 0.2 0.1 0.5 0.2
Section 4.5
1. We have
E(X) = 2 · (1/36) + 3 · (2/36) + 4 · (3/36) + 5 · (4/36) + 6 · (5/36) + 7 · (6/36) + 8 · (5/36) + 9 · (4/36) + 10 · (3/36) + 11 · (2/36) + 12 · (1/36) = 7,
E(X^2) = 2^2 · (1/36) + 3^2 · (2/36) + 4^2 · (3/36) + 5^2 · (4/36) + 6^2 · (5/36) + 7^2 · (6/36) + 8^2 · (5/36) + 9^2 · (4/36) + 10^2 · (3/36) + 11^2 · (2/36) + 12^2 · (1/36) = 1974/36,
Var(X) = E(X^2) − [E(X)]^2 = 1974/36 − 49 = 35/6 ≈ 5.83, and σ_X = √(35/6) ≈ 2.42.
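Enumerating the 36 equally likely outcomes confirms both moments (a sketch):

```python
from fractions import Fraction as F

sums = [i + j for i in range(1, 7) for j in range(1, 7)]
EX = sum(F(s, 36) for s in sums)
EX2 = sum(F(s * s, 36) for s in sums)
var = EX2 - EX**2
print(EX, var)  # 7 35/6
```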
2. E[(X − 3)(4 − X)] = −15 implies that E(−X^2 + 7X − 12) = −15, or −E(X^2) + 7E(X) − 12 = −15. Substituting 3 for E(X), this gives E(X^2) = 24. So Var(X) = E(X^2) − [E(X)]^2 = 24 − 9 = 15. Therefore, Var(−3X + 8) = 9 Var(X) = 9 · 15 = 135.
Chapter 4 Solutions to Self-Test Problems 4
1. Note that in a year, leap or non-leap, each of April, June, September, and November
has 30 days, and each of January, March, May, July, August, October, and December
has 31 days. In leap years, February has 29 days, whereas in non-leap years, it has
28 days. So
X = 29 with probability 1/12; 30 with probability 4/12; 31 with probability 7/12.
Therefore,
E(X) = 29 · (1/12) + 30 · (4/12) + 31 · (7/12) = 366/12 = 61/2 = 30.5,
E(X^2) = 29^2 · (1/12) + 30^2 · (4/12) + 31^2 · (7/12) = 2792/3,
Var(X) = E(X^2) − [E(X)]^2 = 2792/3 − (61/2)^2 = 5/12 ≈ 0.417, and σ_X = √(5/12) ≈ 0.645.
If the month is selected from a non-leap year, then
E(X) = 28 · (1/12) + 30 · (4/12) + 31 · (7/12) = 365/12 ≈ 30.417.
2. Let X be the number of women who will retire next year. X is a discrete random
variable with the set of possible values
4. Clearly,
P(X = 0) = F(0) = 4/33;
P(X = 1) = F(1) − F(0) = 16/33 − 4/33 = 12/33;
P(X = 2) = F(2) − F(1) = 26/33 − 16/33 = 10/33;
P(X = 3) = F(3) − F(2) = 30/33 − 26/33 = 4/33.
To find P(X = 4), note that
P(X ≤ 3) + P(X = 4) + P(X > 4) = 1.
Thus
F(3) + P(X = 4) + P(X > 4) = 1.
Since P(X > 4) = P(X = 4), this gives
30/33 + 2 · P(X = 4) = 1.
Therefore, P(X = 4) = 3/66 = 1/22.
5. The answer is
0 · (11/66) + a · (10/66) + 2a · (9/66) + 3a · (8/66) + (7a/2) · (7/66) + 4a · (6/66) + (9a/2) · (5/66) + 5a · (4/66) + (11a/2) · (3/66) + 6a · (2/66) + (13a/2) · (1/66) = (356/132)a = (89/33)a ≈ 2.697a.
6. For 1 ≤ t < 2,
F(t) = P(X ≤ t) = P(X = 1) = 1/n.
For 2 ≤ t < 3,
F (t) = P (X ≤ t) = P (X = 1) + P (X = 2) = 2/n.
For 3 ≤ t < 4,
F (t) = P (X ≤ t) = P (X = 1) + P (X = 2) + P (X = 3) = 3/n,
7. Noting that 1 + 2 + · · · + n = n(n + 1)/2, and that the number of seats in row i is 10 + 2(i − 1), we have that the total number of seats in the auditorium is
Σ_{i=1}^{25} [10 + 2(i − 1)] = 250 + 2(1 + 2 + · · · + 24) = 250 + 2 · (24 × 25)/2 = 850.
Clearly, the set of possible values of X is {1, 2, . . . , 25}. Let p be the probability mass function of X. Then
p(i) = P(X = i) = [10 + 2(i − 1)]/850, i = 1, 2, . . . , 25.
8. Note that
P(X = 0) + P(X = 1) + P(X = 2) = 1 − P(X > 2) = 1 − (3/13) = 10/13,
P(X = 0) + P(X = 1) = P(X ≤ 1) = 7/13,
P(X = 0) + P(X = 1) + P(X = 2) + P(X = 3) + P(X > 3) = 1.
9. Let X be the number of polo shirts sold by the department store, in a random year, between April 1st and the end of September. The probability mass function of X is
p(i) = 1/51 for i = 100, 101, . . . , 129, and p(130) = 21/51.
Therefore,
E(X) = Σ_{i=100}^{129} i · (1/51) + 130 · (21/51)
 = (1/51)(100 + 101 + 102 + · · · + 129) + 2730/51
 = (1/51)(3000 + 1 + 2 + · · · + 29) + 2730/51
 = (1/51)[3000 + 29(29 + 1)/2] + 2730/51 = 2055/17.
The net profit is Y = 15X − 10(130 − X) = 25X − 1300. We are interested in
E(Y) = 25E(X) − 1300 = 25 · (2055/17) − 1300 ≈ 1722.06.
Therefore, the expected net profit of the department store from selling polo shirts is
$1722.06.
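Assuming the probability mass function used in the expectation above (p(i) = 1/51 for 100 ≤ i ≤ 129 and p(130) = 21/51), the profit figure can be checked exactly:

```python
from fractions import Fraction as F

pmf = {i: F(1, 51) for i in range(100, 130)}
pmf[130] = F(21, 51)
assert sum(pmf.values()) == 1   # sanity check: it is a pmf

EX = sum(i * p for i, p in pmf.items())
EY = 25 * EX - 1300             # net profit Y = 25X - 1300
print(EX, round(float(EY), 2))  # 2055/17 1722.06
```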
CHAPTER 5
Section 5.1
! "# $ # $ ! "# $ # $
6 1 6 3 0 6 1 5 3
1. 1 − − ≈ 0.995.
6 4 4 5 4 4
2. Let X be the number of side effects the patient experiences. The desired probability
is
P(X = 0) + P(X = 1) + P(X = 2)
 = (1 − 0.05)^7 + \binom{7}{1}(0.05)(1 − 0.05)^6 + \binom{7}{2}(0.05)^2(1 − 0.05)^5 ≈ 0.996.
Section 5.2
1. Let X be the number of cars with exhaust leaks among the next 5,000 cars man-
ufactured by the company. The random variable X is approximately Poisson with
parameter λ = (0.003)(5, 000) = 15. The desired probability is
P(12 ≤ X ≤ 15) = Σ_{i=12}^{15} e^{−15} (15)^i / i! ≈ 0.383.
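A direct Poisson sum reproduces the value (a sketch):

```python
from math import exp, factorial

def poisson_pmf(lam, i):
    return exp(-lam) * lam**i / factorial(i)

p = sum(poisson_pmf(15, i) for i in range(12, 16))  # P(12 <= X <= 15)
print(round(p, 3))  # 0.383
```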
2. Choosing one hour as the time unit, we have that λ = 3. Therefore, the desired
probability is
P(N(1) = 2, N(4) = 12) = P(N(1) = 2, N(4) − N(1) = 10)
 = P(N(1) = 2) P(N(4) − N(1) = 10)
 = P(N(1) = 2) P(N(3) = 10) = [e^{−3} 3^2/2!][e^{−9} 9^10/10!] ≈ 0.027.
Section 5.3
1. Let X be the number of experiments until the first success; X is a geometric random
variable with parameter p, and Z = min(X, m). The set of possible values for Z is
{1, 2, . . ., m}. Clearly,
P(Z = i) = (1 − p)^{i−1} p for 1 ≤ i < m, and P(Z = m) = (1 − p)^{m−1},
where i = m if and only if the first m − 1 trials are all failures.
Chapter 5 Solutions to Self-Quiz Problems 3
2. Call a month “success” if, in that month, at least one car is returned for warranty
work. We are interested in the fifth “success” occurring on the 8th month. Let X be
the number of months until the 5th “success”; X is negative binomial with parameters
5 and 0.45. The probability we are interested in is
! "
8−1
P (X = 8) = (0.45)5(1 − 0.45)3 ≈ 0.107.
5−1
Chapter 5 Solutions to Self-Test Problems 4
1. The set of possible values of the random variable X is {0, 1, . . ., 5}, and
! "! "
11 39
i 5−i
p(i) = P (X = i) = ! " i = 0, 1, . . ., 5.
50
5
The set of possible values of the random variable Y is {1, 2, . . ., 50}. By symmetry,
the third ball can have any of these possible values with equal probability. So
P(Y = i) = 1/50, i = 1, 2, . . . , 50.
2. Let X be the number of defective items in the sample. The random variable X is
binomial with parameters n = 200 and p = 0.006. The desired probability is
P(X ≤ 3) = Σ_{i=0}^{3} \binom{200}{i} (0.006)^i (1 − 0.006)^{200−i} ≈ 0.967.
4. Let X be the number of people who visit Ginny's website tomorrow. Clearly, X is binomial with parameters n = 10^7 and p = 5 × 10^{−8}. Since n is large, p is small, and np is appreciable, X is approximately Poisson with parameter λ = np = 0.5.
= −1 + 1/(1 − (1/4)) = 1/3.
7. Let X be the number of individuals who perform at or above the company’s threshold
on a given year. Let n be the smallest integer for which P (X > n) < 0.01, or,
equivalently, P (X ≤ n) ≥ 0.99. The random variable X is binomial with probability
of success 0.08. We need to find the smallest n for which
%n ! "
27
P (X ≤ n) = (0.08)i(0.92)27−i ≥ 0.99.
i
i=0
The following table shows that the smallest n is 6. Therefore, the company should
fund 6 × 7, 000 = $42, 000 to make sure that the probability is less than 1% that the
amount is inadequate.
n 0 1 2 3 4 5 6
P (X ≤ n) 0.105 0.352 0.632 0.834 0.940 0.982 0.996
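The table is straightforward to reproduce with a binomial CDF (a sketch):

```python
from math import comb

def binom_cdf(n, p, k):
    # P(X <= k) for X ~ Binomial(n, p)
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

row = [round(binom_cdf(27, 0.08, k), 3) for k in range(7)]
print(row)  # [0.105, 0.352, 0.632, 0.834, 0.94, 0.982, 0.996]
```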
8. Let A be the event that the tornado-force winds will not damage the tornado-proof room during the next 70 years. Since the events {X = i}, i = 0, 1, 2, . . . , are mutually exclusive and ∪_{i=0}^∞ {X = i} is the sample space, by the law of total probability (Theorem 3.4),
P(A) = Σ_{i=0}^∞ P(A | X = i)P(X = i) = Σ_{i=0}^∞ (1 − 0.063)^i · e^{−21} · 21^i/i!
 = e^{−21} Σ_{i=0}^∞ (19.677)^i/i! = e^{−21} · e^{19.677} = e^{−1.323} ≈ 0.266.
9. Let A be the event that the die selected randomly is the loaded one. Let B be the event
that, when tossed 10 times, it lands on 6 exactly four times. The desired probability
is
P(A | B) = P(B | A)P(A) / [P(B | A)P(A) + P(B | A^c)P(A^c)]
 = [\binom{10}{4}(4/9)^4(5/9)^6 · (1/2)] / [\binom{10}{4}(4/9)^4(5/9)^6 · (1/2) + \binom{10}{4}(1/6)^4(5/6)^6 · (1/2)] ≈ 0.82.
50,000 Σ_{n=1}^∞ (n − 1) · e^{−4/3}(4/3)^n/n!
 = 50,000 [Σ_{n=1}^∞ n · e^{−4/3}(4/3)^n/n! − Σ_{n=1}^∞ e^{−4/3}(4/3)^n/n!]
 = 50,000 [E(N(1)) − Σ_{n=0}^∞ e^{−4/3}(4/3)^n/n! + e^{−4/3}]
 = 50,000 (4/3 − 1 + e^{−4/3}) ≈ 29,846.52,
where Σ_{n=0}^∞ e^{−4/3}(4/3)^n/n! = 1, since it is the sum of the probability mass function of a Poisson random variable with parameter 4/3 over all its possible values.
CHAPTER 6
Section 6.1
Section 6.2
1. Let G be the distribution function of Y , and g be its density function. Since tan x is
an increasing function in the interval (−π/2, π/2), for −∞ < t < ∞, we have
" 1 " 1
" "
Since y ∈ (0, ∞), " √ " = √ and we have
2 y 2 y
1
e−y/2 if y > 0
fY (y) = 2
0 otherwise.
Section 6.3
(ii) The probability that in a random week the profit of this store is less than $500
is
P(X < 1/2) = ∫_{−∞}^{1/2} f(x) dx = ∫_0^{1/2} (3/14)(x + x^2) dx = 1/28.
Since the profits from one week to another are independent, the probability that in 8 consecutive weeks the store makes less than $500 a week is (1/28)^8 ≈ 2.65 × 10^{−12}.
2. (i) Clearly,
E(X) = ∫_0^2 x · (1/4)(2 − x)^3 dx = [−(1/20)x^5 + (3/8)x^4 − x^3 + x^2]_0^2 = 2/5 = 0.4,
E(X^2) = ∫_0^2 x^2 · (1/4)(2 − x)^3 dx = [−(1/24)x^6 + (3/10)x^5 − (3/4)x^4 + (2/3)x^3]_0^2 = 4/15 ≈ 0.267.
Therefore,
Var(X) = E(X^2) − [E(X)]^2 = 4/15 − 4/25 = 8/75 ≈ 0.107,
and σ_X = √(8/75) ≈ 0.327. These imply that the expected value of X, the mean of the claims made for partial losses, is $400,000, and its standard deviation is approximately $327,000.
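A midpoint-rule integration gives a quick numerical check of these moments (a sketch; the rule and step count are ours):

```python
# f(x) = (1/4)(2 - x)^3 on [0, 2]
N = 200_000
h = 2 / N
m0 = m1 = m2 = 0.0
for i in range(N):
    x = h * (i + 0.5)
    fx = 0.25 * (2 - x) ** 3
    m0 += fx * h          # total mass, should be ~1
    m1 += x * fx * h      # E(X)
    m2 += x * x * fx * h  # E(X^2)
var = m2 - m1 * m1
print(round(m1, 3), round(var, 3))  # 0.4 0.107
```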
Chapter 6 Solutions to Self-Quiz Problems 4
and
g(t) = G′(t) = (ln 2) 2^t F′(2^t) = (ln 2) 2^t f(2^t).
Recall from calculus that the derivative of the function 2^t is (ln 2) 2^t.
2. The probability that Kara hits the bull's eye in one throw of a dart is
∫_0^3 (3/500) x(10 − x) dx = (3/500)[5x^2 − (1/3)x^3]_0^3 = (3/500) · 36 = 0.216.
Hence the probability that she misses the bull's eye in 7 of her next 10 throws is approximately
\binom{10}{3} (0.216)^3 (1 − 0.216)^7 ≈ 0.22.
3. Let X be the cost to repair a vehicle after the type of accident that is under consider-
ation. Let α = d/1000. We are given that P (X < 1 + α) = 0.0724. So
∫_0^{1+α} (x^2/9) dx = 0.0724.
Therefore,
[(1/27) x^3]_0^{1+α} = 0.0724,
or (1 + α)^3/27 = 0.0724. This implies that (1 + α)^3 = 1.9548, or 1 + α = ∛1.9548 = 1.25. So α = 0.25 and hence the amount of the deductible, d, is $250.
(ii) The distribution function of the spring of a randomly selected pendulum clock with the probability density function f is
F(t) = 0 for t < 8;  F(t) = ∫_8^t 14/(x − 1)^2 dx = 2 − 14/(t − 1) for 8 ≤ t < 15;  F(t) = 1 for t ≥ 15.
By trial and error, we have that the left side is 0.0504 for α = 5.95 and 0.0494 for
α = 5.951. So the hardware store must carry at least 595.1 gallons of propane gas so
that its stock out probability for the gas is at most 0.05.
(b) Let X be the demand, in 100’s of gallons of propane gas, in the given week. We
have that
E(X) = (1/875) ∫_1^6 x(6 − x^2)^2 dx = 31/6 ≈ 5.17.
So the expected value of the store's remaining inventory of propane gas is approximately 590 − 517 = 73 gallons.
6. Since Var(X) = E(X^2) − [E(X)]^2, first we calculate E(X). We have that
E(X) = ∫_{−π/2}^{π/2} (1/2) x cos x dx = 0,
since g(x) = (1/2) x cos x is an odd function; that is, g(−x) = (1/2)(−x) cos(−x) = −(1/2) x cos x = −g(x). Now we calculate E(X^2). Noting that h(x) = (1/2) x^2 cos x is an even function; that is, h(−x) = (1/2)(−x)^2 cos(−x) = (1/2) x^2 cos x = h(x), we have
E(X^2) = ∫_{−π/2}^{π/2} (1/2) x^2 cos x dx = ∫_0^{π/2} x^2 cos x dx
 = [x^2 sin x + 2x cos x − 2 sin x]_0^{π/2} = π^2/4 − 2.
Therefore, Var(X) = (π^2/4 − 2) − 0^2 ≈ 0.467.
Chapter 6 Solutions to Self-Test Problems 7
7. We have that
f(x) = F′(x) = 1/(π√(1 − x^2)) for −1 < x < 1, and f(x) = 0 otherwise.
Therefore,
E(X) = ∫_{−1}^{1} x/(π√(1 − x^2)) dx = 0,
since the integrand is an odd function.
is the density of e^X.
= (2/3)(e^2 ln e^2 − e^2) − (2/3)(e ln e − e) = (2/3)e^2,
where ∫ ln y dy is calculated by integration by parts (let u = ln y, dv = dy).
CHAPTER 7
Section 7.1
1. Let X be a random number from (0, ℓ). The probability of the desired event is
P(min(X, ℓ − X) ≥ ℓ/3) = P(X ≥ ℓ/3, ℓ − X ≥ ℓ/3)
 = P(ℓ/3 ≤ X ≤ 2ℓ/3) = (2ℓ/3 − ℓ/3)/ℓ = 1/3.
2. The probability density function of X is
f(x) = 1/2 if −1 < x < 1, and f(x) = 0 otherwise.
Now we find G, the distribution function of |X|. We have
G(t) = 0 for t < 0; G(t) = P(|X| ≤ t) = P(−t ≤ X ≤ t) = (t − (−t))/(1 − (−1)) = t for 0 ≤ t < 1; and G(t) = 1 for t ≥ 1.
This is the distribution function of a uniform random variable over the interval (0, 1). Hence |X| is a random number between 0 and 1 and, therefore,
E(|X|) = (1 + 0)/2 = 1/2 and Var(|X|) = (1 − 0)^2/12 = 1/12.
Section 7.2
2. Let X be the number of light bulbs of type I. We want to calculate P(18 ≤ X ≤ 22). Since the number of light bulbs is large and half of the light bulbs are type I, we can assume that X is approximately binomial with parameters 40 and 1/2. Note that np = 20 and √(np(1 − p)) = √10. Using the De Moivre–Laplace theorem and making a correction for continuity, we have
P(17.5 ≤ X ≤ 22.5) = P((17.5 − 20)/√10 ≤ (X − 20)/√10 ≤ (22.5 − 20)/√10)
= Φ(0.79) − Φ(−0.79) = 2Φ(0.79) − 1 = 0.5704.
As we see, up to at least 4 decimal places, this solution gives the same answer as obtained above. This indicates the importance of the correction for continuity; if it is ignored, we obtain 0.4714, an answer that is almost 10% lower than the actual answer.
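As an aside, 2Φ(z) − 1 can be checked with the error function, since Φ(x) = (1 + erf(x/√2))/2 (a sketch, not part of the book's solution):

```python
import math

# Check of 2*Phi(0.79) - 1 for the continuity-corrected bound:
# 2*Phi(z) - 1 = erf(z / sqrt(2)).
z = (22.5 - 20) / math.sqrt(10)       # approximately 0.79
prob = math.erf(z / math.sqrt(2))     # = 2*Phi(z) - 1
```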
Section 7.3
2. The time between two consecutive patients arriving for flu shots is exponential with mean 1/λ = 1/3 hour. Since the exponential is memoryless, knowing that the last patient who took a flu shot arrived 45 minutes ago does not change the distribution of X, the time until the next patient arrives to have a flu shot. So X is still an exponential random variable with parameter λ = 3, and the desired probability is
Section 7.4
1. Note that
f(x) = (1/2)e^{−x/2}(x/2)³ / 6 = (1/2)e^{−x/2}(x/2)^{4−1} / Γ(4),  x ≥ 0.
Chapter 7 Solutions to Self-Quiz Problems 4
This shows that X is gamma with parameters r = 4 and λ = 1/2. So E(X) = r/λ = 8 and σ_X = √r/λ = 2/(1/2) = 4. Therefore, the lifetime of a randomly selected vacuum tube manufactured by this company is gamma with mean 8000 hours and standard deviation 4000 hours.
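A numeric sanity check of the mean and standard deviation directly from the density (a sketch using the midpoint rule; the step size and cutoff are arbitrary):

```python
import math

# Check that f(x) = (1/2)e^(-x/2)(x/2)^3 / 3! has mean 8 and sd 4.
def f(x):
    return 0.5 * math.exp(-x / 2) * (x / 2) ** 3 / math.factorial(3)

h = 0.01
mean = m2 = 0.0
x = h / 2
while x < 200:            # tail beyond 200 is negligible
    mean += x * f(x) * h
    m2 += x * x * f(x) * h
    x += h
sd = (m2 - mean * mean) ** 0.5
```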
Section 7.5
1. Let Y be a binomial random variable with parameters 10 and 0.12. Then by Theorem 7.2,
P(X ≥ 0.12) = P(Y < 1) = P(Y = 0) = C(10, 0)(0.12)⁰(0.88)¹⁰ ≈ 0.28.
we have that
f(t) = (1/2)t^{1/2 − 1}(1 − t)^{1 − 1},  0 < t < 1,
is the density function of a beta random variable with parameters 1/2 and 1.
Section 7.6
1. Let X be the delivery lead time. Since the hazard function of X is constant, X
is an exponential random variable with parameter λ = 1/8. Since exponential is
memoryless, the desired probability is
P(X < 2) = ∫_0^2 (1/8)e^{−x/8} dx = 1 − e^{−1/4} ≈ 0.22.
= (2/(1 − t)) exp([2 ln(1 − u)]_0^t)
= (2/(1 − t)) exp(2 ln(1 − t))
= (2/(1 − t)) exp(ln(1 − t)²)
= (2/(1 − t)) · (1 − t)² = 2(1 − t).
Therefore,
f(t) = 2(1 − t) if 0 < t < 1, and f(t) = 0 otherwise.
Chapter 7 Solutions to Self-Test Problems 6
1. Nolan will take bus route B26 if he arrives either between 1:15 P.M. and 1:20 P.M., excluding 1:15 P.M., or between 1:30 P.M. and 1:40 P.M., excluding 1:30 P.M. The problem is equivalent to the following: A random number is selected from the interval [5, 42]. What is the probability that it belongs to (15, 20] ∪ (30, 40]? The answer is
(20 − 15)/(42 − 5) + (40 − 30)/(42 − 5) = 15/37 ≈ 0.41.
2. Let Y be a binomial random variable with parameters 6 + 10 − 1 = 15 and 6/14. By Theorem 7.2,
P(X ≤ 6/14) = P(Y ≥ 6).
Therefore, the desired probability is
P(X > 6/14) = P(Y < 6) = Σ_{i=0}^{5} C(15, i)(6/14)^i (1 − 6/14)^{15−i} ≈ 0.318.
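The binomial sum can be checked directly (a sketch, not part of the book's solution):

```python
import math

# Check of P(Y < 6) for Y ~ Binomial(15, 6/14).
p = 6 / 14
prob = sum(math.comb(15, i) * p ** i * (1 - p) ** (15 - i) for i in range(6))
```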
3. Let X be the length of the residence of a family selected at random from this town. Since
P(X ≥ 96) = P(Z ≥ (96 − 80)/30) = 0.298,
using the binomial distribution, the desired probability is
1 − Σ_{i=0}^{2} C(12, i)(0.298)^i (1 − 0.298)^{12−i} = 0.742.
4. The probability that a sensor lasts longer than t units of time is ∫_t^∞ λe^{−λx} dx = e^{−λt}. The probability that exactly k sensors are working at time t is
C(n, k)(e^{−λt})^k (1 − e^{−λt})^{n−k}.
5. (a) Let X be the waiting time of Kayla at the bus station. The desired probability is
P(X > 13 | X > 9) = P(X > 13, X > 9)/P(X > 9) = P(X > 13)/P(X > 9)
= [(15 − 13)/(15 − 0)] / [(15 − 9)/(15 − 0)] = 2/6 = 1/3 ≈ 0.33.
(b) In this case, X is an exponential random variable with mean 15 (λ = 1/15). By the memoryless property of the exponential, the desired probability is
P(X > 4) = 1 − P(X ≤ 4) = 1 − (1 − e^{−(1/15)·4}) = e^{−4/15} ≈ 0.77.
6. Let X be the number of people who visit this website on the given day. Clearly, X is binomial with parameters n = 10,000,000 and p = 0.002. Since E(X) = (10,000,000)(0.002) = 20,000 and σ_X = √((10,000,000)(0.002)(1 − 0.002)) = 141.28, by correction for continuity and the De Moivre–Laplace theorem,
P(X ≥ 20,200) ≈ P(X ≥ 20,199.5) = P((X − 20,000)/141.28 ≥ (20,199.5 − 20,000)/141.28)
= P((X − 20,000)/141.28 ≥ 1.41) ≈ P(Z ≥ 1.41)
= 1 − Φ(1.41) ≈ 1 − 0.9207 = 0.0793.
8. (a) Clearly,
F̄(t) = P(X > t) = ∫_t^∞ (2e^{−2x} + 4e^{−4x} − 6e^{−6x}) dx = e^{−2t} + e^{−4t} − e^{−6t}.
This yields
λ(t) = f(t)/F̄(t) = (2e^{−2t} + 4e^{−4t} − 6e^{−6t}) / (e^{−2t} + e^{−4t} − e^{−6t}),  t ≥ 0.
(b) The instantaneous failure rate of the system, if it has survived 6,000 hours, is
λ(0.6) ≈ 2.198. An approximate probability for the system to fail within 100 hours
if it has survived 6000 hours is (1/100) · λ(0.6) ≈ 0.02198.
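Note that ∫_t^∞ 6e^{−6x} dx = e^{−6t}, so the survival function is F̄(t) = e^{−2t} + e^{−4t} − e^{−6t} (check: F̄(0) = 1). A numeric check of λ(0.6) with this F̄:

```python
import math

# Hazard rate lambda(t) = f(t) / F_bar(t) for
# f(x) = 2e^(-2x) + 4e^(-4x) - 6e^(-6x), F_bar(t) = e^(-2t) + e^(-4t) - e^(-6t).
def hazard(t):
    num = 2 * math.exp(-2 * t) + 4 * math.exp(-4 * t) - 6 * math.exp(-6 * t)
    den = math.exp(-2 * t) + math.exp(-4 * t) - math.exp(-6 * t)
    return num / den

lam = hazard(0.6)
```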
9. Let X be the number of voters in the random sample who voted for Tasoula. The random variable X is binomial with parameters n = 100 and p = 1000/5000 = 1/5. Since np = 20 and √(np(1 − p)) = 4, using correction for continuity, the desired probability, P(X > 15), can be approximated by the De Moivre–Laplace theorem as follows.
P(X > 15) = P(X ≥ 16) = P(X > 15.5) = P((X − 20)/4 > (15.5 − 20)/4)
= P((X − 20)/4 > −1.13) = P(−1.13 < (X − 20)/4 < ∞)
≈ (1/√(2π)) ∫_{−1.13}^{∞} e^{−x²/2} dx
= (1/√(2π)) ∫_{−∞}^{∞} e^{−x²/2} dx − (1/√(2π)) ∫_{−∞}^{−1.13} e^{−x²/2} dx
= 1 − Φ(−1.13) = 1 − (1 − Φ(1.13)) = Φ(1.13) ≈ 0.8708.
10. Suppose that the lifetime of a chip manufactured by the company's own plant is gamma with parameters r₁ and λ₁. We are given that r₁/λ₁ = 4 and √r₁/λ₁ = 2. These two equations imply that r₁ = 4 and λ₁ = 1. Thus the probability density function of the lifetime of a chip manufactured by the company is
f(x) = e^{−x}x³/Γ(4) = (1/6)x³e^{−x},  x ≥ 0.
Suppose that the lifetime of a chip purchased from the supplier is gamma with parameters r₂ and λ₂. We are given that r₂/λ₂ = 6 and √r₂/λ₂ = 6. These two equations imply that r₂ = 1 and λ₂ = 1/6. Thus the probability density function of the lifetime of a chip purchased from the supplier is
g(x) = (1/6)e^{−x/6}(x/6)⁰/Γ(1) = (1/6)e^{−x/6},  x ≥ 0.
Now, let X be the lifetime of the chip of a randomly selected cell phone manufactured
by the wireless phone company. Let A be the event that the chip was manufactured by
the company’s own plant. Let B be the event that it was purchased from the supplier.
The probability that the chip of the cell phone fails before the warranty period is
Therefore, the company must replace 9.8% of the cell phones it sells within a year due to chip failure under warranty.
SOLUTIONS TO SELF-QUIZZES AND SELF-TESTS
CHAPTER 8
Section 8.1
2. (a) By definition,
f_Y(y) = ∫_1^5 (1/432)(2x + 3y) dx = (1/36)y + 1/18,  1 < y < 7.
(b) We are asked to find E(X + Y), which is calculated as follows:
E(X + Y) = (1/432) ∫_1^5 (∫_1^7 (x + y)(2x + 3y) dy) dx
= (1/432) ∫_1^5 (12x² + 120x + 342) dx = 413/54 ≈ 7.648.
Therefore, the expected value of the sum of the annual losses under medical coverage and under comprehensive and collision is approximately $7,648,148.
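As a sanity check, the double integral ∫∫ (x + y)(2x + 3y)/432 dy dx over 1 < x < 5, 1 < y < 7 can be evaluated numerically with a midpoint grid; it gives 413/54 ≈ 7.648 (note the inner integral in y is 12x² + 120x + 342):

```python
# Midpoint-rule check of E(X + Y) for f(x, y) = (2x + 3y)/432
# on 1 < x < 5, 1 < y < 7; also verifies the density integrates to 1.
h = 0.01
total = mass = 0.0
x = 1 + h / 2
while x < 5:
    y = 1 + h / 2
    while y < 7:
        fxy = (2 * x + 3 * y) / 432
        mass += fxy * h * h
        total += (x + y) * fxy * h * h
        y += h
    x += h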
Section 8.2
1. If f(x, y) = f_X(x)f_Y(y), then X and Y are independent. Otherwise, they are not. Note that
f_X(x) = ∫_0^2 (1/2)y³e^{−2x} dy = 2e^{−2x},  x ≥ 0;
f_Y(y) = ∫_0^∞ (1/2)y³e^{−2x} dx = (1/4)y³,  0 ≤ y ≤ 2.
Since f(x, y) = f_X(x)f_Y(y), X and Y are independent.
Chapter 8 Solutions to Self-Quiz Problems 3
2. Since X and Y are independent, the joint probability density function of X and Y is
f(x, y) = λe^{−λx} · µe^{−µy} = λµe^{−λx}e^{−µy} if x > 0, y > 0, and f(x, y) = 0 otherwise.
Section 8.3
where 0 ≤ x ≤ 5, 0 ≤ y ≤ 7, and 4 ≤ x + y ≤ 8.
Now p_{X|Y}(x|y) = p(x, y)/p_Y(y), where
p_Y(y) = P(Y = y) = C(7, y)C(9, 8 − y) / C(16, 8),  0 ≤ y ≤ 8.
Hence
p_{X|Y}(x|y) = C(5, x)C(4, 8 − x − y) / C(9, 8 − y),
where 0 ≤ x ≤ 5, 0 ≤ y ≤ 7, and 4 ≤ x + y ≤ 8.
2. By definition,
f_{X|Y}(x|y) = f(x, y)/f_Y(y),
where
f_Y(y) = ∫_0^1 f(x, y) dx = ∫_0^1 [1 + α(1 − 2x)(1 − 2y)] dx = 1,  0 < y < 1.
Note that, as expected, Y is a uniform random variable over the interval (0, 1). Thus
f_{X|Y}(x|y) = f(x, y)/f_Y(y) = 1 + α(1 − 2x)(1 − 2y),  0 < x < 1, 0 < y < 1.
Therefore, for 0 < y < 1,
E(X | Y = y) = ∫_0^1 x f_{X|Y}(x|y) dx = ∫_0^1 x[1 + α(1 − 2x)(1 − 2y)] dx = (1/6)(2αy − α + 3).
Section 8.4
1. Let f(x, y) be the joint probability density function of X and Y. We are given that
f_X(x) = 1 if 0 < x < 1, and 0 otherwise;
f_Y(y) = 1 if 0 < y < 1, and 0 otherwise.
Since X and Y are independent,
f(x, y) = f_X(x)f_Y(y) = 1 if 0 < x < 1 and 0 < y < 1, and 0 otherwise.
Consider the system of two equations in two unknowns
x + y = u
x − y = v.
This system has the unique solution
x = (u + v)/2,  y = (u − v)/2,
and it defines a one-to-one transformation of R = {(x, y) : 0 < x < 1, 0 < y < 1} onto the region Q = {(u, v) : 0 < u + v < 2, 0 < u − v < 2}, where Q is sketched in Figure 8a.
Now
J = det[ 1/2  1/2 ; 1/2  −1/2 ] = −1/2 ≠ 0.
Therefore, by Theorem 8.8, g(u, v), the joint probability density function of U and V, is given by
g(u, v) = f(u, v)|J| = 1/2 if 0 < u + v < 2 and 0 < u − v < 2, and 0 otherwise.
Equivalently,
g(u, v) = 1/2 if 0 < u < 1 and −u < v < u;
g(u, v) = 1/2 if 1 < u < 2 and u − 2 < v < 2 − u;
g(u, v) = 0 otherwise.
2. Since
f(x, y) = e^{−x} · 2e^{−2y} if x ≥ 0, y ≥ 0, and 0 otherwise,
and f_X(x) = e^{−x} for x ≥ 0, and f_Y(y) = 2e^{−2y} for y ≥ 0, we have that X and Y are independent random variables. To find h, we use the convolution theorem. Since
f_Y(t − x) = 2e^{−2(t−x)} if 0 ≤ x ≤ t, and 0 otherwise,
h(t) = ∫_0^t 2e^{−2(t−x)} · e^{−x} dx = 2e^{−2t} ∫_0^t e^{x} dx = 2(e^{−t} − e^{−2t}),  t ≥ 0.
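A numeric sketch confirming that h is a density and agrees with the convolution integral (step sizes are arbitrary):

```python
import math

# Check that h(t) = 2(e^(-t) - e^(-2t)) integrates to 1, and that it matches
# the convolution of f_X(x) = e^(-x) and f_Y(y) = 2e^(-2y) at t = 1.
h = lambda t: 2 * (math.exp(-t) - math.exp(-2 * t))

dt = 0.001
mass = sum(h((i + 0.5) * dt) * dt for i in range(int(50 / dt)))

t = 1.0
conv = sum(2 * math.exp(-2 * (t - x)) * math.exp(-x) * dt
           for x in ((i + 0.5) * dt for i in range(int(t / dt))))
```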
Chapter 8 Solutions to Self-Test Problems 6
1. As Figure 8b shows, the point is selected from the disk centered at the origin with radius 2. The probability that it falls inside E, the disk centered at the origin with radius 1, is
area(E)/area(S) = π(1)²/(π(2)²) = 1/4.
Similarly, the probability that it falls inside the square F is
area(F)/area(S) = (2√2 · 2√2)/(π(2)²) = 2/π.
Note that, by the Pythagorean theorem, the length of each side of the square is √(2² + 2²) = 2√2.
2. We will show that E(XY) ≠ E(X)E(Y). That implies that X and Y are not independent. We have
E(XY) = 60 ∫_0^1 x³ (∫_0^{1−x} y² dy) dx = 20 ∫_0^1 x³(1 − x)³ dx = 1/7;
E(X) = 60 ∫_0^1 x³ (∫_0^{1−x} y dy) dx = 30 ∫_0^1 x³(1 − x)² dx = 1/2;
E(Y) = 60 ∫_0^1 x² (∫_0^{1−x} y² dy) dx = 20 ∫_0^1 x²(1 − x)³ dx = 1/3.
We have established that
1/7 = E(XY) ≠ E(X)E(Y) = (1/2) · (1/3) = 1/6.
So X and Y are not independent.
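The three Beta-type integrals can be checked exactly, since ∫_0^1 x^a(1 − x)^b dx = a! b! / (a + b + 1)! (a sketch in exact rational arithmetic):

```python
from fractions import Fraction
from math import factorial

# Exact check of the three integrals above via the Beta function identity.
def beta_int(a, b):
    # integral_0^1 x^a (1-x)^b dx = a! b! / (a+b+1)!
    return Fraction(factorial(a) * factorial(b), factorial(a + b + 1))

exy = 20 * beta_int(3, 3)   # E(XY)
ex = 30 * beta_int(3, 2)    # E(X)
ey = 20 * beta_int(2, 3)    # E(Y)
```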
4. Let g(x, y) be the joint probability density function of X and Y. Let g_X(x) and g_Y(y) be the marginal probability density functions of X and Y, respectively. Since X and Y are independent, g(x, y) = g_X(x)g_Y(y). Since
g_X(x) = 8/x² if 4 ≤ x ≤ 8, and 0 otherwise,
and
g_Y(y) = 8/y² if 4 ≤ y ≤ 8, and 0 otherwise,
we have that
g(x, y) = 64/(x²y²) if 4 ≤ x ≤ 8 and 4 ≤ y ≤ 8, and 0 otherwise.
Hence the desired probability, P(X + Y ≥ 13), is the double integral of g(x, y) over the shaded area of Figure 8c.
Figure 8c The shaded area is the region of integration of Self-Test Problem #4.
So
P(X + Y ≥ 13) = ∫_5^8 (∫_{13−x}^8 64/(x²y²) dy) dx
= ∫_5^8 (64/x²)(1/(13 − x) − 1/8) dx ≈ 0.125.
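The remaining one-dimensional integral is easy to check numerically (a midpoint-rule sketch):

```python
# Midpoint-rule check of P(X + Y >= 13) = integral_5^8 (64/x^2)(1/(13-x) - 1/8) dx,
# where the inner integral in y has already been done exactly.
h = 0.005
prob = 0.0
x = 5 + h / 2
while x < 8:
    prob += (64 / x ** 2) * (1 / (13 - x) - 1 / 8) * h
    x += h
```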
5. (a) We know that f(x, y), the joint probability density function of X and Y, is given by
f(x, y) = 1/area(D) if (x, y) ∈ D, and 0 elsewhere.
Therefore,
f(x, y) = 1/π if x² + y² ≤ 1, and 0 elsewhere.
By definition, f_{X|Y}(x|y) = f(x, y)/f_Y(y), where, as Figure 8d shows,
f_Y(y) = ∫_{−√(1−y²)}^{√(1−y²)} f(x, y) dx = (2/π)√(1 − y²),  −1 ≤ y ≤ 1.
Therefore,
f_{X|Y}(x|y) = (1/π) / ((2/π)√(1 − y²)) = 1/(2√(1 − y²)),  −√(1 − y²) ≤ x ≤ √(1 − y²).
As a function of x, f_{X|Y}(x|y) is a constant, and this is the probability density function of a uniform random variable over the interval [−√(1 − y²), √(1 − y²)].
Thus
h(t) = (1/16)((1/6)t³ − t + 2/3) if 2 < t < 4;
h(t) = (1/16)(−(1/6)t³ + 9t − 18) if 4 ≤ t < 6;
h(t) = 0 elsewhere.
Since X and Y are independent, the joint probability density function of X and Y is given by
f(x, y) = λe^{−λx} · µe^{−µy} = λµe^{−λx}e^{−µy} if x > 0, y > 0, and 0 otherwise.
Therefore, F(t) is the double integral of f(x, y) over the shaded area of Figure 8f between the lines x − y = t and x − y = −t. Thus
P(X − t ≤ Y ≤ X + t)
= ∫_0^t λµe^{−λx} (∫_0^{x+t} e^{−µy} dy) dx + ∫_t^∞ λµe^{−λx} (∫_{x−t}^{x+t} e^{−µy} dy) dx
= ∫_0^t λe^{−λx} (∫_0^{x+t} µe^{−µy} dy) dx + ∫_t^∞ λe^{−λx} (∫_{x−t}^{x+t} µe^{−µy} dy) dx
= ∫_0^t λe^{−λx}(1 − e^{−µt−µx}) dx + (e^{µt} − e^{−µt}) ∫_t^∞ λe^{−λx}e^{−µx} dx
= (1 − e^{−λt}) − e^{−µt}(λ/(λ+µ))(1 − e^{−λt−µt}) + (e^{µt} − e^{−µt})(λ/(λ+µ))e^{−λt−µt}
= 1 − (λ/(λ+µ))e^{−µt} − (µ/(λ+µ))e^{−λt}.
8. Let f(x, y) be the joint probability density function of X and Y. We are given that
f_Y(y) = 2e^{−2y} if y > 0, and 0 otherwise,
and
f_{X|Y}(x|y) = ye^{−yx} if x ≥ 0, y > 0, and 0 otherwise.
Therefore,
f(x, y) = f_{X|Y}(x|y)f_Y(y) = 2ye^{−(x+2)y} if x ≥ 0, y > 0, and 0 otherwise.
We are interested in
f_{Y|X}(y|x) = f(x, y)/f_X(x),
where
f_X(x) = ∫_0^∞ f(x, y) dy = ∫_0^∞ 2ye^{−(x+2)y} dy
= [−(2/(x+2))ye^{−(x+2)y} − (2/(x+2)²)e^{−(x+2)y}]_0^∞ = 2/(x+2)²,  x ≥ 0.
Thus
f_{Y|X}(y|x) = 2ye^{−(x+2)y} / (2/(x+2)²) = (x+2)²ye^{−(x+2)y}
= (x+2)e^{−(x+2)y} · [(x+2)y]^{2−1} / Γ(2).
Therefore, the probability density function of the rate at which the emergency room serves the non-life-threatening patients, when the waiting time is the constant x, is gamma with parameters (2, x + 2). Its expected value and standard deviation are 2/(x + 2) and √2/(x + 2), respectively.
9. Let V = X. We will first find the joint probability density function of U and V, and then we calculate the marginal probability density function of U. To apply Theorem 8.8, let h₁(x, y) = x/(x + y) and h₂(x, y) = x. The system of equations
x/(x + y) = u
x = v
has the unique solution x = v, y = (v/u) − v, and
J = det[ 0  1 ; −v/u²  (1/u) − 1 ] = v/u² ≠ 0.
Now x > 0, y > 0, u = x/(x + y), and v = x imply that v > 0 and 0 < u < 1. Hence, by Theorem 8.8, g(u, v), the joint probability density function of U and V, is given by
g(u, v) = (4v/u²)e^{−2v/u},  0 < u < 1, v > 0.
Hence
g_U(u) = (4/u²) ∫_0^∞ ve^{−2v/u} dv = (4/u²)[−(u/2)ve^{−2v/u} − (u²/4)e^{−2v/u}]_0^∞ = (4/u²)(u²/4) = 1,  0 < u < 1.
This shows that U is a uniform random variable over the interval (0, 1).
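The form g(u, v) = (4v/u²)e^{−2v/u} is consistent with X and Y being i.i.d. exponential with rate 2; under that assumption, a Monte Carlo sketch of the uniformity of U = X/(X + Y):

```python
import random

# Monte Carlo check that U = X/(X+Y) is uniform on (0, 1) when X, Y are
# i.i.d. exponential (rate 2 assumed here, matching g(u, v) above).
random.seed(7)
n = 200_000
u = [(x := random.expovariate(2.0)) / (x + random.expovariate(2.0))
     for _ in range(n)]
mean_u = sum(u) / n
frac_below_quarter = sum(1 for v in u if v < 0.25) / n
```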
10. Let f(x, y) be the joint probability density function of X and Y. We are given that
f_X(x) = 1 if 0 < x < 1, and 0 elsewhere;
f_Y(y) = 1 if 0 < y < 1, and 0 elsewhere.
Since X and Y are independent,
f(x, y) = f_X(x)f_Y(y) = 1 if 0 < x < 1 and 0 < y < 1, and 0 elsewhere.
Now the system of equations
x + y = u
x/(x + y) = v
has the unique solution x = uv and y = u − uv. We have that u > 0, v > 0, and 1 − v > 0. That is, u > 0, v > 0, and v < 1. Also
0 < x < 1 ⟺ 0 < uv < 1 ⟺ 0 < u < 1/v,
0 < y < 1 ⟺ 0 < u(1 − v) < 1 ⟺ 0 < u < 1/(1 − v).
Now, if 0 < v < 1/2, then 1/(1 − v) < 1/v, so the binding constraint is 0 < u < 1/(1 − v); and if 1/2 < v < 1, it is 0 < u < 1/v. Thus the system of two equations in two unknowns
x + y = u
x/(x + y) = v
defines a one-to-one correspondence from R = {(x, y) : 0 < x < 1 and 0 < y < 1} to
Q = {(u, v) : 0 < u < 1/(1 − v), 0 < v < 1/2} ∪ {(u, v) : 0 < u < 1/v, 1/2 < v < 1}.
Since x = uv and y = u − uv, we have
J = det[ v  u ; 1 − v  −u ] = −vu − u + vu = −u ≠ 0.
Hence, by Theorem 8.8, g(u, v), the joint probability density function of U and V, is given by
g(u, v) = f(uv, u − uv)|J| = u,  (u, v) ∈ Q.
(b) For 0 < v < 1/2,
g_V(v) = ∫_0^{1/(1−v)} u du = 1/(2(1 − v)²).
CHAPTER 9
Section 9.1
1. Let K = {(x, y, z, u, v) : 0 ≤ x ≤ 15, 0 ≤ y ≤ 10, 0 ≤ z ≤ 6, 0 ≤ u ≤ 4, 0 ≤ v ≤ 5, x + y + z + u + v = 20}. Then p(x, y, z, u, v), the joint probability mass function of X, Y, Z, U, and V, is given by
p(x, y, z, u, v) = C(15, x)C(10, y)C(6, z)C(4, u)C(5, v) / C(40, 20),  (x, y, z, u, v) ∈ K.
The marginal probability mass function of U and V is given by
p_{U,V}(u, v) = C(4, u)C(5, v)C(31, 20 − u − v) / C(40, 20),  0 ≤ u ≤ 4, 0 ≤ v ≤ 5.
2. We have
P(X < Y < Z) = ∫_0^∞ ∫_x^∞ (∫_y^∞ 2ye^{−(2x+y+z)} dz) dy dx
= ∫_0^∞ (∫_x^∞ 2ye^{−2x−2y} dy) dx
= (1/2) ∫_0^∞ (2x + 1)e^{−4x} dx = 3/16 ≈ 0.19.
Section 9.2
where
f(x) = 1 if 0 < x < 1, and f(x) = 0 otherwise.
Chapter 9 Solutions to Self-Quiz Problems 3
So
f₁₂(x₁, x₂) = 2 if 0 < x₁ < x₂ < 1, and 0 otherwise.
The desired expected value is calculated as follows:
E(X₍₂₎ − X₍₁₎) = ∫_0^1 ∫_{x₁}^1 (x₂ − x₁) · 2 dx₂ dx₁
= ∫_0^1 (x₁² − 2x₁ + 1) dx₁ = [x₁³/3 − x₁² + x₁]_0^1 = 1/3.
Therefore, by Theorem 9.5, the probability density functions of X₍₁₎ and X₍₅₎ are calculated as follows.
f₁(x) = (5!/(0! 4!)) f(x)[F(x)]⁰[1 − F(x)]^{5−1} = 5f(x)[1 − F(x)]⁴ = (5/12)(1 − x/12)⁴,  0 < x < 12,
and f₁(x) = 0 otherwise.
f₅(x) = (5!/(4! 0!)) f(x)[F(x)]⁴[1 − F(x)]^{5−5} = 5f(x)[F(x)]⁴ = (5/12)(x/12)⁴,  0 < x < 12,
and f₅(x) = 0 otherwise. The probability that the winning horse passed the finish line between 2:10 P.M. and 2:11 P.M. is
∫_0^1 (5/12)(1 − x/12)⁴ dx = 87,781/248,832 ≈ 0.35.
The probability that the time that the last horse crossed the finish line was after 2:21 P.M. is
∫_{11}^{12} (5/12)(x/12)⁴ dx = 87,781/248,832 ≈ 0.35.
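Both integrals evaluate to 1 − (11/12)⁵, which can be checked exactly (a sketch in rational arithmetic):

```python
from fractions import Fraction

# Both probabilities equal 1 - (11/12)^5; check against 87,781/248,832.
p = 1 - Fraction(11, 12) ** 5
```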
Section 9.3
2. Let X₁, X₂, and X₃ be the number of personal, work-related, and unsolicited commercial emails among the 25 emails in Samantha's inbox. The probability we are asked to calculate is
Σ_{i=0}^{4} P(X₁ = 15 − i, X₂ = 10, X₃ = i)
= Σ_{i=0}^{4} [25! / ((15 − i)! 10! i!)] (0.45)^{15−i}(0.40)^{10}(0.15)^i ≈ 0.11.
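The multinomial sum can be checked directly (a sketch, not part of the book's solution):

```python
from math import factorial

# Check of the multinomial sum for P(X1 = 15 - i, X2 = 10, X3 = i), i = 0..4.
total = sum(
    factorial(25) // (factorial(15 - i) * factorial(10) * factorial(i))
    * 0.45 ** (15 - i) * 0.40 ** 10 * 0.15 ** i
    for i in range(5)
)
```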
Chapter 9 Solutions to Self-Test Problems 5
1. (a) The volume of a sphere with radius R is (4/3)πR³. Since the volume of the unit sphere is (4/3)π, by (9.10),
f(x, y, z) = 3/(4π) if x² + y² + z² ≤ 1, and 0 otherwise.
(b) f_{X,Y}(x, y) = ∫_{−√(1−x²−y²)}^{√(1−x²−y²)} 3/(4π) dz = (3/(2π))√(1 − x² − y²),  x² + y² ≤ 1.
(c) f_Z(z) = ∫∫_{x²+y² ≤ 1−z²} 3/(4π) dy dx. However, it is much easier if we calculate this
f (x1 , x2 , . . ., xn ) = fX1 (x1 )fX2 |X1 (x2 |x1 )fX3 |X1 ,X2 (x3 |x1 , x2 )
· · · fXn |X1 ,X2 ,...,Xn−1 (xn |x1 , x2 , . . ., xn−1 ).
Now
∫_0^∞ ∫_0^∞ ∫_0^∞ ∫_0^∞ 1/(1 + x + y + z + t)⁶ dt dz dy dx
= ∫_0^∞ ∫_0^∞ ∫_0^∞ 1/(5(1 + x + y + z)⁵) dz dy dx
= ∫_0^∞ ∫_0^∞ 1/(20(1 + x + y)⁴) dy dx
= ∫_0^∞ 1/(60(1 + x)³) dx = 1/120.
Therefore, c · (1/120) = 1 implies that c = 120. Now we calculate P(X < Y < Z < T). We have
P(X < Y < Z < T) = 120 ∫_0^∞ ∫_x^∞ ∫_y^∞ ∫_z^∞ 1/(1 + x + y + z + t)⁶ dt dz dy dx
= 24 ∫_0^∞ ∫_x^∞ ∫_y^∞ 1/(1 + x + y + 2z)⁵ dz dy dx
= 3 ∫_0^∞ ∫_x^∞ 1/(1 + x + 3y)⁴ dy dx
= (1/3) ∫_0^∞ 1/(1 + 4x)³ dx = 1/24 ≈ 0.0417.
4. (a) Let X₁, X₂, ..., X₇ be the lifetimes of the components of the system. For 1 ≤ i ≤ 7, f, the probability density function of Xᵢ, and F, its distribution function, are given by
f(x) = e^{−x}, x ≥ 0;  F(x) = 1 − e^{−x}, x ≥ 0.
Note that the system is still functional two years from now if and only if X₍₇₎ > 2. By Remark 9.2, the probability density function of X₍₇₎ is given by
(b) At least three components of the system are functional two years from now if and only if X₍₅₎ > 2. By Theorem 9.5, the probability density function of X₍₅₎ is
f₅(x) = (7!/(4! 2!)) e^{−x}(1 − e^{−x})⁴[1 − (1 − e^{−x})]² = 105e^{−3x}(1 − e^{−x})⁴,  x ≥ 0.
Therefore,
P(X₍₅₎ > 2) = 105 ∫_2^∞ e^{−3x}(1 − e^{−x})⁴ dx ≈ 0.057.
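The tail integral can be checked numerically (a midpoint-rule sketch; the cutoff is arbitrary):

```python
import math

# Midpoint-rule check of P(X_(5) > 2) = 105 * integral_2^inf e^(-3x)(1-e^(-x))^4 dx.
h = 0.0005
prob = 105 * sum(
    math.exp(-3 * x) * (1 - math.exp(-x)) ** 4 * h
    for x in ((2 + (i + 0.5) * h) for i in range(int(30 / h)))
)
```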
5. Let X₁, X₂, and X₃ be, respectively, the number of physicians among the group who retired at ages 64 and below, 65–69, and 70 or later. Let p(x₁, x₂, x₃) be the joint probability mass function of X₁, X₂, and X₃. The median age of the five randomly selected retired physicians is below 65 if and only if the ages of at least three of them are below 65. Therefore, the desired probability is
CHAPTER 10
Section 10.1
1. Let X be the number of disk storage wallets that Shiante should search to find the
DVD. For i = 1, 2, . . ., 11, let Xi = 1 if the DVD is stored in the ith wallet that
Shiante searches, and Xi = 0, otherwise. Then
X = 1 · X1 + 2 · X2 + · · · + 11 · X11 ,
and, therefore,
However,
E(Y²) = E(Y) = P(X > 0).
Thus
E(X) ≤ √(E(X²)P(X > 0)).
This implies that
E(X²)P(X > 0) ≥ [E(X)]².
Hence
P(X > 0) ≥ [E(X)]²/E(X²),
or
1 − P(X > 0) ≤ 1 − [E(X)]²/E(X²),
which establishes the relation
P(X = 0) ≤ (E(X²) − [E(X)]²)/E(X²) = Var(X)/E(X²).
Chapter 10 Solutions to Self-Quiz Problems 3
Section 10.2
1. Clearly, E(Xᵢ) = E(Xⱼ) = 1/4. For 1 ≤ i ≤ 8, let Aᵢ be the event that the ith card drawn is a heart. Then XᵢXⱼ = 1 if AᵢAⱼ occurs, and 0 otherwise. Therefore,
E(XᵢXⱼ) = P(AᵢAⱼ) = P(Aⱼ | Aᵢ)P(Aᵢ) = (12/51) · (1/4) = 3/51.
Hence
Cov(Xᵢ, Xⱼ) = E(XᵢXⱼ) − E(Xᵢ)E(Xⱼ) = 3/51 − 1/16 = −1/272.
2. Suppose that X and Y are the lifetimes of two devices each with mean 1/λ. V is the
time when the first device dies; U − V is the additional lifetime of the other device,
which, by the memoryless property of the exponential, is itself an exponential random
variable with mean 1/λ independently of V . So, by independence of U − V and V ,
Cov(U − V, V ) = 0. Hence
Note that
P(V > t) = P(min(X, Y) > t)
implies that V is exponential with mean 1/(2λ) and variance 1/(4λ²). Therefore,
Cov(U, V) = Var(V) = 1/(4λ²).
Section 10.3
ρ(X, Y) = Cov(X, Y)/(σ_X σ_Y) = 19/18 > 1.
2. Note that
Var(X₁ + X₂ + ··· + Xₙ) = Σ_{i=1}^{n} Var(Xᵢ) + 2 Σ_{i<j} Cov(Xᵢ, Xⱼ)
= 4n + 2 Σ_{i=1}^{n−1} Σ_{j=i+1}^{n} Cov(Xᵢ, Xⱼ).
Now
ρ(Xᵢ, Xⱼ) = Cov(Xᵢ, Xⱼ)/(σ_{Xᵢ}σ_{Xⱼ}) = Cov(Xᵢ, Xⱼ)/4 = −1/16
implies that Cov(Xᵢ, Xⱼ) = −1/4. Since Σ_{i=1}^{n−1} Σ_{j=i+1}^{n} Cov(Xᵢ, Xⱼ) has
(n − 1) + (n − 2) + ··· + 1 = (n − 1)n/2
identical terms, we have
Var(X₁ + X₂ + ··· + Xₙ) = 4n + (n − 1)n(−1/4) = n(17 − n)/4.
Section 10.4
1. Due to the memoryless property of exponential random variables, it does not matter
who cried last or how many times before. Let X be the time until Hannah cries next
and Y be the time until Joshua cries next; X and Y are independent exponential ran-
dom variables with parameters 10 and 7, respectively. By Remark 10.2, the desired
probability is
P(Y < X) = 7/(7 + 10) = 7/17.
2. Let midnight be labeled t = 0. It should be clear that the percentage of the cars
parked illegally in the time interval (0, 30) and caught is equal to the percentage of
all cars parked illegally and caught. Between 0 and 30, let Y be the time a random
car is illegally parked on the street; Y is a uniform random variable over (0, 30).
Suppose that the car parked illegally at Y is left parked for X units of time. Let A be
the event that it is caught. The desired percentage is 100 · P (A).
P(A) = ∫_0^{30} P(A | Y = y)(1/30) dy = ∫_0^{30} P(X > 30 − y)(1/30) dy
= ∫_0^{30} e^{−(1/10)(30−y)} · (1/30) dy = 0.317.
Therefore, the percentage of the cars that are parked illegally and caught is 31.7%.
Note that, as expected, the final answer does not depend on λ.
Chapter 10 Solutions to Self-Quiz Problems 5
Section 10.5
1. Let X be the period between two consecutive calls made to extension 1247; X is an exponential random variable with mean 1/λ. We have
P(N = i) = P(N₂(X) = i) = ∫_0^∞ P(N₂(X) = i | X = x) λe^{−λx} dx
= ∫_0^∞ P(N₂(x) = i) λe^{−λx} dx = ∫_0^∞ (e^{−µx}(µx)^i / i!) λe^{−λx} dx
= (µ^i λ / i!) ∫_0^∞ x^i e^{−(λ+µ)x} dx.
2. Let X be the annual calamity loss of the business in a random year. Let Y be the annual calamity loss of the business that year paid by the insurance company. We are interested in E(Y):
E(Y) = E[E(Y | X)] = ∫_{0.25}^{0.4} E(Y | X = x)f(x) dx + ∫_{0.4}^∞ E(Y | X = x)f(x) dx
= ∫_{0.25}^{0.4} x f(x) dx + ∫_{0.4}^∞ (0.4) f(x) dx
= ∫_{0.25}^{0.4} x · (1/(64x⁵)) dx + (0.4) ∫_{0.4}^∞ (1/(64x⁵)) dx
≈ 0.251953 + 0.061035 = 0.312988.
Therefore, the expected value of the annual loss of the business paid by the insurance company is $312,988.
3. Note that
Var(X₁ + X₂ + ··· + Xₙ) = Σ_{i=1}^{n} Var(Xᵢ) + 2 Σ_{i<j} Cov(Xᵢ, Xⱼ),
Chapter 10 Solutions to Self-Test Problems 7
where Σ_{i<j} Cov(Xᵢ, Xⱼ) = Σ_{i=1}^{n−1} Σ_{j=i+1}^{n} Cov(Xᵢ, Xⱼ) has
(n − 1) + (n − 2) + ··· + 1 = (n − 1)n/2
terms, and Cov(Xᵢ, Xⱼ) = ρ(Xᵢ, Xⱼ)σ_{Xᵢ}σ_{Xⱼ} = aσ². Hence
Var(X₁ + X₂ + ··· + Xₙ) ≥ 0
yields
nσ² + (n − 1)naσ² ≥ 0,
which implies that 1 + (n − 1)a ≥ 0, or a ≥ −1/(n − 1).
4. Clearly X is exponential with parameter λ. Let Y₁, Y₂, and Y₃ be the lifetimes of the electron guns of Dan's monitor. Then Y₁, Y₂, and Y₃ are independent exponential random variables each with parameter λ. Hence Y = min(Y₁, Y₂, Y₃) is exponential with parameter 3λ. This gives
P(X < Y) = λ/(λ + 3λ) = 1/4;
E[min(X, Y)] = 1/(4λ).
To find E[max(X, Y)], note that, by Remark 6.4,
E[max(X, Y)] = ∫_0^∞ P(max(X, Y) > t) dt
= ∫_0^∞ [1 − P(max(X, Y) ≤ t)] dt
= ∫_0^∞ [1 − P(X ≤ t, Y ≤ t)] dt
= ∫_0^∞ [1 − P(X ≤ t)P(Y ≤ t)] dt
= ∫_0^∞ [1 − (1 − e^{−λt})(1 − e^{−3λt})] dt
= [−(1/λ)e^{−λt} − (1/(3λ))e^{−3λt} + (1/(4λ))e^{−4λt}]_0^∞ = 13/(12λ).
5. Let X be the time until the seismograph placed in the New Mexico desert is replaced. Let L be its lifetime. L is an exponential random variable with parameter λ = 1/7. Let Y = 1 if L ≥ 5, and Y = 0 if L < 5.
= (1/(7(1 − e^{−5/7}))) [−(7t + 49)e^{−t/7}]_0^5 ≈ 2.205.
6. Region R is the shaded area of Figure 10a. Since the area of R is 9π − 4π = 5π, the joint probability density function of X and Y is given by
f(x, y) = 1/(5π) if (x, y) ∈ R, and 0 otherwise.
Since P(0 < X < 1 | Y > 0) ≠ P(0 < X < 1), X and Y are dependent. To show that X and Y are uncorrelated, we need to calculate E(X), E(Y), and E(XY). To calculate
E(X) = ∫∫_R x f(x, y) dx dy,
we convert from Cartesian to polar coordinates. We have x = r cos θ, y = r sin θ, and dx dy = r dr dθ. So
E(X) = ∫_0^{2π} ∫_2^3 (r cos θ)(1/(5π)) r dr dθ = (1/(5π)) ∫_0^{2π} cos θ (∫_2^3 r² dr) dθ
= (19/(15π)) ∫_0^{2π} cos θ dθ = (19/(15π)) [sin θ]_0^{2π} = 0.
Similarly, E(Y) = 0. The fact that E(X) = E(Y) = 0 must be evident since X and Y assume positive and negative values with the same probabilities. Now we calculate E(XY). We have
E(XY) = ∫_0^{2π} ∫_2^3 (r cos θ)(r sin θ)(1/(5π)) r dr dθ
= (1/(5π)) ∫_0^{2π} sin θ cos θ (∫_2^3 r³ dr) dθ
= (13/(4π)) ∫_0^{2π} sin θ cos θ dθ = (13/(4π)) ∫_0^{2π} (1/2) sin 2θ dθ
= (13/(8π)) [−(1/2) cos 2θ]_0^{2π} = (13/(8π))(−1/2 + 1/2) = 0.
Hence
Cov(X, Y) = E(XY) − E(X)E(Y) = 0 − 0 · 0 = 0.
This shows that X and Y are uncorrelated.
7. Let N(t) be the number of claims received by the insurance company at or prior to time t. Let Λ be the average number of claims during a one-month period. We are given that {N(t) : t ≥ 0} is a Poisson process with parameter Λ, where Λ is a gamma
random variable with parameters n and λ. To calculate the distribution of N(12), the desired random variable, we will condition on Λ:
P(N(12) = i) = ∫_0^∞ P(N(12) = i | Λ = x) (λe^{−λx}(λx)^{n−1}/(n − 1)!) dx
= ∫_0^∞ (e^{−12x}(12x)^i / i!) · (λe^{−λx}(λx)^{n−1}/(n − 1)!) dx
= (12^i λ^n / (i! (n − 1)!)) ∫_0^∞ e^{−(λ+12)x} x^{n+i−1} dx     [Now let y = (λ + 12)x.]
= (12^i λ^n / (i! (n − 1)!)) · (1/(λ + 12)^{n+i}) ∫_0^∞ e^{−y} y^{n+i−1} dy
= (12^i λ^n / (i! (n − 1)!)) · (1/(λ + 12)^{n+i}) · Γ(n + i)
= (12/(λ + 12))^i (λ/(λ + 12))^n · (n + i − 1)! / (i! (n − 1)!)
= C(n + i − 1, n − 1) (λ/(λ + 12))^n (12/(λ + 12))^i,  i = 0, 1, 2, ....
For i ≥ 0, letting j = n + i, we have shown that
P(n + N(12) = j) = C(j − 1, n − 1) (λ/(λ + 12))^n (12/(λ + 12))^{j−n},  j = n, n + 1, n + 2, ....
This shows that n + N(12) is negative binomial with parameters n and λ/(λ + 12).
= (1/C(12, 3)) Σ_{i=1}^{10} Σ_{j=i+1}^{11} Σ_{k=j+1}^{12} k = (1/220) Σ_{i=1}^{10} Σ_{j=i+1}^{11} Σ_{k=j+1}^{12} k.
Now
Σ_{k=j+1}^{12} k = Σ_{k=1}^{12} k − Σ_{k=1}^{j} k = (12 × 13)/2 − j(j + 1)/2 = (1/2)(156 − j² − j).
So
Σ_{j=i+1}^{11} Σ_{k=j+1}^{12} k = (1/2) Σ_{j=i+1}^{11} (156 − j² − j)
= (1/2) [Σ_{j=1}^{11} (156 − j² − j) − Σ_{j=1}^{i} (156 − j² − j)]
= (1/2) [1716 − (11 × 12 × 23)/6 − (11 × 12)/2 − 156i + i(i + 1)(2i + 1)/6 + i(i + 1)/2]
= 572 − 78i + i(i + 1)(i + 2)/6.
Therefore,
E(X) = (1/220) Σ_{i=1}^{10} [572 − 78i + i(i + 1)(i + 2)/6]
= 26 − (78/220) · (10 × 11)/2 + (1/220) Σ_{i=1}^{10} i(i + 1)(i + 2)/6
= 26 − 39/2 + (1/1320) · 4290 = 39/4 = 9.75.
9. Let X₁ = 1 for counting the first person that Amber will meet. Let X₂ be the number of people Amber will meet until she finds a person whose birthday is different from the birthday of the first person she met. Clearly, X₂ is a geometric random variable with probability of success p = 364/365. Let X₃ be the number of people Amber will meet until she finds a person whose birthday is different from the first two birthdays that she has already recorded. Then X₃ is geometric with p = 363/365. In general, for 2 ≤ i ≤ 365, let Xᵢ be the number of people Amber will meet until she finds a person whose birthday is different from the first i − 1 birthdays she has already recorded. Then Xᵢ is a geometric random variable with
p = (365 − (i − 1))/365 = (366 − i)/365.
Clearly, X, the number of people that Amber will need to meet in order to achieve her goal, satisfies the following equation:
X = X₁ + X₂ + X₃ + ··· + X₃₆₅.
Since the expected value of a geometric random variable with parameter p is 1/p, we
have
Now, since the variance of a geometric random variable with parameter p is (1 − p)/p², and since X₁, X₂, ..., X₃₆₅ are independent random variables,
Var(X) = Σ_{i=1}^{365} Var(Xᵢ) = Σ_{i=2}^{365} Var(Xᵢ)
= Σ_{i=2}^{365} [1 − (366 − i)/365] / [(366 − i)/365]² = Σ_{i=2}^{365} 365(i − 1)/(366 − i)² ≈ 216,417.19.
Therefore, σ_X = √Var(X) ≈ 465.20.
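The variance sum is easy to evaluate directly (a sketch, not part of the book's solution):

```python
# Check of Var(X) = sum over i = 2..365 of 365(i-1)/(366-i)^2.
var = sum(365 * (i - 1) / (366 - i) ** 2 for i in range(2, 366))
sd = var ** 0.5
```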
x      0          1          2    3    4          p_X(x)
0      625/1296   0          0    0    0          625/1296
1      0          500/1296   0    0    0          500/1296
4      0          0          0    0    1/1296     1/1296

i          0         1         2        3    4        6        8    9        12   16
P(XY = i)  625/1296  500/1296  75/1296  0    75/1296  10/1296  0    10/1296  0    1/1296
Now we will calculate all the quantities that we need to find ρ(X, Y). We have
σ_Y = √(716,375/1,679,616) = 6480√28,655 / 1,679,616.
E(XY) = 0 · (625/1296) + 1 · (500/1296) + 2 · (75/1296) + 3 · 0 + 4 · (75/1296)
+ 6 · (10/1296) + 8 · 0 + 9 · (10/1296) + 12 · 0 + 16 · (1/1296) = 31/36.
Cov(X, Y) = E(XY) − E(X)E(Y) = 31/36 − (2/3) · (779/1296) = 895/1944,
ρ(X, Y) = Cov(X, Y)/(σ_X σ_Y) = (895/1944) / [(√5/3)(6480√28,655 / 1,679,616)] ≈ 0.946.
As expected, this shows that there is almost a linear relation, hence a strong positive correlation, between X and Y.
CHAPTER 11
Section 11.1
This shows that Y is a binomial random variable with parameters n and q. This
should be obvious because Y is the number of failures in n independent Bernoulli
trials, where, for each trial, the probability of failure is q.
2. No! If M_X(t), the moment generating function of a random variable X, exists, then M_X(0) = E(e^{0·X}) = 1. However,
f(0) = (1/3 + 1/4)⁵ = (7/12)⁵ ≠ 1.
So f cannot be a moment generating function.
It follows that
E(X) = α · (1/8) + 2α · (3/8) + 3α · (1/2) = 19α/8,
E(X²) = α² · (1/8) + 4α² · (3/8) + 9α² · (1/2) = 49α²/8.
4. By relation (11.2),
M_X(t) = Σ_{n=0}^{∞} (2ⁿ/n!) tⁿ = Σ_{n=0}^{∞} (2t)ⁿ/n! = e^{2t}.
Section 11.2
1. Let Y be the total loss of the company from house fires in these regions next year. Then Y = X₁ + X₂ + X₃ and, by Theorem 11.7, Y ~ N(6, 9). Therefore,
P(4 ≤ Y ≤ 6) = P((4 − 6)/3 ≤ Z ≤ (6 − 6)/3)
= e^{α²t²/2} · e^{α²t²/2} ··· e^{α²t²/2} = e^{nα²t²/2}.
Clearly, M_Y(t) is the moment generating function of a standard normal random variable if and only if nα² = 1, or α = 1/√n.
Section 11.3
1. Let X be the number of days prior to the appointment day that Margie needs to call Dr. O'Neill's clinic. We want to find the smallest positive integer n for which P(X ≤ n) ≥ 0.95, or equivalently, P(X > n) ≤ 0.05. Since n is a positive integer, P(X > n) ≤ 0.05 if and only if P(X ≥ n + 1) ≤ 0.05. By Chebyshev's inequality,

P(X ≥ n + 1) = P(X − 10 ≥ n − 9) ≤ P(|X − 10| ≥ n − 9) ≤ 9/(n − 9)².

Thus it suffices to have 9/(n − 9)² ≤ 0.05, or (n − 9)² ≥ 180; that is, n ≥ 9 + √180 ≈ 22.42. So the smallest such n is 23.
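The smallest n satisfying the Chebyshev bound 9/(n − 9)² ≤ 0.05 can be found by a direct scan over the integers; this sketch just automates that search:

```python
# Find the smallest integer n > 9 with 9 / (n - 9)**2 <= 0.05,
# i.e., P(X >= n + 1) <= 0.05 under the Chebyshev bound above.
n = 10
while 9 / (n - 9) ** 2 > 0.05:
    n += 1
print(n)
```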
Y = X1 + X2 + · · · + X120 .
Section 11.4
1. By the strong law of large numbers,

lim_{n→∞} (X1 + X2 + · · · + Xn)/n = (a + b)/2 with probability 1,

and, similarly, lim_{n→∞} (X1² + X2² + · · · + Xn²)/n = (a² + ab + b²)/3 with probability 1. Hence

lim_{n→∞} (X1 + X2 + · · · + Xn)/(X1² + X2² + · · · + Xn²)
= lim_{n→∞} [(X1 + X2 + · · · + Xn)/n] / [(X1² + X2² + · · · + Xn²)/n]
= [(a + b)/2] / [(a² + ab + b²)/3] = 3(a + b) / (2(a² + ab + b²)).
2. Note that

E(Xn/n) = (1/n)E(Xn) = (1/n) · np = p,

Var(Xn/n) = (1/n²)Var(Xn) = (1/n²) · np(1 − p) = p(1 − p)/n.

Therefore, by Chebyshev's inequality,

P(|Xn/n − p| ≥ ε) ≤ Var(Xn/n)/ε² = p(1 − p)/(nε²).

This shows that, for all ε > 0,

P(|Xn/n − p| > ε) ≤ P(|Xn/n − p| ≥ ε) → 0.

That is, Xn/n converges to p in probability.
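The convergence in probability can be illustrated by simulation; a sketch with the illustrative value p = 0.3 and a fixed seed, where Xn is generated as a binomial(n, p) count:

```python
import random

random.seed(42)
p = 0.3

# Empirical check: the sample proportion X_n / n approaches p as n grows.
for n in (100, 10_000, 1_000_000):
    x_n = sum(1 for _ in range(n) if random.random() < p)  # binomial(n, p)
    print(n, abs(x_n / n - p))
```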
Section 11.5
1. Let n be the desired sample size, let X̄n be the sample mean, and let µ be the population mean. The actuary should find the smallest n for which

P(|X̄n − µ| < 300) ≥ 0.95,

or

P(−300 < X̄n − µ < 300) = P(−300/(4050/√n) < (X̄n − µ)/(4050/√n) < 300/(4050/√n))
= P(−2√n/27 < (X̄n − µ)/(4050/√n) < 2√n/27) ≥ 0.95.

Now, by the version of the central limit theorem stated in Remark 11.3, (X̄n − µ)/(4050/√n) is approximately N(0, 1). So all we need to do is to find the smallest n for which

Φ(2√n/27) − Φ(−2√n/27) = 2Φ(2√n/27) − 1 ≥ 0.95,

or

Φ(2√n/27) ≥ 0.975.

From Table 2 of the Appendix Tables, it follows that 2√n/27 ≥ 1.96, or n ≥ 700.1316. Thus the actuary should sample 701 totaled vehicles to find the desired estimate for µ.
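The same computation in code, using Φ via the error function instead of the table (1.96 is the table value used above):

```python
from math import ceil, erf, sqrt

def phi(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# 2*sqrt(n)/27 >= 1.96  <=>  n >= (1.96 * 27 / 2)**2 = 700.1316
n = ceil((1.96 * 27 / 2) ** 2)
print(n, 2 * phi(2 * sqrt(n) / 27) - 1)
```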
Chapter 11 Solutions to Self-Quiz Problems 6
2. Let X1, X2, . . . , X7000 be the amounts of the contributions to the relief fund. We want to find x so that P(X1 + X2 + · · · + X7000 ≤ x) ≈ 0.95. By the central limit theorem,

(X1 + X2 + · · · + X7000 − 7000(220)) / (90√7000)

is approximately a standard normal random variable. Since, from Table 2 of the Appendix Tables, Φ(1.645) ≈ 0.95, we must have

(x − 7000(220)) / (90√7000) ≈ 1.645.

This gives x ≈ 1,552,386.
Chapter 11 Solutions to Self-Test Problems 7
1. Let A be the event that next month the ombudsman will receive at least one complaint
from a freshman. Let B be the event that next month he or she will receive at least
one complaint from a sophomore. Let C be the event that next month the number of
complaints by the freshmen is greater than the number of complaints by the sopho-
mores. Let X and Y be the number of complaints next month from freshmen and
sophomores, respectively. We are given that if A occurs, then X ∼ N (2, 9), and if B
occurs, then Y ∼ N (3, 16). If A and B both occur, then X − Y ∼ N (−1, 25). The
probability we are interested in is
The moment generating function of a Laplace random variable X, with probability density function f(x) = (1/2)e^{−|x|}, is

MX(t) = (1/2)∫_{−∞}^0 e^{tx} · e^x dx + (1/2)∫_0^∞ e^{tx} · e^{−x} dx
= (1/2)∫_{−∞}^0 e^{(t+1)x} dx + (1/2)∫_0^∞ e^{(t−1)x} dx
= (1/(2(t + 1)))(1 − 0) + (1/(2(t − 1)))(0 − 1) = 1/(1 − t²), |t| < 1.
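A numerical check of this integral, using a midpoint Riemann sum for the density (1/2)e^{−|x|} at the illustrative point t = 0.5, where 1/(1 − t²) = 4/3:

```python
from math import exp

def laplace_mgf_numeric(t: float, lo: float = -40.0, hi: float = 40.0,
                        steps: int = 200_000) -> float:
    """Midpoint Riemann sum of E[e^{tX}] for the density f(x) = 0.5*e^{-|x|}."""
    h = (hi - lo) / steps
    total = 0.0
    for k in range(steps):
        x = lo + (k + 0.5) * h
        total += exp(t * x) * 0.5 * exp(-abs(x))
    return total * h

approx = laplace_mgf_numeric(0.5)
print(approx, 1 / (1 - 0.5 ** 2))
```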
Now, with W = U − V,

MW(t) = MU(t)MV(−t) = (1/(1 − t)) · (1/(1 + t)) = 1/(1 − t²), |t| < 1.

Since, for |t| < 1, X and W have the same moment generating function, by Theorem 11.2, they are identically distributed. Therefore, W is a Laplace random variable.
P(Yn ≤ t) = P(X1 ≤ t, X2 ≤ t, . . . , Xn ≤ t)
= P(X1 ≤ t)P(X2 ≤ t) · · · P(Xn ≤ t) = ((t − a)/(b − a))^n.

This implies that

P(|Yn − Y| ≤ ε) = P(|Yn − b| ≤ ε) = P(−ε ≤ Yn − b ≤ ε)
= P(b − ε ≤ Yn ≤ b + ε) = P(Yn ≤ b + ε) − P(Yn ≤ b − ε)
= 1 − ((b − ε − a)/(b − a))^n.

As n → ∞, ((b − ε − a)/(b − a))^n → 0, since (b − ε − a)/(b − a) < 1. So, as n → ∞,

P(|Yn − Y| ≤ ε) → 1,

or, equivalently,

P(|Yn − Y| > ε) → 0.

That is, Yn converges to Y in probability.
Var(e^{Xi}) = 5/3 − 25/16 = 5/48.
= 1/E(Xi) = 1/(3/2) = 2/3.

Note that the expected value of a gamma random variable with parameters r and λ is r/λ.
Since e^z ≥ z^n/n! for z ≥ 0 and every n ≥ 1,

MX(t) = E(e^{tX}) = E(e^{te^Y}) ≥ E((te^Y)^n/n!) = (t^n/n!)E(e^{nY}).

By part (b) of Example 11.5, MY(t) = exp(µt + σ²t²/2). Therefore,

MX(t) ≥ (t^n/n!)MY(n) = (t^n/n!)exp(nµ + σ²n²/2)
= (1/n!)exp(n ln t)exp(nµ + σ²n²/2) = (1/n!)exp[n(ln t + µ + σ²n/2)].
Now, for any given t > 0, since t and µ are fixed, for sufficiently large n, it follows that ln t + µ + σ²n/4 > 0. Hence

ln t + µ + σ²n/2 = (ln t + µ + σ²n/4) + σ²n/4 ≥ σ²n/4,

and thus, as n → ∞,

MX(t) ≥ (1/n!)exp[n(σ²n/4)] = (1/n!)exp(σ²n²/4) → ∞.

This shows that the moment generating function of X does not exist in any neighborhood (−δ, δ), δ > 0, around 0.
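The final lower bound (1/n!)exp(σ²n²/4) indeed blows up: comparing logarithms with lgamma makes this visible without overflow (σ² = 1 is an illustrative value):

```python
from math import lgamma

sigma2 = 1.0  # illustrative value of sigma^2

# log of the lower bound (1/n!) * exp(sigma^2 * n^2 / 4):
# the quadratic term beats log(n!) ~ n*log(n), so the bound diverges.
log_bound = [sigma2 * n * n / 4 - lgamma(n + 1) for n in (10, 50, 100, 500)]
print(log_bound)
```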
7. (a) We have

Var(X̄) = Var((X1 + X2 + · · · + Xm)/m) = (1/m²)Σ_{i=1}^m Var(Xi) = m · (1/m²) · p(1 − p) = (1/m)p(1 − p).
(c) If the sizes of the n samples taken are all equal to 50, then, by part (a), Var(pi) ≤ 1/(4(50)) = 1/200. By part (b), we need to have 400σ²/n ≤ 0.04. Since σ² = Var(pi) ≤ 1/200, it suffices that 400 · 1/(200n) ≤ 0.04, or n ≥ 2/0.04 = 50.
8. Let X1, X2, . . . , Xn be independent Poisson random variables, each with parameter 1. By Theorem 11.5, X and X1 + X2 + · · · + Xn are identically distributed. Therefore, since n is large, by the central limit theorem,

P(X ≤ t) = P(X1 + X2 + · · · + Xn ≤ t)
= P((X1 + X2 + · · · + Xn − n · 1)/(1 · √n) ≤ (t − n)/√n) ≈ Φ((t − n)/√n).

Since, for a random variable U with U ∼ N(n, n),

P(U ≤ t) = P((U − n)/√n ≤ (t − n)/√n) = Φ((t − n)/√n),

X is approximately N(n, n).
Since, for a random variable T with T ∼ N(np, np(1 − p)),

P(T ≤ t) = P((T − np)/√(np(1 − p)) ≤ (t − np)/√(np(1 − p))) = Φ((t − np)/√(np(1 − p))),

W is approximately N(np, np(1 − p)).
9. Note that

MX(t) = (1/12)e^t + (3/12)e^{2t} + (2/12)e^{3t} + (1/12)e^{4t} + (4/12)e^{5t} + (1/12)e^{6t}.

Let Y be the sum of the outcomes when the die is tossed 8 times. Then, by Theorem 11.3,

MY(t) = [(1/12)e^t + (3/12)e^{2t} + (2/12)e^{3t} + (1/12)e^{4t} + (4/12)e^{5t} + (1/12)e^{6t}]^8.

The coefficient of e^{11t} in the expansion of MY(t) is the probability that the sum of the outcomes is 11. Note that the sum 11 is obtained if we obtain 7 ones and a four; 6 ones, a two, and a three; or 5 ones and 3 twos. So, in the expansion of [MX(t)]^8, e^{11t} appears in exactly three terms. Using Theorem 2.6, the multinomial expansion, those three terms give

P(Y = 11) = (8!/(7! 1!))(1/12)^7(1/12) + (8!/(6! 1! 1!))(1/12)^6(3/12)(2/12) + (8!/(5! 3!))(1/12)^5(3/12)^3
= (8 + 336 + 1512)/12^8 = 1856/12^8 ≈ 0.0000043.
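The coefficient of e^{11t}, i.e., P(sum = 11), can be double-checked by convolving the single-toss distribution eight times with exact rational arithmetic:

```python
from fractions import Fraction

# Single-toss distribution of the loaded die, read off from M_X(t).
pmf = {1: Fraction(1, 12), 2: Fraction(3, 12), 3: Fraction(2, 12),
       4: Fraction(1, 12), 5: Fraction(4, 12), 6: Fraction(1, 12)}

# Distribution of the sum of 8 independent tosses, by repeated convolution.
dist = {0: Fraction(1)}
for _ in range(8):
    new = {}
    for s, q in dist.items():
        for face, pf in pmf.items():
            new[s + face] = new.get(s + face, Fraction(0)) + q * pf
    dist = new

print(dist[11])  # probability that the sum of the outcomes is 11
```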
10. Let X1 be the time until the first earthquake. For i ≥ 2, let Xi be the time between the (i − 1)st and the ith earthquakes. The sequence {X1, X2, . . .} is an independent sequence of exponential random variables, each with parameter λ. Let N be the number of earthquakes until the one that the bridge cannot withstand; N is a geometric random variable with parameter p. Thus

P(N = i) = (1 − p)^{i−1} p, i = 1, 2, . . . .

The lifetime of the bridge is L = Σ_{k=1}^N Xk. To identify the distribution of L, we compute its moment generating function, which reduces to

ML(t) = (p/(1 − p)) Σ_{i=1}^∞ (λ(1 − p)/(λ − t))^i.
Restricting the domain of ML(t) to t < λp, we have that λ(1 − p)/(λ − t) < 1 and hence, by the geometric series theorem,

ML(t) = (p/(1 − p)) · [λ(1 − p)/(λ − t)] / [1 − λ(1 − p)/(λ − t)] = λp/(λp − t).

This shows that L, the lifetime of the bridge, is an exponential random variable with parameter λp. For example, if the average time between two earthquakes is 6 months, then, in years, 1/λ = 1/2, or λ = 2. If p = 1/200, then λp = 1/100, showing that the lifetime of the bridge is, on average, 1/(λp) = 100 years.
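The conclusion L ∼ exponential(λp) can be spot-checked by simulation, a sketch using the numbers from the example (λ = 2 per year, p = 1/200) and a fixed seed; the sample mean lifetime should be near 1/(λp) = 100 years:

```python
import random

random.seed(7)
lam, p = 2.0, 1 / 200  # quake rate (per year) and collapse probability

def bridge_lifetime() -> float:
    """Sum exponential inter-quake times until the first fatal quake."""
    t = 0.0
    while True:
        t += random.expovariate(lam)
        if random.random() < p:
            return t

mean = sum(bridge_lifetime() for _ in range(10_000)) / 10_000
print(round(mean, 1))
```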
SOLUTIONS TO SELF-QUIZZES AND SELF-TESTS
CHAPTER 12
Section 12.2
1. Call an event a “success” if it belongs to {N1(t) : t ≥ 0}, and a “failure” if it belongs to {N2(t) : t ≥ 0}. Thus, by Theorem 12.3, an event is a success with probability λ/(λ + µ) and is a failure with probability µ/(λ + µ). Clearly, the nth success occurs before the mth failure if and only if there are at least n successes among the first n + m − 1 events that occur. So the desired probability is

Σ_{i=n}^{n+m−1} P(exactly i successes in the first n + m − 1 events)
= Σ_{i=n}^{n+m−1} C(n + m − 1, i) (λ/(λ + µ))^i (µ/(λ + µ))^{n+m−1−i}.
2. Solution 1: By Theorem 12.4, the problem is equivalent to the following: 8 points are selected at random from the interval [0, 4]. What is the probability that exactly one point falls in each of the intervals (0, 1/2), (1/2, 1), . . . , (7/2, 4)? This is equivalent to placing 8 balls randomly into 8 cells; the probability that each cell is occupied is 8!/8^8 ≈ 0.0024. Another way to see this is to use the multinomial distribution: each point falls in any given one of the 8 intervals with probability (1/2)/4 = 1/8, so the probability is

(8!/(1! 1! 1! 1! 1! 1! 1! 1!)) (1/8)^8 = 8!/8^8 ≈ 0.0024.
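The occupancy probability 8!/8^8 is a one-liner to confirm:

```python
from math import factorial

# 8 balls into 8 cells, all cells occupied: 8! / 8^8.
prob = factorial(8) / 8 ** 8
print(round(prob, 4))
```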
Section 12.3
1. Clearly, the Markov chain is irreducible, aperiodic, and recurrent. Since it is finite-state, it is also positive recurrent. Let π1, π2, and π3 be the proportion of days Dr. Fish devotes to reading, gardening, and writing, respectively. Then, by Theorem 12.7, π1, π2, and π3 are obtained by solving the matrix equation

(π1)   (0.30 0.40 0.25) (π1)
(π2) = (0.25 0.10 0.40) (π2)
(π3)   (0.45 0.50 0.35) (π3)

along with π1 + π2 + π3 = 1.
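The stationary probabilities can be approximated by power iteration on this matrix equation; a sketch in pure Python, where A is the column-stochastic matrix from the displayed equation π = Aπ:

```python
# Column-stochastic matrix A from the displayed equation pi = A * pi.
A = [[0.30, 0.40, 0.25],
     [0.25, 0.10, 0.40],
     [0.45, 0.50, 0.35]]

pi = [1 / 3, 1 / 3, 1 / 3]
for _ in range(200):  # power iteration: pi <- A * pi
    pi = [sum(A[i][j] * pi[j] for j in range(3)) for i in range(3)]

print([round(x, 4) for x in pi])
```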
2. Let Y be the lifetime, in semesters, of a new computer installed in the lab. Then

Xn+1 = { Xn − 1  if Xn ≥ 1
       { Y − 1   if Xn = 0,

and, for i ≥ 1,

pij = P(Xn+1 = j | Xn = i) = { 1  if j = i − 1
                             { 0  if j ≠ i − 1.
Section 12.4
1. Let X(t) be the number of components working at time t. Clearly, {X(t) : t ≥ 0} is a continuous-time Markov chain with state space {0, 1, 2}. Let π0, π1, and π2 be the long-run proportion of time the process is in states 0, 1, and 2, respectively. The balance equations for {X(t) : t ≥ 0} are as follows:

State 0:  λπ1 = µπ0
State 1:  2λπ2 + µπ0 = µπ1 + λπ1
State 2:  µπ1 = 2λπ2

From these equations, we obtain π1 = (µ/λ)π0 and π2 = (µ²/(2λ²))π0. Using π0 + π1 + π2 = 1 yields

π0 = 2λ²/(2λ² + 2λµ + µ²).

Hence the desired probability is

1 − π0 = µ(2λ + µ)/(2λ² + 2λµ + µ²).
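Plugging illustrative rates into these formulas confirms that they satisfy the balance equations (λ = 1, µ = 2 are chosen arbitrarily):

```python
lam, mu = 1.0, 2.0  # illustrative rates

pi0 = 2 * lam ** 2 / (2 * lam ** 2 + 2 * lam * mu + mu ** 2)
pi1 = (mu / lam) * pi0
pi2 = (mu ** 2 / (2 * lam ** 2)) * pi0

# The three balance equations and normalization from the solution above.
checks = [
    abs(lam * pi1 - mu * pi0),
    abs(2 * lam * pi2 + mu * pi0 - (mu + lam) * pi1),
    abs(mu * pi1 - 2 * lam * pi2),
    abs(pi0 + pi1 + pi2 - 1),
]
print(checks)
```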
2. The balance equations are:

State 0:  αµ1π1 + µ2π2 = λπ0
State 1:  λπ0 = µ1π1
State 2:  (1 − α)µ1π1 = µ2π2
Section 12.5
1. By applying Theorem 12.9 to {Y(t) : t ≥ 0} with t1 = 0, t2 = t, y1 = 0, y2 = y, and t = s, it follows that

E[Y(s) | Y(t) = y] = 0 + (s − 0) · (y − 0)/(t − 0) = (s/t)y,

and

Var[Y(s) | Y(t) = y] = σ² · (t − s)(s − 0)/(t − 0) = σ²(t − s)(s/t).
2. Let the current price of the stock, per share, be v0. Noting that √27.04 = 5.2, we have

V(t) = v0 e^{3t + 5.2W(t)},

where {W(t) : t ≥ 0} is a standard Brownian motion. Hence W(t) ∼ N(0, t). The desired probability is calculated as follows:

P(V(2) ≥ 2v0) = P(v0 e^{6 + 5.2W(2)} ≥ 2v0)
= P(6 + 5.2W(2) ≥ ln 2) = P(W(2) ≥ −1.02)
= P((W(2) − 0)/√2 ≥ −0.72)
= P(Z ≥ −0.72) = 1 − P(Z < −0.72)
= 1 − Φ(−0.72) = 0.7642.
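The final number can be reproduced with the error function; as in the solution, (ln 2 − 6)/5.2 ≈ −1.02 is rounded and then standardized to −0.72:

```python
from math import erf, sqrt

def phi(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# Standardize W(2) ~ N(0, 2): the solution rounds (ln 2 - 6)/5.2 to -1.02
# and -1.02/sqrt(2) to -0.72, giving 1 - Phi(-0.72).
p_table = 1 - phi(-0.72)
print(round(p_table, 4))
```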
3. By Theorem 12.10,

P(U < x and T > y) = P(no zeros in (x, y)) = 1 − (2/π) arccos √(x/y).
Chapter 12 Solutions to Self-Test Problems 6
1. Let

Xn = { 1  if the nth ball is drawn by Carmela
     { 2  if the nth ball is drawn by Daniela
     { 3  if the nth ball is drawn by Lucrezia.

The process {Xn : n = 1, 2, . . .} is an irreducible, aperiodic, positive recurrent Markov chain with transition probability matrix

    (7/31 11/31 13/31)
P = (7/31 11/31 13/31)
    (7/31 11/31 13/31)

Let π1, π2, and π3 be the long-run proportion of balls drawn by Carmela, Daniela, and Lucrezia, respectively. Intuitively, it should be clear that these quantities are 7/31, 11/31, and 13/31, respectively. However, that can also be seen by solving the following matrix equation along with π1 + π2 + π3 = 1:

(π1)   (7/31  7/31  7/31 ) (π1)
(π2) = (11/31 11/31 11/31) (π2)
(π3)   (13/31 13/31 13/31) (π3)
2. Let X(t) be the population size at time t. Then {X(t) : t ≥ 0} is a birth and death process with birth rates λn = nλ, n ≥ 1, and death rates µn = nµ, n ≥ 0. For i ≥ 0, let Hi be the time, starting from i, until the population size reaches i + 1 for the first time. We are interested in Σ_{i=1}^4 E(Hi). Note that, by Lemma 12.2,

E(Hi) = 1/λi + (µi/λi)E(Hi−1), i ≥ 1.

Since E(H0) = 1/λ,

E(H1) = 1/λ + (µ/λ)(1/λ) = 1/λ + µ/λ²,
E(H2) = 1/(2λ) + (2µ/(2λ))(1/λ + µ/λ²) = 1/(2λ) + µ/λ² + µ²/λ³,
E(H3) = 1/(3λ) + (3µ/(3λ))(1/(2λ) + µ/λ² + µ²/λ³) = 1/(3λ) + µ/(2λ²) + µ²/λ³ + µ³/λ⁴,
E(H4) = 1/(4λ) + (4µ/(4λ))(1/(3λ) + µ/(2λ²) + µ²/λ³ + µ³/λ⁴) = 1/(4λ) + µ/(3λ²) + µ²/(2λ³) + µ³/λ⁴ + µ⁴/λ⁵.

Therefore, the answer is

Σ_{i=1}^4 E(Hi) = (25λ⁴ + 34λ³µ + 30λ²µ² + 24λµ³ + 12µ⁴)/(12λ⁵).
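The closed form for Σ E(Hi) can be checked against the recursion for arbitrary rates; λ = 1.3 and µ = 0.7 are illustrative values:

```python
lam, mu = 1.3, 0.7  # illustrative birth/death rates

# Recursion from Lemma 12.2 with E(H_0) = 1/lambda; note that
# (i*mu)/(i*lam) simplifies to mu/lam for every i.
e_h = 1 / lam
total = 0.0
for i in range(1, 5):
    e_h = 1 / (i * lam) + (i * mu) / (i * lam) * e_h
    total += e_h

closed = (25 * lam**4 + 34 * lam**3 * mu + 30 * lam**2 * mu**2
          + 24 * lam * mu**3 + 12 * mu**4) / (12 * lam**5)
print(total, closed)
```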
3. For n ≥ 0, let Hn be the time, starting from n, until the process enters state n + 1 for the first time. Clearly, E(H0) = 1/λ and, by Lemma 12.2,

E(Hn) = 1/λ + E(Hn−1), n ≥ 1.

Hence

E(H0) = 1/λ,
E(H1) = 1/λ + 1/λ = 2/λ,
E(H2) = 1/λ + 2/λ = 3/λ.

Continuing this process, we obtain

E(Hn) = (n + 1)/λ, n ≥ 0.

The desired quantity is

Σ_{n=i}^{j−1} E(Hn) = Σ_{n=i}^{j−1} (n + 1)/λ = (1/λ)[(i + 1) + (i + 2) + · · · + j]
= (1/λ)[(1 + 2 + · · · + j) − (1 + 2 + · · · + i)]
= (1/λ)[j(j + 1)/2 − i(i + 1)/2] = (j(j + 1) − i(i + 1))/(2λ).
4. Let DN denote the state at which the last movie Mr. Gorfin watched was not a drama, but the one before that was a drama. Define DD, ND, and NN similarly, and label the states DD, DN, ND, and NN by 0, 1, 2, and 3, respectively. Let Xn = 0 if the nth and (n − 1)st movies Mr. Gorfin watched were both dramas. Define Xn = 1, 2, and 3 similarly. Then {Xn : n = 1, 2, . . .} is a Markov chain with state space {0, 1, 2, 3} and transition probability matrix

    (7/8 1/8 0   0  )
P = (0   0   1/2 1/2)
    (1/2 1/2 0   0  )
    (0   0   1/8 7/8)

(a) If the first two movies Mr. Gorfin watched last weekend were dramas, the probability that the fourth one is a drama is p^(2)_00 + p^(2)_02, the sum of the (0, 0) and (0, 2) entries of P². Since

     (49/64 7/64 1/16 1/16 )
P² = (1/4   1/4  1/16 7/16 )
     (7/16  1/16 1/4  1/4  )
     (1/16  1/16 7/64 49/64)

the desired probability is (49/64) + (1/16) = 53/64.
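The entries of P² can be verified exactly with rational arithmetic:

```python
from fractions import Fraction as F

P = [[F(7, 8), F(1, 8), F(0), F(0)],
     [F(0), F(0), F(1, 2), F(1, 2)],
     [F(1, 2), F(1, 2), F(0), F(0)],
     [F(0), F(0), F(1, 8), F(7, 8)]]

# P squared: two-step transition probabilities.
P2 = [[sum(P[i][k] * P[k][j] for k in range(4)) for j in range(4)]
      for i in range(4)]

print(P2[0][0] + P2[0][2])  # P(drama two steps after state DD)
```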
(b) Let π0 denote the long-run probability that Mr. Gorfin watches two dramas in a row. Define π1, π2, and π3 similarly. We have that

(π0)   (7/8 0   1/2 0  ) (π0)
(π1)   (1/8 0   1/2 0  ) (π1)
(π2) = (0   1/2 0   1/8) (π2)
(π3)   (0   1/2 0   7/8) (π3)
5. Every time a customer enters the bank for service, or a customer is served and departs, we say that an “event” occurs. Let us call an “event” a success if it is an arrival, and a failure if it is a departure. Let S be the time until the next departure, and T be the time until the next arrival. By the memoryless property of exponential random variables, S and T are independent exponential random variables with parameters µ and λ. The next event is a departure with probability

P(S < T) = ∫_0^∞ P(S < T | S = x)µe^{−µx} dx = ∫_0^∞ P(x < T | S = x)µe^{−µx} dx
= ∫_0^∞ P(T > x)µe^{−µx} dx = ∫_0^∞ e^{−λx}µe^{−µx} dx = µ/(λ + µ),

and the next event is an arrival with probability

P(T < S) = 1 − µ/(λ + µ) = λ/(λ + µ).

Since arrivals and departures occur independently, “events,” as defined in this example, are Bernoulli trials with probability of success λ/(λ + µ). Considering these facts, N = i if and only if, in successive performances of independent Bernoulli trials with probability of success λ/(λ + µ), the ith success occurs on the (n + i)th trial. Hence, by the formula for the probability mass function of negative binomial random variables,

P(N = i) = { (µ/(λ + µ))^n                                       if i = 0
           { C(n + i − 1, i − 1)(λ/(λ + µ))^i (µ/(λ + µ))^n      if i ≥ 1.
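The negative binomial expression for the ith success landing exactly on trial n + i can be verified by exhaustive enumeration over short success/failure sequences; n = 2, i = 3, and success probability 0.4 are illustrative values:

```python
from itertools import product
from math import comb

p, n, i = 0.4, 2, 3  # illustrative success probability and indices
q = 1 - p

# Enumerate all success/failure sequences of length n + i and add up the
# probability of those whose i-th success lands exactly on trial n + i.
target = 0.0
for seq in product((0, 1), repeat=n + i):
    if seq[-1] == 1 and sum(seq[:-1]) == i - 1:
        s = sum(seq)
        target += p ** s * q ** (n + i - s)

formula = comb(n + i - 1, i - 1) * p ** i * q ** n
print(target, formula)
```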