Pitman Solution
Probability
(Jim Pitman)
http://www2.imm.dtu.dk/courses/02405/ 17 December 2006
1 IMM - DTU
02405 Probability 2004-2-2 BFN/bfn
Exactly three (A B C)
Question b) We introduce the event S that the chip is sold: P(S) = 0.8 + 0.2·0.1 = 0.82. The probability in question is
P(C^c|S) = P(S|C^c)P(C^c) / (P(S|C^c)P(C^c) + P(S|C)P(C)) = (0.1·0.2)/(0.1·0.2 + 1·0.8) = 0.02/0.82 = 1/41
and from the law of averaged conditional probabilities we get
P(D) = P(H)P(D|H) + P(H^c)P(D|H^c) = 0.99·0.05 + 0.01·0.8 = 0.0575
Question b) The probability in question is P(H^c ∩ D^c) = P(H^c)P(D^c|H^c) = 0.01·0.2 = 0.002, using the multiplication (chain) rule.
Question c) The probability in question is P(H ∩ D^c) = P(H)P(D^c|H) = 0.99·0.95 = 0.9405, using the multiplication (chain) rule.
Question d) The probability in question is P(H^c|D). We use Bayes' rule to interchange the conditioning:
P(H^c|D) = P(D|H^c)P(H^c) / (P(D|H^c)P(H^c) + P(D|H)P(H)) = (0.8·0.01)/(0.008 + 0.05·0.99) = 0.139
Question e) The probabilities are estimated as the percentage of a large group of people, which is indeed the frequency interpretation.
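The two computations above can be reproduced with exact rational arithmetic; the numbers (0.99, 0.05, 0.01, 0.8) are those used in the solution above:

```python
from fractions import Fraction as F

# Numbers from the solution above: P(H), P(D|H), P(H^c), P(D|H^c)
pH, pD_H = F(99, 100), F(5, 100)
pHc, pD_Hc = F(1, 100), F(8, 10)

# Law of averaged conditional probabilities
pD = pH * pD_H + pHc * pD_Hc

# Bayes' rule for P(H^c | D)
pHc_D = pD_Hc * pHc / pD
```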
P(S) = P(S|T1F)P(T1F) + P(S|T2F)P(T2F) + P(S|T3F)P(T3F) + P(S|T1F^c)P(T1F^c) + ...
The last three terms are zero. We apply the Multiplication Rule to the probabilities P(TiF), leading to
P(S) = P(S|T1F)P(F|T1)P(T1) + P(S|T2F)P(F|T2)P(T2) + P(S|T3F)P(F|T3)P(T3)
a special case of The Multiplication Rule for n Events page 56. Inserting numbers,
P(S) = (1/2)(1/3)(1/3) + (1/2)(1/2)(1/3) + (1/2)(2/3)(1/3) = 1/4
Question b) The probability in question is P(T1|S). Applying Bayes' rule page 49,
P(T1|S) = P(S|T1)P(T1)/P(S) = ((1/6)(1/3))/(1/4) = 2/9
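A quick check of the three terms and the posterior, using the fractions as reconstructed above:

```python
from fractions import Fraction as F

# Terms P(S|TiF)P(F|Ti)P(Ti) as reconstructed in the solution above
terms = [F(1, 2) * F(1, 3) * F(1, 3),
         F(1, 2) * F(1, 2) * F(1, 3),
         F(1, 2) * F(2, 3) * F(1, 3)]
pS = sum(terms)        # law of averaged conditional probabilities
post = terms[0] / pS   # Bayes' rule: P(T1|S)
```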
P (Dn ) =
i=1
i1 12
n 13
(364/365)^(n-1)
Question c) In the birthday problem we only ask for two arbitrary birthdays to coincide, while the question in this exercise is that at least one out of n - 1 persons has one particular birthday.
(use 1 = p1p2 + q1p2 + p1q2 + q1q2 to get the result in Pitman)
Question b) P(Current flows) = P(((S1 S2) S3) ∪ S4) = 1 - (1 - q1q2)q3q4 (or use inclusion/exclusion like Pitman)
implying independence. The following is a formal and lengthy argument. Define Aij as the event that the ith person is born on the jth day of the year. We have P(Aij) = 1/365, and A1,i, A2,j, A3,k, and A4,l are independent. The event B12 ∩ B23 can be expressed as
B12 ∩ B23 = ∪_{k=1}^{365} (A1,k ∩ A2,k ∩ A3,k)
and by the independence of the A's we get P(B12 ∩ B23) = 1/365².
Question b) The probability P(B13|B12 ∩ B23) = 1 ≠ P(B13), thus the events B12, B13, B23 are not independent.
Question c) Pairwise independence follows from a).
Question b) The probability in question is given by the binomial distribution, see e.g. page 81:
C(7,4)·(1/6)⁴·(5/6)³ = 35·5³/6⁷ = 0.0156
P(G2^c) = 1 - P(G2) =
The probability P(B ∩ A) = P(A|B)P(B) by the multiplication rule; thus, as a special case of Bayes' Rule page 49, we get
P(B|A) = P(B ∩ A)/P(A) = P(A|B)P(B)/P(A)
Now the probability P(A) is given by the binomial distribution page 81, as are P(B) and P(A|B) (the latter is the probability of getting 1 six in 3 rolls). Finally,
P(B|A) = P(2 sixes in 5 rolls)P(1 six in 3 rolls)/P(3 sixes in 8 rolls)
= [C(5,2)(1/6)²(5/6)³ · C(3,1)(1/6)(5/6)²] / [C(8,3)(1/6)³(5/6)⁵]
= C(5,2)C(3,1)/C(8,3)
a hypergeometric probability. The result generalizes. If we have x successes in n trials, then the probability of having y ≤ x successes in the first m ≤ n trials is
C(m,y)·C(n-m, x-y)/C(n,x)
The probabilities do not depend on p.
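The cancellation of p can be checked directly: the ratio of binomial probabilities collapses to a ratio of binomial coefficients:

```python
from math import comb
from fractions import Fraction as F

p, q = F(1, 6), F(5, 6)
num = comb(5, 2) * p**2 * q**3 * comb(3, 1) * p * q**2   # P(B) * P(A|B)
den = comb(8, 3) * p**3 * q**5                            # P(A)
ratio = num / den                                         # P(B|A)
hyper = F(comb(5, 2) * comb(3, 1), comb(8, 3))            # hypergeometric form
```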
Question a) The probability of the event is
P(B4) = C(6,2)·0.7²·0.3⁴ = 0.0595
Question b)
P(B4 | ∪_{i=2}^{8} Bi) = P(B4 ∩ (∪_{i=2}^{8} Bi))/P(∪_{i=2}^{8} Bi) = P(B4)/(1 - P(B0) - P(B1)) = 0.1363
Question c)
P(B4) = C(8,4)·0.7⁴·0.3⁴ = ((8·7·6·5)/(4·3·2·1))·0.7⁴·0.3⁴ = 0.1361
= Φ(1.05) - Φ(-1.05) = 0.8531 - (1 - 0.8531) = 0.7062
Question b) a = 210, b = 220:
P(210 to 220 successes) = Φ((220.5 - 200)/10) - Φ((209.5 - 200)/10) = Φ(2.05) - Φ(0.95) = 0.9798 - 0.8289 = 0.1509
Question c) a = 200, b = 200:
P(200 successes) = Φ((200.5 - 200)/10) - Φ((199.5 - 200)/10) = Φ(0.05) - Φ(-0.05) = 0.5199 - (1 - 0.5199) = 0.0398
Question d) a = 210, b = 210:
P(210 successes) = Φ((210.5 - 200)/10) - Φ((209.5 - 200)/10) = Φ(1.05) - Φ(0.95) = 0.8531 - 0.8289 = 0.0242
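The normal-approximation values above (mean 200, standard deviation 10, with continuity correction) can be recomputed from the error function:

```python
from math import erf, sqrt

def Phi(z):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# mu = 200, sigma = 10, continuity correction as in the solution above
b = Phi((220.5 - 200) / 10) - Phi((209.5 - 200) / 10)  # Question b)
c = Phi((200.5 - 200) / 10) - Phi((199.5 - 200) / 10)  # Question c)
```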
The value of a in the formula is 121 (more than 120). We get (120.5 - 100)/8.165 as the argument of Φ.
we prefer to use the normal approximation, which is
1 - P(less than 390 working) = 1 - Φ((390 - 1/2 - 400·0.95)/√(400·0.95·0.05)) = 1 - Φ(2.18)
Without continuity correction we get 1 - Φ(2.29) = 0.0110. The skewness correction is
(1/6)·((1 - 2·0.95)/√(400·0.95·0.05))·(2.18² - 1)·(1/√(2π))e^{-2.18²/2} = -0.0048
The skewness correction is quite significant and should be applied. Finally we approximate the probability in question with 0.0098, which is still somewhat different from the exact value of 0.0092.
Question b) P(at least k) = 1 - Φ((k - 1/2 - 400·0.95)/√(400·0.95·0.05)) ≥ 0.95
We find k = 373.
Φ((250 + 1/2 - 250)/√(2500·0.09)) - Φ((250 - 1/2 - 250)/√(2500·0.09)) = Φ(1/30) - Φ(-1/30) = 0.0266
P(k+1)/P(k) = μ/(k+1)
The ratio is strictly decreasing in k. For μ < 1 the maximum will be P(0); otherwise the probabilities increase as long as k + 1 < μ, and decrease whenever k + 1 > μ. For non-integer μ the maximum of P(k) (the mode of the distribution) is obtained for the largest integer k < μ. For integer μ we have P(μ) = P(μ - 1).
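Both mode statements can be verified on the Poisson pmf directly; μ = 3.7 and μ = 4 below are arbitrary sample values:

```python
from math import exp, factorial

def poisson_pmf(k, mu):
    return exp(-mu) * mu**k / factorial(k)

# Non-integer mu: the mode is the largest integer below mu
mode = max(range(50), key=lambda k: poisson_pmf(k, 3.7))

# Integer mu: P(mu) equals P(mu - 1)
tie = abs(poisson_pmf(4, 4.0) - poisson_pmf(3, 4.0))
```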
1+
5 3
= 0.5037
Question b) We apply the binomial distribution (sampling with replacement, page 123):
P(Exactly 4 red tickets) = C(10,4)·(20/50)⁴·(30/50)⁶ = 210·(2⁴·3⁶)/5¹⁰
p1 = C(10,1)C(40,4)/C(50,5)
(the hypergeometric distribution page 125). The probability that the second sample contains more than one bad item is calculated via the probability of the complementary event, i.e. that the second sample contains zero or one bad items, which is
p2 = C(9,0)C(36,10)/C(45,10) + C(9,1)C(36,9)/C(45,10)
The answer to the question is the product of these two probabilities, p1(1 - p2) = 0.2804.
Question b) The lot is accepted if we have no bad items in the first sample, or the event described under a):
C(10,0)C(40,5)/C(50,5) + [C(10,1)C(40,4)/C(50,5)]·[C(9,0)C(36,10)/C(45,10) + C(9,1)C(36,9)/C(45,10)] = 0.4595
P(G = g) = C(g-1, 3) p⁴ q^{g-4}, g = 4, 5, 6, 7
Question b) p⁴ ∑_{g=4}^{7} C(g-1, 3) q^{g-4}
Question c) The easiest way is first answering question d), then using 1 - binocdf(3, 7, 2/3) in MATLAB: 0.8267.
Question d) Imagine that all seven games are played. From the binomial formula,
p⁷ + 7p⁶q + 21p⁵q² + 35p⁴q³ = p⁷ + p⁶q + 6p⁶q + 6p⁵q² + 15p⁵q² + 35p⁴q³ = p⁶ + 6p⁵q + 15p⁴q² + 20p⁴q³ = p⁶ + p⁵q + 5p⁵q + 15p⁴q² + 20p⁴q³ etc.
Question e)
P(G = 4) = p⁴ + q⁴    P(G = 5) = 4pq(p³ + q³)
P(G = 6) = 10p²q²(p² + q²)    P(G = 7) = 20p³q³(p + q)
Independence for p = q = 1/2.
P(X + Y = n) = ∑_{i=0}^{n} P(X = i)P(X + Y = n|X = i) = ∑_{i=0}^{n} P(X = i)P(Y = n - i)
where the last equality is due to the independence of X and Y.
Question b) The marginal distribution of X (and of Y) is
P(X = 2) = 1/36, P(X = 3) = 1/18, P(X = 4) = 1/12, P(X = 5) = 1/9, P(X = 6) = 5/36, P(X = 7) = 1/6
We get
P(X + Y = 8) = 2(1/36 · 5/36 + 1/18 · 1/9) + 1/12 · 1/12 = 35/1296
Question b) We introduce p0 = P(X mod 3 = 0), p1 = P(X mod 3 = 1), p2 = P(X mod 3 = 2). The probability in question is
p0³ + p1³ + p2³ + 6p0p1p2
which after some manipulations can be written as
1 - 3(p0p1 + p0p2 + p1p2 - 3p0p1p2)
The expressions can be maximized/minimized using standard methods; I haven't found a more elegant solution than that.
E(X) = ∑_{x=0}^{3} x·P(X = x) = 0·(5³/6³) + 1·3·(5²/6³) + 2·3·(5/6³) + 3·(1/6³) = 108/216 = 1/2
or, realizing that X ~ binomial(3, 1/6), example 7 page 169 gives E(X) = 3·(1/6) = 1/2.
Question b) Let Y denote the number of odd numbers in three rolls; then Y ~ binomial(3, 1/2), thus E(Y) = 3·(1/2) = 3/2.
pi
i=0
P(D = 9) = P(D ≤ 9) - P(D ≤ 8) = 0.2284
Question c) To calculate the mean we need the probabilities P(D = i) for i = 3, 4, ..., 13. We get
P(D ≤ i) = C(i,3)/C(13,3)
P(D = i) = P(D ≤ i) - P(D ≤ i-1) = 3(i-1)(i-2)/(13·12·11)
E(D) = ∑_{i=3}^{13} i·3(i-1)(i-2)/(13·12·11) = (3/(13·12·11)) ∑_{i=3}^{13} i(i-1)(i-2) = 3·6006/(13·12·11) = 10.5
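The pmf above should sum to one and give the stated mean; an exact check:

```python
from fractions import Fraction as F

denom = 13 * 12 * 11
pmf = {i: F(3 * (i - 1) * (i - 2), denom) for i in range(3, 14)}
total = sum(pmf.values())
mean = sum(i * p for i, p in pmf.items())
```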
using the multiplication rule for expectation page 177, valid for independent random variables. We have also used the fact that if X1 and X2 are independent then f(X1) and g(X2) are independent too, for arbitrary functions f(·) and g(·). We now use the computational formula for the variance once more to get
Var(X1X2) = (Var(X1) + (E(X1))²)(Var(X2) + (E(X2))²) - (E(X1)E(X2))²
Now inserting the symbols of the exercise we get
Var(X1X2) = σ1²σ2² + σ1²μ2² + σ2²μ1²
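The variance-of-a-product formula can be verified exactly on two small independent distributions (the particular values below are hypothetical, chosen only for the check):

```python
from fractions import Fraction as F

# Two small independent distributions (hypothetical values for the check)
X1 = {1: F(1, 2), 2: F(1, 2)}
X2 = {0: F(1, 3), 3: F(2, 3)}

def mean(d): return sum(x * p for x, p in d.items())
def var(d):
    m = mean(d)
    return sum((x - m)**2 * p for x, p in d.items())

# Direct distribution of the product X1*X2
prod = {}
for x, px in X1.items():
    for y, py in X2.items():
        prod[x * y] = prod.get(x * y, 0) + px * py

lhs = var(prod)
rhs = var(X1) * var(X2) + var(X1) * mean(X2)**2 + var(X2) * mean(X1)**2
```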
=2+4=6
The values for V ar(N ) and E(N ) can be found p. 476 in the distribution summary.
Question b) We have three Poisson variates Xi: the total number of chocolate chips and marshmallows in cookie i. According to our assumptions, X1 follows a Poisson distribution with parameter 6, while X2 and X3 follow a Poisson distribution with parameter 9. The complementary event is the event that we get two or three cookies without chocolate chips and marshmallows:
P(X1 = 0, X2 = 0, X3 = 0) + P(X1 > 0, X2 = 0, X3 = 0) + P(X1 = 0, X2 > 0, X3 = 0) + P(X1 = 0, X2 = 0, X3 > 0)
= e⁻⁶e⁻⁹e⁻⁹ + (1 - e⁻⁶)e⁻⁹e⁻⁹ + e⁻⁶(1 - e⁻⁹)e⁻⁹ + e⁻⁶e⁻⁹(1 - e⁻⁹) ≈ 0
so we are almost certain that we will get at most one cookie without goodies.
∑_{i=0}^{n} p^i = (1 - p^{n+1})/(1 - p)
Question b) As n → ∞ we get
∑_{i=0}^{∞} p^i = 1/(1 - p)
∫₀¹ x²(1 - x)² dx = ∫₀¹ x² ∑_{i=0}^{2} C(2,i)(-x)^i dx
using the binomial formula (a + b)^n = ∑_{i=0}^{n} C(n,i) a^i b^{n-i}. Thus
∫₀¹ x²(1 - x)² dx = ∑_{i=0}^{2} C(2,i)(-1)^i [x^{i+3}/(i+3)]_{x=0}^{x=1} = 1/3 - 2/4 + 1/5 = 1/30
such that the density is f(x) = 30x²(1 - x)², 0 < x < 1.
Question b)
E(X) = ∫₀¹ x f(x) dx = ∫₀¹ x·30x²(1 - x)² dx = 30 ∑_{i=0}^{2} C(2,i)(-1)^i [x^{i+4}/(i+4)]_{x=0}^{x=1} = 1/2
which we could have stated directly due to the symmetry of f(x) around 1/2, or from page 478.
Question c) We apply the computational formula for variances as restated page 261, Var(X) = E(X²) - (E(X))², with
E(X²) = ∫₀¹ x²·30x²(1 - x)² dx = 30 ∑_{i=0}^{2} C(2,i)(-1)^i [x^{i+5}/(i+5)]_{x=0}^{x=1} = 30/105
Var(X) = 30/105 - 1/4 = 1/28 = 3·3/((3 + 3)²(3 + 3 + 1))
in agreement with the beta(3,3) variance formula.
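The polynomial integrations above can be reproduced exactly by integrating the expanded density term by term:

```python
from fractions import Fraction as F

def integral_poly(coeffs):
    # integral over [0,1] of sum_k coeffs[k] * x^k
    return sum(c * F(1, k + 1) for k, c in enumerate(coeffs))

# f(x) = 30 x^2 (1-x)^2 = 30x^2 - 60x^3 + 30x^4
f = [0, 0, 30, -60, 30]
mass = integral_poly(f)
EX = integral_poly([0] + f)        # multiply the polynomial by x
EX2 = integral_poly([0, 0] + f)    # multiply by x^2
var = EX2 - EX**2
```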
P(a ≤ X ≤ b) = ∫_a^b f(x) dx
We get
P(-1 ≤ X ≤ 2) = ∫_{-1}^{0} 1/(2(1 - x)²) dx + ∫_{0}^{2} 1/(2(1 + x)²) dx
= [1/(2(1 - x))]_{x=-1}^{x=0} + [-1/(2(1 + x))]_{x=0}^{x=2}
= (1/2 - 1/4) + (1/2 - 1/6) = 7/12
We can find c the standard way using ∫_{0.9}^{1.1} f(x) dx = 1. However, we can derive the area of the triangle directly as (1/2)·0.2·0.1·c, such that c = 100. Due to the symmetry of f(x) around 1 we have P(X < 0.925) = P(1.075 < X), with
P(X < 0.925) = ∫_{0.9}^{0.925} 100(x - 0.9) dx = 100[x²/2 - 0.9x]_{x=0.9}^{x=0.925} = 0.03125
such that P(X < 0.925) + P(X > 1.075) = 0.0625.
Question b) We define the random variable Y as the length of an item which has passed the quality inspection. The probability
P(0.95 < Y < 1.05) = P(0.95 < X < 1.05)/P(0.925 < X < 1.075) = 0.75/0.9375 = 0.8
The number of acceptable items A out of c produced is binomially distributed. We determine c such that P(A ≥ 100) ≥ 0.95. We now use the normal approximation to get
1 - Φ((100 - 0.5 - 0.8c)/(0.4√c)) ≥ 0.95
T = ∑_{i=1}^{100} Ti
We know from page 286 that T is gamma distributed. However, it is more convenient to apply the CLT (Central Limit Theorem) p. 268 to get
P(T > 11) = 1 - P(T ≤ 11) ≈ 1 - Φ((11 - 10)/(√100/10)) = 1 - Φ(1) = 0.1587
Question e) The sum of the lifetimes of two components is gamma distributed. From p. 286 (right tail probability) we get
P(T1 + T2 > 22) = e^{-0.1·22}(1 + 2.2) = 0.3546
p = 1 - e^{-λ/m}
The mean is measured in 1/m time units so we have to multiply by this fraction to get an approximate value for E(T):
E(T) ≈ (1/m)·1/(1 - e^{-λ/m}) = (1/m)/(λ/m + o(1/m)) → 1/λ
∫_t^∞ αu^{α-1} e^{-u^α} du = [-e^{-u^α}]_{u=t}^{u=∞} = e^{-t^α}
Similarly we derive f(t) from G(t) using (5) page 297:
f(t) = -dG(t)/dt = αt^{α-1} e^{-t^α}
Finally, from (6) page 297,
λ(t) = f(t)/G(t) = αt^{α-1} e^{-t^α}/e^{-t^α} = αt^{α-1}
0 < y < 1
The last equality follows from the cumulative distribution function (CDF) of a uniformly distributed random variable (page 487). The density is derived from the CDF by differentiation (page 313):
f_{U²}(y) = dF_{U²}(y)/dy = 1/(2√y), 0 < y < 1
∫₀^a (y/π)·1/(1 + y²) dy = (1/(2π)) ln(1 + a²) → ∞ as a → ∞
The integral ∫ y f_Y(y) dy has to converge absolutely for E(Y) to exist, i.e. E(Y) exists if and only if E(|Y|) exists (e.g. page 263 bottom).
a Weibull distribution. See e.g. exercise 4.3.4 page 301 and exercise 4.4.9 page 310.
Question b)
E(Y) = ∫₀^∞ 2λy² e^{-λy²} dy
We note the similarity with the variance of an unbiased (zero mean) normal variable. With σ² = 1/(2λ),
∫₀^∞ y² e^{-λy²} dy = (1/2)√(2πσ²) ∫_{-∞}^{∞} y² (1/√(2πσ²)) e^{-y²/(2σ²)} dy = (1/2)√(π/λ)·E(Z²)
where Z is normal(0, 1/(2λ)) distributed, so E(Z²) = Var(Z) = 1/(2λ). Finally we get
E(Y) = 2λ·(1/2)√(π/λ)·(1/(2λ)) = (1/2)√(π/λ) = 0.51 with λ = 3
Question c) We apply the inverse distribution function method suggested pages 320-323. Thus
U = 1 - e^{-X} ⟹ X = -ln(1 - U)
Now 1 - U and U are identically distributed, such that we can generate an exponential X with X = -ln(U). To generate a Weibull (α = 2) distributed Y we take the square root of X, thus Y = √(-ln(1 - U)).
Question b) In this case we have S = min(X1, X2) and we apply the result for the minimum of random variables page 317. The special case of two exponentials is treated in example 3 page 317:
P(S ≤ t) = 1 - e^{-(λ1+λ2)t}
Question c) From the system design we deduce S = max(min(X1, X2), min(X3, X4)), such that
P(S ≤ t) = (1 - e^{-(λ1+λ2)t})(1 - e^{-(λ3+λ4)t})
Question d) Here S = min(max(X1, X2), X3), such that
P(S ≤ t) = 1 - (1 - (1 - e^{-λ1t})(1 - e^{-λ2t}))e^{-λ3t} = 1 - e^{-(λ1+λ3)t} - e^{-(λ2+λ3)t} + e^{-(λ1+λ2+λ3)t}
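The algebraic expansion in Question d) can be confirmed numerically; the rates (0.5, 1.0, 2.0) below are arbitrary sample values, not from the exercise:

```python
from math import exp

def cdf_direct(t, l1, l2, l3):
    # S = min(max(X1, X2), X3): survival is P(max > t) * P(X3 > t)
    return 1 - (1 - (1 - exp(-l1 * t)) * (1 - exp(-l2 * t))) * exp(-l3 * t)

def cdf_expanded(t, l1, l2, l3):
    return 1 - exp(-(l1 + l3) * t) - exp(-(l2 + l3) * t) + exp(-(l1 + l2 + l3) * t)

diff = max(abs(cdf_direct(t / 10, 0.5, 1.0, 2.0) - cdf_expanded(t / 10, 0.5, 1.0, 2.0))
           for t in range(1, 50))
```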
Question b) From the boxed result at the bottom of page 327 we have that X(k) has a beta(k, n - k + 1) distribution. Substituting r = k and s = n - k + 1 we get
P(X(k) ≤ x) = ∑_{i=r}^{r+s-1} C(r+s-1, i) x^i (1 - x)^{s+r-i-1}
Question c) The beta(r, s) density is
f(x) = (1/B(r,s)) x^{r-1}(1 - x)^{s-1} = (1/B(r,s)) x^{r-1} ∑_{i=0}^{s-1} C(s-1, i)(-x)^i
Now
P(X(k) ≤ x) = ∫₀^x f(u) du = (1/B(r,s)) ∑_{i=0}^{s-1} C(s-1, i)(-1)^i ∫₀^x u^{r+i-1} du = (1/B(r,s)) ∑_{i=0}^{s-1} C(s-1, i)(-1)^i x^{r+i}/(r+i)
as was to be proved.
7/16
Question b) We see that the probability can be rewritten as in example 2 page 343 with different values. We get 1/2.
Question c)
P(Y ≥ X|Y > 0.25) = P(Y ≥ X, Y > 0.25)/P(Y > 0.25) = (P(Y ≥ X) - P(Y ≥ X, Y ≤ 0.25))/P(Y > 0.25)
= (1/2 - 1/32)/(3/4) = 5/8
With R1 and R2 independent we have the joint density from (2) page 350:
f(r1, r2) = 4r1r2
We now integrate over the set r2 < r1/2:
P(R2 < R1/2) = ∫₀¹ ∫₀^{r1/2} 4r1r2 dr2 dr1 = ∫₀¹ (r1³/2) dr1 = 1/8
We recognize this as a gamma density (1) page 286 with λ = 1 and r = 4, thus c = 1/6.
Question b) With Z = g(Y) = 4Y³ and the result page 304 we get
dg(y)/dy = 12y², Y = (Z/4)^{1/3}
f_Z(z) = (y³e^{-y}/6)/(12y²) = y e^{-y}/72 = (z/4)^{1/3} e^{-(z/4)^{1/3}}/72
E(g(Y)) = ∫ g(y) f(y) dy
where f(y) is the density of Y. Here Y is exponential(1) distributed with density f(y) = e^{-y}. We get
E(e^{-2Y}) = ∫₀^∞ e^{-2y} e^{-y} dy = 1/3
f(u, v) du dv
from the fundamental theorem of calculus.
Question d) The result follows from (2) page 350 by integration:
F(x, y) = ∫_{-∞}^x ∫_{-∞}^y f_X(u)f_Y(v) dv du = ∫_{-∞}^x f_X(u) du ∫_{-∞}^y f_Y(v) dv
Alternatively, define the indicator variables I(x, y) such that I(x, y) = 1 if X ≤ x and Y ≤ y, and 0 otherwise. Note that F(x, y) = P(I(x, y) = 1) = E(I(x, y)) and apply the last formula on page 349.
Question e) See also exercise 4.6.3 c). We find
F(x, y) = P(U(1) ≤ x, U(n) ≤ y) = P(U(n) ≤ y) - P(U(1) > x, U(n) ≤ y) = P(U(n) ≤ y) - P(x < U1 ≤ y, x < U2 ≤ y, ..., x < Un ≤ y) = y^n - (y - x)^n
We find the density as
∂²F(x, y)/∂x∂y = n(n - 1)(y - x)^{n-2}
y = g(z) = z², z = √y, dy/dz = 2z = 2√y
Inserting in the boxed formula page 304 and using the many-to-one extension,
f_Y(y) = (2/(2√y))·(1/√(2π)) e^{-y/2} = (1/√(2πy)) e^{-y/2}, 0 < y < ∞
We recognize the gamma density with scale parameter λ = 1/2 and shape parameter r = 1/2 from the distribution summary page 481. By a slight reformulation we have
f_Y(y) = ((1/2)^{1/2} y^{-1/2} e^{-y/2})/Γ(1/2)
Question b) The formula is valid for n = 1. Assuming the formula valid for odd n, the recursive formula for the gamma function page 191, Γ(r + 1) = rΓ(r), gives
Γ((n + 2)/2) = Γ(n/2 + 1) = (n/2)Γ(n/2)
and the formula follows by induction.
Question c) Obvious by a simple change of variable.
Question d) From the additivity of the gamma distribution, which we can prove directly.
Question e) From the interpretation as sums of squared normal variables.
Question f) The mean of a gamma(r, λ) distribution is r/λ, thus χ²_n has mean (n/2)/(1/2) = n. The variance of a gamma(r, λ) distribution is r/λ², thus χ²_n has variance (n/2)/(1/2)² = 2n. Skewness bla bla bla
f(t) = ∫₀^t λe^{-λu} μe^{-μ(t-u)} du = λμ e^{-μt} ∫₀^t e^{-u(λ-μ)} du = λμ(e^{-μt} - e^{-λt})/(λ - μ)
Question b) E(Z) = E(X1) + E(X2) = 1/λ + 1/μ. See e.g. page 480 for the means E(Xi) of the exponential variables.
Question c) Using the independence of X1 and X2 we have
Var(Z) = Var(X1) + Var(X2) = 1/λ² + 1/μ²
The last equality follows from e.g. page 480.
Xi = ∑_{j=1}^{ri} Wij
such that
∑_{i=1}^{n} Xi = ∑_{i=1}^{n} ∑_{j=1}^{ri} Wij
a sum of ∑_{i=1}^{n} ri independent identically distributed variables, thus (∑_{i=1}^{n} ri, λ) distributed.
where we have used the independence of X1 and X2 in the last equality. Now using the Poisson probability expression and the boxed result page 226,
P(X1 = x1|X1 + X2 = n) = [(λ1^{x1}/x1!) e^{-λ1} · (λ2^{n-x1}/(n-x1)!) e^{-λ2}] / [((λ1+λ2)^n/n!) e^{-(λ1+λ2)}]
= (n!/(x1!(n-x1)!)) λ1^{x1} λ2^{n-x1}/(λ1+λ2)^n = C(n, x1) p^{x1}(1 - p)^{n-x1}
with p = λ1/(λ1 + λ2).
Question b) Let Xi denote the number of eggs laid by insect i. The probability in question is P(X1 ≥ 90) = P(X2 ≤ 60). Now Xi ~ binomial(150, 1/2). With the normal approximation to the binomial distribution page 99 we get
P(X2 ≤ 60) ≈ Φ((60 + 1/2 - 150·(1/2))/√(150·(1/2)·(1/2))) = Φ(-2.37) = 0.0089
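The binomial-from-Poisson identity in Question a) can be checked term by term; the rates (2, 3) and n = 6 below are arbitrary sample values:

```python
from math import exp, factorial, comb

def poisson(k, lam):
    return exp(-lam) * lam**k / factorial(k)

l1, l2, n = 2.0, 3.0, 6
p = l1 / (l1 + l2)
worst = max(abs(poisson(x, l1) * poisson(n - x, l2) / poisson(n, l1 + l2)
               - comb(n, x) * p**x * (1 - p)**(n - x))
            for x in range(n + 1))
```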
P(N1 = n1, N2 = n2, ..., Nm = nm | ∑_{i=1}^{m} Ni = n) = P(N1 = n1, N2 = n2, ..., Nm = nm ∩ ∑_{i=1}^{m} Ni = n)/P(∑_{i=1}^{m} Ni = n)
Now realising that P(N1 = n1, ..., Nm = nm ∩ ∑_{i=1}^{m} Ni = n) = P(N1 = n1, ..., Nm = nm), and using the fact that N = ∑_{i=1}^{m} Ni has Poisson distribution with parameter λ = ∑_{i=1}^{m} λi, we get
P(N1 = n1, ..., Nm = nm | ∑_{i=1}^{m} Ni = n) = [∏_{i=1}^{m} (λi^{ni}/ni!) e^{-λi}] / [(λ^n/n!) e^{-λ}] = (n!/(n1!n2!⋯nm!)) ∏_{i=1}^{m} (λi/λ)^{ni}
Conversely, from
P(N1 = n1, N2 = n2, ..., Nm = nm) = P(N = n) P(N1 = n1, N2 = n2, ..., Nm = nm | ∑_{i=1}^{m} Ni = n)
we see that the Ni's are independent Poisson variables.
Var(Y) = ∑_x ∑_y (y - E(Y))² f(x, y)
We now apply the crucial idea of adding 0 in the form of E(Y|x) - E(Y|x) inside the brackets:
Var(Y) = ∑_x ∑_y (y - E(Y|x) + E(Y|x) - E(Y))² f(x, y)
The cross term vanishes since ∑_y (y - E(Y|x)) f_Y(y|x) = 0, using f(x, y) = f_Y(y|x) f(x). Now
Var(Y) = ∑_x ∑_y (y - E(Y|x))² f_Y(y|x) f(x) + ∑_x ∑_y (E(Y|x) - E(Y))² f_Y(y|x) f(x)
the inner part of the first term is Var(Y|X = x), while the inner part of the second term is constant in y. Thus
Var(Y) = ∑_x Var(Y|X = x) f(x) + ∑_x (E(Y|x) - E(Y))² f(x)
leading to the stated equation
Var(Y) = E(Var(Y|X)) + Var(E(Y|X))
an important and very useful result that is also valid for continuous and mixed distributions. Mixed distributions are distributions that are neither discrete nor continuous.
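The decomposition can be verified exactly on a small joint distribution (the joint pmf below is hypothetical, chosen only for the check):

```python
from fractions import Fraction as F

# A small joint distribution f(x, y) (hypothetical values for the check)
joint = {(0, 0): F(1, 8), (0, 1): F(3, 8), (1, 0): F(2, 8), (1, 2): F(2, 8)}

# Marginal of Y, then Var(Y) directly
ys = {}
for (x, y), p in joint.items():
    ys[y] = ys.get(y, 0) + p
EY = sum(y * p for y, p in ys.items())
VarY = sum((y - EY)**2 * p for y, p in ys.items())

# E(Var(Y|X)) + Var(E(Y|X))
fx = {}
for (x, y), p in joint.items():
    fx[x] = fx.get(x, 0) + p
total = 0
for x0, px in fx.items():
    cond = {y: p / px for (x, y), p in joint.items() if x == x0}
    Ec = sum(y * p for y, p in cond.items())
    Vc = sum((y - Ec)**2 * p for y, p in cond.items())
    total += px * Vc + px * (Ec - EY)**2
```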
Question c) Since Y given X = x is uniformly distributed, we can apply results for the uniform distribution, see e.g. the distribution summary page 477 or 487. We get
E(Y|X = x) = (1 - |x|)/2
Question d) Similarly,
Var(Y|X = x) = (1 - |x|)²/12
P(X1 = x1, X2 = x2, ..., Xn = xn) = ∏_{i=1}^{n} p^{xi}(1 - p)^{1-xi} = p^{∑_{i=1}^{n} xi}(1 - p)^{n-∑_{i=1}^{n} xi}, xi ∈ {0, 1}
f(p|X1 = x1, X2 = x2, ..., Xn = xn) = p^{∑ xi}(1 - p)^{n-∑ xi} f(p) / ∫₀¹ p^{∑ xi}(1 - p)^{n-∑ xi} f(p) dp
which depends on the xi's only through their sum. Introducing Sn = ∑_{i=1}^{n} xi we rewrite
f(p|X1 = x1, X2 = x2, ..., Xn = xn) = p^{Sn}(1 - p)^{n-Sn} f(p) / ∫₀¹ p^{Sn}(1 - p)^{n-Sn} f(p) dp
We note that if the prior density f(p) of p is a beta(r, s) distribution, then the posterior distribution is a beta(r + Sn, s + n - Sn) distribution.
1 3 1 6 1 3 1 6
X2 + X 3 / X 2 X 3 0 1 2
0 0
1 6 1 3
1 0 0
1 3 1 = 6. 1 3
Question c) X2 and X3 are independent, thus uncorrelated. The new variables Z1 = X2 + X3 and Z2 = X2 - X3 are dependent but uncorrelated:
E(Z1Z2) = E(X2²) - E(X3²) = 0 = E(Z1)E(Z2)
since X2 and X3 have the same distribution and E(Z2) = E(X2) - E(X3) = 0.
where X and Z are independent standard normal variables. Thus
X + 2Y = 2X + √3·Z
This is the sum of two independent normal variables, which itself is Normal(0, 2² + (√3)²) distributed. Thus
P(X + 2Y ≤ 3) = Φ(3/√7) = Φ(1.13) = 0.8708
P(X > kY) = P(X - kY > 0)
where we have used that X² + Y² is exponential(1/2) distributed in the last equality (pages 360, 364-366, 485).
Question d) X = ν + √3·Y is normal(ν, 3) distributed.
Question b) We have from question a) that
V1 = aV + bW = μ1 + σ11X + σ12Z
W1 = cV + dW = μ2 + σ21X + σ22Z
for some appropriate constants. We can rewrite these expressions to get
X1 = (V1 - μ1)/√(σ11² + σ12²), Y1 = (W1 - μ2)/√(σ21² + σ22²)
such that X1 and Y1 are standard normal variables. We see that with some effort we would be able to write
Y1 = ρX1 + √(1 - ρ²) Z1
and we conclude from page 454 that V1 and W1 are bivariate normal variables.
Question c) We find the parameters using standard results for mean and variance:
μ1 = E(aV + bW) = aμV + bμW
μ2 = E(cV + dW) = cμV + dμW
σ1² = a²σV² + b²σW² + 2ab·Cov(V, W)
σ2² = c²σV² + d²σW² + 2cd·Cov(V, W)
We find the covariance from
E[(a(V - μV) + b(W - μW))(c(V - μV) + d(W - μW))]
etc.
Question a) P(E2) = P(A2) + P(B2) + P(C2) + P(D2) = pa² + pb² + pc² + pd² = 0.3816
Question b) We have p(k) = P(Ek). By combinatorial considerations we can show
P(Ai1 Bi2 Ci3 Di4) = ((i1 + i2 + i3 + i4)!/(i1! i2! i3! i4!)) pa^{i1} pb^{i2} pc^{i3} pd^{i4}
with i1 + i2 + i3 + i4 = 4 in our case. We have to sum over the appropriate values of (i1, i2, i3, i4). It is doable but much more cumbersome to use basic rules. We get
p(1) = 0.0687, p(2) = 0.5973, p(3) = 0.3163, p(4) = 0.0177
p(1) = P(E1) = P(A4) + P(B4) + P(C4) + P(D4) = pa⁴ + pb⁴ + pc⁴ + pd⁴ = 0.0687
p(4) = P(E4) = P(A1 B1 C1 D1) = 24 pa pb pc pd = 0.0177
To calculate p(3) = P(E3) we use the law of averaged conditional probabilities:
p(3) = P(E3) = ∑_{i=0}^{4} P(E3|Ai)P(Ai)
To establish P(E3|A2) we argue
P(E3|A2) = P(B1C1|A2) + P(B1D1|A2) + P(C1D1|A2) = 2(pb pc + pb pd + pc pd)/(1 - pa)²
further
P(E3|A0) = P(B2C1D1|A0) + P(B1C2D1|A0) + P(B1C1D2|A0) = 12 pb pc pd (pb + pc + pd)/(1 - pa)⁴
To evaluate P(E3|A1) we use the law of averaged conditional probability once more (see Review Exercise 1.13), with
P(E3|A1B1) = (pc² + pd²)/(1 - pa - pb)²
P(E3|A1B2) = (pc + pd)/(1 - pa - pb)
P(E3|A1B3) = 0
P(replace) = ∑_{i=3}^{50} C(50, i) 0.01^i 0.99^{50-i} = 1 - ∑_{i=0}^{2} C(50, i) 0.01^i 0.99^{50-i} = 0.0138
Pitman claims this probability to be 0.0144. We evaluate the second probability using the normal approximation to the binomial distribution. Let X denote the number of packets the manufacturer has to replace. The random variable X follows a binomial distribution with n = 4000 and p = 0.0138. We can evaluate the probability using the normal approximation:
P(X > 40) = 1 - P(X ≤ 40) ≈ 1 - Φ((40 + 1/2 - 4000·0.0138)/√(4000·0.0138·0.9862)) = 1 - Φ(-14.7/7.38) = 1 - Φ(-2.00) = 0.9772
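Both steps, the exact binomial tail and the normal approximation built on it, can be recomputed:

```python
from math import comb, erf, sqrt

# Exact binomial: P(3 or more bad packets out of 50), p_bad = 0.01
p_replace = 1 - sum(comb(50, i) * 0.01**i * 0.99**(50 - i) for i in range(3))

# Normal approximation for P(X > 40), X ~ binomial(4000, p_replace)
mu = 4000 * p_replace
sd = sqrt(4000 * p_replace * (1 - p_replace))
Phi = lambda z: 0.5 * (1 + erf(z / sqrt(2)))
p_over_40 = 1 - Phi((40.5 - mu) / sd)
```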
Question e) Pitman suggests no, which is reasonable. However, the way to assess whether we can assume independence or not would be to analyze the distribution of the number of sets played in a large number of matches.
∑_{i=20}^{35} C(1000, i)(1/38)^i(37/38)^{1000-i}
Question b) The standard deviation √(1000·(1/38)·(37/38)) = 5.1 is acceptable for the normal approximation:
Φ((35 + 1/2 - 1000/38)/√(1000·(1/38)·(37/38))) - Φ((20 - 1/2 - 1000/38)/√(1000·(1/38)·(37/38))) = Φ(1.814) - Φ(-1.346) = 0.8764
P(Y ≥ X) = ∑_{x=0}^{∞} P(X = x)P(Y ≥ X|X = x) = ∑_{x=0}^{∞} P(X = x)P(Y ≥ x)
There is a convenient formula for the tail probabilities of a geometric distribution, see e.g. page 482. We need to adjust this result to the present case of a geometric distribution with range 0, 1, ... (counting only failures), such that P(Y ≥ x) = (1 - p)^x. We now insert this result and the Poisson densities to get
P(Y ≥ X) = ∑_{x=0}^{∞} (μ^x/x!) e^{-μ} (1 - p)^x = e^{-μ} ∑_{x=0}^{∞} ((1 - p)μ)^x/x! = e^{-μ} e^{(1-p)μ} = e^{-μp}
With μp = 1/2,
e^{-μp} = e^{-1/2} = 0.6065
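The closed form e^{-μp} can be checked against a truncated version of the series; μ = 1 and p = 1/2 below are assumed sample values chosen so that μp = 1/2, matching the final number above:

```python
from math import exp, factorial

mu, p = 1.0, 0.5   # assumed values with mu*p = 1/2

# Truncated series sum over the Poisson pmf times the geometric tail
s = sum(exp(-mu) * mu**x / factorial(x) * (1 - p)**x for x in range(100))
closed_form = exp(-mu * p)
```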
The joint distribution of (Y1, Y2), with rows Y1 = 0, 1, 2 and columns Y2 = 0, 1, 2:
Y1 = 0: 9/36  6/36  3/36
Y1 = 1: 6/36  4/36  2/36
Y1 = 2: 3/36  2/36  1/36
as a check we verify that the sum of all entries in the table is 1. We derive the distribution of Y1 + Y2:
Y1 + Y2 = i:       0     1     2     3     4
P(Y1 + Y2 = i):  9/36  12/36  10/36  4/36  1/36
Question b)
E(3Y1 + 2Y2) = E(3Y1) + E(2Y2) = 3E(Y1) + 2E(Y2) = 5E(Y1) = 5(0·(1/2) + 1·(1/3) + 2·(1/6)) = 10/3
The first equality is true due to the addition rule for expectations (page 181), the second equality is true due to the result for linear functions of random variables page 175 b., the third equality is true since Y1 and Y2 have the same distribution, and the fourth equality is obtained from the definition of the mean, see page 181.
Question c)
f(x) = 0 for x ≤ 3, f(x) = 1 for 4 ≤ x ≤ 5, f(x) = 2 for x = 6
or something similar.
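The convolution row and the expectation can be verified exactly, using the marginal P(Y = 0, 1, 2) = 1/2, 1/3, 1/6 implied by the table:

```python
from fractions import Fraction as F

# Distribution of Y for one die roll: 1,2,3 -> 0; 4,5 -> 1; 6 -> 2
Y = {0: F(3, 6), 1: F(2, 6), 2: F(1, 6)}

# Distribution of Y1 + Y2 by convolution
conv = {}
for a, pa in Y.items():
    for b, pb in Y.items():
        conv[a + b] = conv.get(a + b, 0) + pa * pb

EY = sum(y * p for y, p in Y.items())
E_lin = 3 * EY + 2 * EY   # E(3Y1 + 2Y2) = 5 E(Y)
```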
∏_{j=0}^{k-1}(b + jd) ∏_{j=0}^{n-k-1}(w + jd) / ∏_{j=0}^{n-1}(b + w + jd)
Question b) To obtain the distribution of Sn, the number of black balls drawn, we note that there are C(n, k) different sequences, each with the probability derived in question a), that lead to the event Sn = k:
P(Sn = k) = C(n, k) ∏_{j=0}^{k-1}(b + jd) ∏_{j=0}^{n-k-1}(w + jd) / ∏_{j=0}^{n-1}(b + w + jd)
Question d) Not independent, but exchangeable.
Question e) We approach the question by induction. We first show
P(X1 = 1) = b/(b + w)
then
P(Xn+1 = 1) = P(Xn+1 = 1|X1 = 1)P(X1 = 1) + P(Xn+1 = 1|X1 = 0)P(X1 = 0)
To proceed we note that the probability P(Xn+1 = 1|X1 = 1) is the probability P(Yn = 1) in an urn scheme starting with b + d blacks and w whites, thus P(Xn+1 = 1|X1 = 1) = P(Yn = 1) = (b + d)/(b + w + d). Correspondingly, P(Xn+1 = 1|X1 = 0) = b/(b + w + d). Finally,
P(Xn+1 = 1) = ((b + d)/(b + w + d))·(b/(b + w)) + (b/(b + w + d))·(w/(b + w)) = b/(b + w)
Question f)
P(X5 = 1|X10 = 1) = P(X10 = 1|X5 = 1)P(X5 = 1)/P(X10 = 1) = P(X10 = 1|X5 = 1)
using Bayes' rule, or from the exchangeability. From the exchangeability we also have
P(X10 = 1|X5 = 1) = P(X2 = 1|X1 = 1) = (b + d)/(b + w + d)
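The induction step in Question e) reduces to an algebraic identity; it can be checked with exact rationals (the urn contents below are hypothetical sample values):

```python
from fractions import Fraction as F

b, w, d = 3, 4, 2   # hypothetical urn contents: blacks, whites, reinforcement

# P(X_{n+1} = 1) via conditioning on X1, as in the induction step above
lhs = (F(b + d, b + w + d) * F(b, b + w)
       + F(b, b + w + d) * F(w, b + w))
```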
G_X(z) = ∑_x z^x P(X = x)
However, this is a power series in z that is absolutely convergent for |z| ≤ 1 and thus defines a C^∞ function of z for |z| < 1.
Question b) The more elegant and maybe more abstract proof is
G_{X+Y}(z) = E(z^{X+Y}) = E(z^X z^Y)
From the independence of X and Y we get (page 177)
G_{X+Y}(z) = E(z^X)E(z^Y) = G_X(z)G_Y(z)
The more crude analytic proof goes as follows:
G_{X+Y}(z) = ∑_{k=0}^{∞} z^k P(X + Y = k) = ∑_{k=0}^{∞} z^k ∑_{i=0}^{k} P(X = i, Y = k - i) = ∑_{k=0}^{∞} z^k ∑_{i=0}^{k} P(X = i)P(Y = k - i) = ∑_{i=0}^{∞} ∑_{k=i}^{∞} z^k P(X = i)P(Y = k - i)
The interchange of the sums is justified since all terms are positive. The rearrangement is a commonly used tool in analytic derivations in probability. It is quite instructive to draw a small diagram to verify the limits of the sums. We now make further rearrangements:
G_{X+Y}(z) = ∑_{i=0}^{∞} z^i P(X = i) ∑_{k=i}^{∞} z^{k-i} P(Y = k - i) = ∑_{i=0}^{∞} z^i P(X = i) ∑_{m=0}^{∞} z^m P(Y = m) = G_X(z)G_Y(z)
and for a sum Sn of n independent variables,
G_{Sn}(z) = ∏_{i=1}^{n} G_{Xi}(z)
For the Bernoulli distribution,
E(z^X) = ∑_x z^x P(X = x) = z⁰(1 - p) + z¹p = 1 - p(1 - z)
Now using the general result, for Xi with binomial distribution b(ni, p) we get
E(z^{Xi}) = (E(z^X))^{ni} = (1 - p(1 - z))^{ni}
Generalizing this result we find
E(z^{Sn}) = (1 - p(1 - z))^{∑_{i=1}^{n} ni}
i.e. the sum of independent binomially distributed random variables is itself binomially distributed, provided equality of the pi's.
Question d) The generating function of the Poisson distribution is given in exercise 3.5.19, such that
G_{Sn}(z) = ∏_{i=1}^{n} e^{-λi(1-z)} = e^{-∑_{i=1}^{n} λi(1-z)}
The result proves that the sum of independent Poisson random variables is itself Poisson.
Question e)
G_X(z) = zp/(1 - z(1 - p))
Question f)
G_{Sn}(z) = (zp/(1 - z(1 - p)))^{∑_{i=1}^{n} ri}
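Since multiplying generating functions corresponds to convolving pmfs, the result G_{X+Y} = G_X G_Y can be illustrated by comparing a polynomial product with a direct convolution; the die and biased coin below are arbitrary sample distributions:

```python
def convolve(pa, pb):
    # coefficient-wise product of two probability generating polynomials
    out = [0.0] * (len(pa) + len(pb) - 1)
    for i, a in enumerate(pa):
        for j, b in enumerate(pb):
            out[i + j] += a * b
    return out

# pmfs as coefficient lists in z: a fair die and a biased coin
die = [0] + [1 / 6] * 6        # P(X = 1..6) = 1/6
coin = [0.3, 0.7]              # P(Y = 0) = 0.3, P(Y = 1) = 0.7

sum_pmf = convolve(die, coin)  # coefficients of G_X(z) * G_Y(z)

# direct pmf of X + Y
direct = [0.0] * 8
for x in range(1, 7):
    for y, py in [(0, 0.3), (1, 0.7)]:
        direct[x + y] += (1 / 6) * py

err = max(abs(a - b) for a, b in zip(sum_pmf, direct))
```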
∫_{-∞}^{∞} f(x) dx = c ∫_{-∞}^{∞} e^{-|x|} dx = 2c = 1, so c = 1/2.
Question b) We immediately get E(X) = 0 since f(x) is symmetric around zero. The second moment E(X²) is identical to the second moment of the standard exponential, which we can find from the computational formula for the variance. We additionally have Var(X) = E(X²) since E(X) = 0:
Var(X) = E(X²) = 2·(1/2) ∫₀^∞ t² e^{-t} dt = 2
Question c)
P(X > y) = (1/2) ∫_y^∞ e^{-t} dt = (1/2) e^{-y}, y > 0
half the standard exponential survival function.
Question d) From the result in c) we are led to
P(X ≤ x) = (1/2)e^{x} for x < 0, P(X ≤ x) = 1 - (1/2)e^{-x} for 0 < x
Question b) The sum of two indpendent Poisson random variables is Poisson distributed (boxed result page 226), leading to P (Nloc (3) + Ndis (3) = 50) = ((loc + dis )3)50 ( + )3 e loc dis 50!
Question c) We now introduce the random variables $S_i^{loc}$ and $S_i^{dis}$ as the times of the $i$th local and long distance calls respectively. These random variables are Gamma distributed, according to the box at the top of page 286 or to item 4 page 289. The waiting time to the first long distance call, counted in number of calls, is geometrically distributed, so the probability in question can be expressed as

$P(X > 10) = (1 - p_{dis})^{10} = \left(\frac{\lambda_{loc}}{\lambda_{loc}+\lambda_{dis}}\right)^{10}$
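The geometric waiting time can be checked by simulation. By memorylessness, each merged call is long distance with probability $\lambda_{dis}/(\lambda_{loc}+\lambda_{dis})$ independently of the others, which can be reproduced by racing two exponential clocks. The rates below are placeholders (the exercise's actual $\lambda_{loc}, \lambda_{dis}$ are not restated in this excerpt).

```python
import random

# Check P(X > 10) = (lam_loc / (lam_loc + lam_dis))**10 by racing
# exponential clocks: the next call is long distance iff the
# long-distance clock rings first.
lam_loc, lam_dis = 2.0, 1.0   # assumed illustrative rates
random.seed(2)

def calls_until_first_dis():
    k = 0
    while True:
        k += 1
        if random.expovariate(lam_dis) < random.expovariate(lam_loc):
            return k

n = 100_000
hits = sum(calls_until_first_dis() > 10 for _ in range(n))
exact = (lam_loc / (lam_loc + lam_dis)) ** 10
print(hits / n, exact)
```

The simulated frequency agrees with the geometric tail formula within Monte Carlo error.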
IMM - DTU, 02405 Probability, 2003-11-2, BFN/bfn
a new Weibull distribution with $\alpha = 2$ and $\lambda = 1$. If we did not recognize the distribution as a Weibull we would derive the survival function of the $R_i$'s by

$P(R_i > x) = \int_{x}^{\infty} 2u\,e^{-u^2}\,du = e^{-x^2}$

We find the density using (5) page 297 or directly using E4.3.4 (i):

$f_Y(y) = 2y\,e^{-y^2}$
Question b) This is a special case of E4.4.9 a). We can re-derive the result using the change of variable formula page 304. With $Z = g(Y) = Y^2$ we get $\frac{dg(y)}{dy} = 2y$. Inserting we get

$f_Z(z) = 2y\,e^{-y^2}\,\frac{1}{2y} = e^{-z}$

an exponential(1) distribution.

Question c) We have $E(Z) = 1$ (see e.g. the mean of an exponential variable page 279, or the distribution summaries pages 477 and 480).
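The change of variable can be verified numerically: sampling $Y$ with survival function $e^{-y^2}$ by inverse-CDF sampling and squaring it should yield an exponential(1) variable with mean 1. This is an illustration only.

```python
import random, math

# If P(Y > y) = exp(-y^2), then Y = sqrt(-log U) for U uniform(0,1],
# and Z = Y^2 should be exponential(1): E(Z) = 1, P(Z > z) = exp(-z).
random.seed(3)
n = 200_000
ys = [math.sqrt(-math.log(1.0 - random.random())) for _ in range(n)]
zs = [y * y for y in ys]

mean_z = sum(zs) / n
tail = sum(z > 1.0 for z in zs) / n
print(mean_z, tail, math.exp(-1))
```

Both the mean and the tail probability at $z = 1$ agree with exponential(1) within Monte Carlo error.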
IMM - DTU, 02405 Probability, 2003-10-16, BFN/bfn
where we have used standard rules for mean and variance, see e.g. page 249, and the result page 279 for the variance of the exponential distribution.

Question b) The density $f_M(m)$ of the random variable $M$ is

$f_M(m) = \frac{1}{2}e^{-\frac{1}{2}(m-3)}, \quad m > 3$

from the stated assumptions. We can apply the box page 304 to get

$f_X(x) = \frac{f_M(m)}{\left|\frac{dx}{dm}\right|} = \frac{\frac{1}{2}e^{-\frac{1}{2}(\log(x)-3)}}{x} = \frac{e^{\frac{3}{2}}}{2\,x^{\frac{3}{2}}}, \quad x > e^3$

where $X = g(M) = e^M$. Alternatively

$F_X(x) = P(X \le x) = P(\log(X) \le \log(x)) = P(\log(X) - 3 \le \log(x) - 3) = P(Y \le \log(x) - 3) = 1 - e^{-\frac{1}{2}(\log(x)-3)}$

and taking the derivative we get

$f_X(x) = \frac{dF_X(x)}{dx} = \frac{1}{2x}\,e^{-\frac{1}{2}(\log(x)-3)} = \frac{e^{\frac{3}{2}}}{2\,x^{\frac{3}{2}}}, \quad x > e^3$
Question c) We do the calculations in terms of the random variables $Y_i = M_i - 3$, $M_i = \log(X_i)$, where $X_i$ denotes the magnitude of the $i$th earthquake. From Example 3 page 317 we know that the minimum $Z = \min(Y_1, Y_2)$ of the $Y_i$'s is exponentially distributed with mean 1. Thus

$P(\min(M_1, M_2) > 4) = P(Z > 1) = e^{-1}$
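The minimum calculation can be sketched by simulation: drawing $M_i = 3 + Y_i$ with $Y_i$ exponential of rate $\frac12$ (as in the solution), the smaller of two magnitudes exceeds 4 with probability $e^{-1}$.

```python
import random, math

# P(min(M1, M2) > 4) = P(min(Y1, Y2) > 1) = exp(-1), since the minimum
# of two independent exponential(rate 1/2) variables is exponential(1).
random.seed(4)
n = 200_000
hits = 0
for _ in range(n):
    m1 = 3 + random.expovariate(0.5)
    m2 = 3 + random.expovariate(0.5)
    if min(m1, m2) > 4:
        hits += 1
print(hits / n, math.exp(-1))
```

The empirical frequency matches $e^{-1} \approx 0.368$ within Monte Carlo error.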
IMM - DTU, 02405 Probability, 2003-11-1, BFN/bfn
$P(Y \le y) = P(Y \le y \mid U \le \tfrac{1}{2})\,P(U \le \tfrac{1}{2}) + P(Y \le y \mid \tfrac{1}{2} < U)\,P(\tfrac{1}{2} < U) = 2P(U \le y) = 2y$

for $0 \le y \le \frac{1}{2}$, so $Y$ has density $f_Y(y) = 2$ for $0 < y < \frac{1}{2}$, 0 elsewhere.
Question b) $Y$ is uniform on $(0, \frac{1}{2})$; compare with the standard uniform density $f(y) = 1$ for $0 < y < 1$, 0 elsewhere.

Question c) Using the formulas for the uniform $(a, b)$ distribution,

$E(Y) = \frac{\frac{1}{2}+0}{2} = \frac{1}{4}, \qquad Var(Y) = \frac{(\frac{1}{2}-0)^2}{12} = \frac{1}{48}$
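A simulation sketch of the moments $E(Y) = \frac14$ and $Var(Y) = \frac{1}{48}$: here $Y = \min(U, 1-U)$ is used as a concrete variable that is uniform on $(0, \frac12)$ (an assumption for illustration, since the exercise statement is not reproduced in this excerpt).

```python
import random

# For Y uniform on (0, 1/2): E(Y) = 1/4 and Var(Y) = (1/2)^2 / 12 = 1/48.
# Y = min(U, 1-U) with U standard uniform is one such variable.
random.seed(5)
n = 200_000
ys = [min(u, 1 - u) for u in (random.random() for _ in range(n))]

mean_y = sum(ys) / n
var_y = sum((y - mean_y) ** 2 for y in ys) / n
print(mean_y, var_y, 1 / 4, 1 / 48)
```

Both sample moments agree with the uniform$(0, \frac12)$ formulas within Monte Carlo error.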
IMM - DTU, 02405 Probability, 2003-11-12, BFN/bfn
where we have used that $X$ has Gamma(4, 2) density, and apply the formula for $E(g(X,Y))$ page 349.

Question b) Since $X$ and $Y$ are independent we find

$E(W_t^2) = E(X^2)\,E\!\left((e^{tY})^2\right) = E(X^2)\,E(e^{2tY})$

where $E(X^2) = Var(X) + (E(X))^2 = 5$, see e.g. page 481. Next we derive

$E(e^{2tY}) = \frac{e^{2t}-1}{2t}$

and apply the computational formula for the variance page 261:

$SD(W_t) = \sqrt{5\,\frac{e^{2t}-1}{2t} - \left(2\,\frac{e^{t}-1}{t}\right)^2}$
IMM - DTU, 02405 Probability, 2003-11-1, BFN/bfn
The joint density of $X$ and $Y$ is the product of the marginal densities since $X$ and $Y$ are independent (page 349). We calculate the denominator using the formula for the probability of a set $B$ page 349:

$P(Y \ge X^2) = \int_{-1}^{1}\int_{x^2}^{1} \frac{1}{2}\,dy\,dx = \int_{-1}^{1}\frac{1}{2}(1-x^2)\,dx = 1 - \frac{1}{3} = \frac{2}{3}$

The probability in question is then

$P\!\left(Y \ge \tfrac{1}{2} \,\middle|\, Y \ge X^2\right) = \frac{P(Y \ge X^2) - P(X^2 \le Y < \tfrac{1}{2})}{P(Y \ge X^2)}$

where

$P\!\left(X^2 \le Y < \tfrac{1}{2}\right) = \int_{-1/\sqrt{2}}^{1/\sqrt{2}}\int_{x^2}^{1/2} \frac{1}{2}\,dy\,dx = \int_{-1/\sqrt{2}}^{1/\sqrt{2}}\frac{1}{2}\left(\frac{1}{2} - x^2\right)dx = \frac{\sqrt{2}}{6}$

such that

$P\!\left(Y \ge \tfrac{1}{2} \,\middle|\, Y \ge X^2\right) = \frac{\frac{2}{3} - \frac{\sqrt{2}}{6}}{\frac{2}{3}} = 1 - \frac{\sqrt{2}}{4}$
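The conditional probability can be checked by Monte Carlo, assuming $X \sim$ uniform$(-1, 1)$ and $Y \sim$ uniform$(0, 1)$ independent (the joint density $\frac12$ used in the integrals above).

```python
import random, math

# Estimate P(Y >= 1/2 | Y >= X^2) and compare with 1 - sqrt(2)/4.
random.seed(7)
n = 400_000
num = den = 0
for _ in range(n):
    x = random.uniform(-1, 1)
    y = random.random()
    if y >= x * x:
        den += 1          # conditioning event Y >= X^2
        if y >= 0.5:
            num += 1

exact = 1 - math.sqrt(2) / 4
print(num / den, exact)
```

The conditioning event occurs with frequency close to $\frac23$, and the conditional frequency matches $1 - \frac{\sqrt2}{4} \approx 0.646$.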
IMM - DTU, 02405 Probability, 2003-10-17, BFN/bfn
the cumulative distribution function of an exponentially distributed random variable with parameter $\lambda_1 + \lambda_2$.

Question b) This question is Example 2 page 352. A slightly different handling of the integrals gives us

$P(T_1 < T_2) = \int_{0}^{\infty}\int_{t_1}^{\infty} \lambda_1 e^{-\lambda_1 t_1}\,\lambda_2 e^{-\lambda_2 t_2}\,dt_2\,dt_1 = \int_{0}^{\infty} \lambda_1 e^{-\lambda_1 t_1}\,e^{-\lambda_2 t_1}\,dt_1$

which is an application of the rule of averaged conditional probability (page 41) for a continuous density. The general result is stated page 417 as the Integral Conditioning Formula. We get

$P(T_1 < T_2) = \int_{0}^{\infty} \lambda_1 e^{-\lambda_1 t_1}\,e^{-\lambda_2 t_1}\,dt_1 = \frac{\lambda_1}{\lambda_1 + \lambda_2}$
Question c) Consider

$P(T_{min} > t \mid X_{min} = 1) = P(T_1 > t \mid T_2 > T_1) = \frac{P(T_1 > t,\, T_2 > T_1)}{P(T_2 > T_1)} = \frac{P(T_1 > t,\, T_2 > T_1)}{P(X_{min} = 1)}$

We evaluate the probability in the numerator by integrating the joint density over a proper region (page 349), similarly to Example 2 page 352:

$P(T_1 > t,\, T_2 > T_1) = \int_{t}^{\infty}\int_{t_1}^{\infty} \lambda_1 e^{-\lambda_1 t_1}\,\lambda_2 e^{-\lambda_2 t_2}\,dt_2\,dt_1 = \int_{t}^{\infty} \lambda_1 e^{-\lambda_1 t_1}\,e^{-\lambda_2 t_1}\,dt_1 = \frac{\lambda_1}{\lambda_1 + \lambda_2}\,e^{-(\lambda_1+\lambda_2)t}$
By inserting back we finally get

$P(T_{min} > t \mid X_{min} = 1) = e^{-(\lambda_1+\lambda_2)t} = P(T_{min} > t)$

such that $T_{min}$ and $X_{min}$ are independent.

Question d) We can define $X_{min} = i$ whenever $T_{min} = T_i$. Then

$P(X_{min} = i) = \frac{\lambda_i}{\lambda_1 + \cdots + \lambda_n}$

and $T_{min}$ and $X_{min}$ are independent.
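The independence of the minimum and its index can be sketched by simulation; the rates below are illustrative assumptions.

```python
import random, math

# For independent exponentials T1 ~ exp(l1), T2 ~ exp(l2):
#   P(argmin = 1) = l1 / (l1 + l2)
#   P(T_min > t | argmin = 1) = exp(-(l1 + l2) * t)   (same as unconditional)
random.seed(8)
l1, l2 = 1.0, 3.0      # illustrative rates (an assumption)
t = 0.2
n = 400_000
both = arg1 = 0
for _ in range(n):
    t1 = random.expovariate(l1)
    t2 = random.expovariate(l2)
    if t1 < t2:
        arg1 += 1
        if t1 > t:
            both += 1

cond = both / arg1
exact = math.exp(-(l1 + l2) * t)
print(cond, exact, arg1 / n, l1 / (l1 + l2))
```

The conditional tail probability given the index matches the unconditional one, as the solution claims.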
IMM - DTU, 02405 Probability, 2004-5-13, BFN/bfn
$f_X(x) = \int_{0}^{\infty} \frac{1}{y}\,e^{-x/y}\,2e^{-2y}\,dy$

a non-standard density.

Question b) Using average conditional expectation page 425 bottom we get $E(X) = E(E(X|Y)) = E(Y) = \frac{1}{2}$, noting that the roles of $X$ and $Y$ are interchanged.

Question c) Similarly

$E(XY) = E(E(XY|Y)) = E(Y\,E(X|Y)) = E(Y^2) = Var(Y) + (E(Y))^2 = \frac{1}{4} + \frac{1}{4} = \frac{1}{2}$

We have $E(X^2) = E(E(X^2|Y)) = E(2Y^2) = 1$. Thus $Var(X) = 1 - \frac{1}{4} = \frac{3}{4}$, $SD(X) = \frac{\sqrt{3}}{2}$, and $SD(Y) = \frac{1}{2}$. Finally

$Corr(X,Y) = \frac{E(XY) - E(X)E(Y)}{SD(X)\,SD(Y)} = \frac{\frac{1}{2} - \frac{1}{4}}{\frac{\sqrt{3}}{2}\cdot\frac{1}{2}} = \frac{\sqrt{3}}{3}$
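The moment calculations can be checked by simulation under the setup read off the solution's formulas: $Y$ exponential with rate 2 and, given $Y = y$, $X$ exponential with mean $y$ (an assumption, since the exercise statement is not reproduced in this excerpt).

```python
import random, math

# Check E(X) = 1/2, E(XY) = 1/2 and Corr(X, Y) = sqrt(3)/3 ~ 0.577
# for Y ~ exp(rate 2) and X | Y = y ~ exp(mean y).
random.seed(9)
n = 400_000
pairs = []
for _ in range(n):
    y = random.expovariate(2.0)
    x = random.expovariate(1.0 / y)   # rate 1/y, i.e. mean y given Y = y
    pairs.append((x, y))

mx = sum(x for x, _ in pairs) / n
my = sum(y for _, y in pairs) / n
mxy = sum(x * y for x, y in pairs) / n
vx = sum((x - mx) ** 2 for x, _ in pairs) / n
vy = sum((y - my) ** 2 for _, y in pairs) / n
corr = (mxy - mx * my) / math.sqrt(vx * vy)
print(mx, mxy, corr, math.sqrt(3) / 3)
```

All three sample quantities agree with the closed-form answers within Monte Carlo error.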