Probability and Statistics
INTRODUCTION TO
PROBABILITY
Definition 1.3 (Event). An event is any subset of the sample space S.
Example 1.1 (Rolling a die). The sample space, S, when we roll a die is
S = {1, 2, 3, 4, 5, 6}.
Example 1.2 (Tossing a coin). The sample space, when we toss
a coin is
S = {H, T }, H: Head, T : Tail.
Example 1.3 (Rolling a die and tossing a coin). The sample
space can be written in the form
S ={1, 2, 3, 4, 5, 6} × {H, T }
={(1, H), (2, H), . . . , (6, H), (1, T ), (2, T ), . . . , (6, T )}.
Example 1.4. If E is the experiment of tossing a coin until a head occurs, and we count the number of tosses required, then
S = {1, 2, 3, . . . }.
Example 1.5 (Tossing 3 coins or equivalently tossing one coin
3 times).
S ={H, T } × {H, T } × {H, T }
={HHH, HHT, HT T, HT H, T HH, T T H, T HT, T T T }.
Note that, for simplicity, we write, for example, (H, T, H) ≡ HTH. The number of prime events in S is equal to 2³ = 8.
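The Cartesian-product construction used in the last three examples can be reproduced directly; a minimal Python sketch using itertools.product:

```python
from itertools import product

# Sample space for tossing one coin 3 times (equivalently, 3 coins at once).
S = list(product("HT", repeat=3))

print(len(S))         # 2**3 = 8 prime (elementary) events
print("".join(S[0]))  # e.g. HHH
```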
Remark 1.1.
1. If an event consists of only one element, then it is called a prime (elementary) event.
2. If an event contains no elements, then it is called the impossible event.
3. If an event is the sample space itself, then it is called the certain event.
4. For example, A = {2}, B = {5} are considered prime
events in S of Example (1.1), but D = {3, 4, 7} is not an
event in this sample space since 7 is not an element in S.
From the last definitions, one can notice that the sample
space may be finite as in Examples (1.1), (1.2) and (1.3), and
may be infinite as in Example (1.4).
Definition 1.4 (Complement event). The complement Ac of an
event A consists of all elements in S which are not in A.
In general, we say that the events A1 , A2 , . . . , An in a sample
space S are mutually exclusive events if
Ai ∩ Aj = ∅, i ̸= j, i, j = 1, 2, . . . , n.
(i) A ∪ B = {1, 2, 3, 4, 5, 6} = S.
(ii) A ∩ B = ∅.
(iii) Ac ∩ C = {6}.
(iv) From (i) and (ii), we conclude that A and B represent a
partition to S.
Definition 1.7 (Probability function). Let S be a sample space,
a probability function P (.) is a set function with domain D (an
algebra of events) and counterdomain the interval [0,1] which
satisfies the following axioms:
A1 0 ≤ P (A) ≤ 1 ∀A ∈ D.
A2 P (S) = 1
A3 If A1, A2, A3, . . . is a sequence of mutually exclusive events in D, then
P (⋃_{i=1}^{∞} Ai) = ∑_{i=1}^{∞} P (Ai),
where P (A) is read the probability of the event A.
A1 , A2 and A3 are sometimes called the axioms of probability.
B = A ∪ (B − A), a union of disjoint events, then
P (B) = P (A) + P (B − A),
but since P (B − A) ≥ 0, then P (A) ≤ P (B), and this proves (iii).
By writing A as a union of two disjoint events, so
A = (A − B) ∪ (A ∩ B), then
P (A) = P (A − B) + P (A ∩ B), and hence
P (A − B) ≡ P (A ∩ B c ) = P (A) − P (A ∩ B), and this proves
(iv).
Similarly, one can prove that P (B − A) = P (B) − P (A ∩ B).
Now, by writing A ∪ B as a union of three mutually exclusive
events as follows
A ∪ B = (A − B) ∪ (A ∩ B) ∪ (B − A)
and hence
P (A ∪ B) =P (A − B) + P (B − A) + P (A ∩ B)
=P (A) − P (A ∩ B) + P (B) − P (A ∩ B) + P (A ∩ B)
=P (A) + P (B) − P (A ∩ B),
and this proves (v).
Example 1.7. Two balanced coins are tossed simultaneously. Let A = {HH}, B = {HT, TH} and C = {HH, TT, HT} be three events in the sample space S of this experiment. Find
Find
(i) P (A ∪ B) (ii) P (A ∩ C) (iii) P (C − A)
(iii) P (C − A) = P (C) − P (A ∩ C) = 0.75 − 0.25 = 0.25.
Example 1.8. Prove that if P (Ac ) = α and P (B c ) = β, then
P (A ∩ B) ≥ 1 − α − β.
Proof. Since P (Ac ) = 1 − P (A), then α = 1 − P (A) ⇒ P (A) =
1 − α.
Similarly, we can get P (B) = 1 − β, and hence
P (A ∩ B) =P (A) + P (B) − P (A ∪ B)
=1 − α + 1 − β + (−P (A ∪ B)).
Since 0 ≤ P (A ∪ B) ≤ 1 ⇒ −P (A ∪ B) ≥ −1, then
P (A ∩ B) ≥ 1 − α + 1 − β − 1 = 1 − α − β.
Example 1.9. If P (A ∩ B c ) = 0.2 and P (B c ) = 0.7. Find
P (A ∪ B).
∵ P (B c ) = 1 − P (B) ⇒ 0.7 = 1 − P (B) ⇒ P (B) = 0.3,
∵ P (A) = P (A ∩ B) + P (A ∩ B c ),
∴ P (A) − P (A ∩ B) = P (A ∩ B c ) = 0.2,
∵ P (A ∪ B) = P (A) + P (B) − P (A ∩ B),
∴ P (A ∪ B) = 0.2 + 0.3 = 0.5.
Example 1.10. A point is selected at random inside an equilat-
eral triangle whose side length is 3. Find the probability that its
distance to any corner is greater than 1.
Let A denote the set of points inside the shaded part con-
tained in the triangle, and so A consists of all points that its
distance to any corner is greater than 1, and let S denote the
set of points inside the triangle whose side length is 3.
Area of A = area of the triangle − 3 × (area of one circular sector of radius 1 and angle π/3)
= (1/2) × 3 × 3 × sin(π/3) − 3 × (1/2) × 1² × (π/3)
= (9/4)√3 − π/2.
Since P (A) = (area of A)/(area of S),
⇒ P (A) = ((9/4)√3 − π/2) / ((9/4)√3).
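As a sanity check on the geometric argument, the probability can also be estimated by Monte Carlo simulation; a sketch, where the vertex coordinates are one convenient placement of the triangle:

```python
import math
import random

random.seed(0)

# Vertices of an equilateral triangle of side 3 (one convenient placement).
A, B, C = (0.0, 0.0), (3.0, 0.0), (1.5, 3 * math.sqrt(3) / 2)

def random_point():
    # Uniform point in the triangle via reflected barycentric coordinates.
    u, v = random.random(), random.random()
    if u + v > 1:
        u, v = 1 - u, 1 - v
    return (A[0] + u * (B[0] - A[0]) + v * (C[0] - A[0]),
            A[1] + u * (B[1] - A[1]) + v * (C[1] - A[1]))

def far_from_corners(p):
    # The favorable event: distance to every corner exceeds 1.
    return all(math.dist(p, corner) > 1 for corner in (A, B, C))

n = 100_000
estimate = sum(far_from_corners(random_point()) for _ in range(n)) / n
exact = ((9 / 4) * math.sqrt(3) - math.pi / 2) / ((9 / 4) * math.sqrt(3))
print(round(exact, 3))  # 0.597; the estimate lands close to this
```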
Multiplication Rule of Probabilities
If A1, A2, . . . , An are n events in a sample space S, then
P (A1 ∩ A2 ∩ · · · ∩ An) = P (A1) P (A2 | A1) P (A3 | A1 ∩ A2) · · · P (An | A1 ∩ A2 ∩ · · · ∩ An−1).
Example 1.11. Two different digits are selected at random from
the digits 1 through 9.
(i) If the sum is even, find the probability that both numbers are
odd?
(ii) If the sum is odd, what is the probability that 2 is one of
the numbers selected?
(iii) If 2 is one of the digits selected, what is the probability that
the sum is odd?
Let's solve each part of the problem step by step. The two digits are selected without replacement, so the sample space consists of the C(9, 2) = 36 unordered pairs (equivalently, 9 × 8 = 72 ordered selections), all equally likely; the conditional probabilities below are the same either way.
(i) The sum of two digits is even exactly when both digits are odd or both are even. There are C(5, 2) = 10 pairs of odd digits (from 1, 3, 5, 7, 9) and C(4, 2) = 6 pairs of even digits (from 2, 4, 6, 8), so 16 pairs have an even sum, of which the 10 odd-odd pairs are favorable. Hence
P (both odd | sum even) = 10/16 = 5/8.
(ii) The sum is odd exactly when one digit is even and one is odd; there are 4 × 5 = 20 such pairs. Of these, the pairs containing 2 are {2, 1}, {2, 3}, {2, 5}, {2, 7}, {2, 9}, i.e. 5 pairs. Hence
P (2 is selected | sum odd) = 5/20 = 1/4.
(iii) There are 8 pairs containing the digit 2 (2 together with any one of the other 8 digits). The sum is odd when the other digit is odd, which happens in 5 of these pairs. Hence
P (sum odd | 2 is selected) = 5/8.
To summarize:
(i) P (both numbers odd | sum even) = 5/8,
(ii) P (2 is one of the numbers | sum odd) = 1/4,
(iii) P (sum odd | 2 is one of the digits) = 5/8.
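Because the sample space is tiny, the three conditional probabilities can be verified by exhaustive enumeration; a minimal sketch over the 36 unordered pairs:

```python
from fractions import Fraction
from itertools import combinations

pairs = list(combinations(range(1, 10), 2))  # 36 equally likely unordered pairs

even_sum = [p for p in pairs if sum(p) % 2 == 0]
odd_sum = [p for p in pairs if sum(p) % 2 == 1]
with_two = [p for p in pairs if 2 in p]

# Conditional probability = favorable pairs / pairs satisfying the condition.
p_i = Fraction(sum(a % 2 == 1 and b % 2 == 1 for a, b in even_sum), len(even_sum))
p_ii = Fraction(sum(2 in p for p in odd_sum), len(odd_sum))
p_iii = Fraction(sum(sum(p) % 2 == 1 for p in with_two), len(with_two))
print(p_i, p_ii, p_iii)  # 5/8 1/4 5/8
```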
Example 1.12. A box contains 8 red, 3 white and 9 blue balls.
Three balls are drawn at random without replacement from this
box. Find the probability that
(i) the three balls are red,
(ii) two balls are red and one is white,
(iii) at least one ball is white,
(iv) one from each color in the drawn balls,
(v) the drawn balls are accomplished according to the following
order (red, white, blue).
(i) Let Ri , i = 1, 2, 3, denote the event that the ith drawn ball
is red, W denote the event that the drawn ball is white and
let B denote the event that the drawn ball is blue, then
P (R1 ∩ R2 ∩ R3) = P (R1) P (R2 | R1) P (R3 | R1 ∩ R2)
= (8/20) × (7/19) × (6/18) = 14/285.
Or P (R1 ∩ R2 ∩ R3) = C_3^8 / C_3^20 = 14/285.
(ii) P (two red and one white) = C_2^8 C_1^3 / C_3^20 = (28 × 3)/1140 = 7/95.
(iii) P (there is no white ball) = C_3^17 / C_3^20, then
P (at least one white ball) = 1 − C_3^17 / C_3^20 = 1 − 680/1140 = 23/57.
(iv) P (one ball from each color) = (8 × 3 × 9)/C_3^20 = 216/1140 = 18/95.
(v) P (R ∩ W ∩ B) = (8/20) × (3/19) × (9/18) = 3/95.
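The combinatorial counts above are easy to reproduce with Python's math.comb; a short sketch:

```python
from fractions import Fraction
from math import comb

total = comb(20, 3)  # ways to draw 3 balls from 8 red + 3 white + 9 blue

p_all_red = Fraction(comb(8, 3), total)               # (i)
p_2r_1w = Fraction(comb(8, 2) * comb(3, 1), total)    # (ii)
p_any_white = 1 - Fraction(comb(17, 3), total)        # (iii)
p_one_each = Fraction(8 * 3 * 9, total)               # (iv)

print(p_all_red, p_2r_1w, p_any_white, p_one_each)  # 14/285 7/95 23/57 18/95
```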
Definition 1.9 (Independence). The two events A and B in
a sample space S are said to be independent if P (A ∩ B) =
P (A) · P (B).
One can show that if A and B are independent, then
P (A | B) = P (A). Moreover, Ac and B c are then also independent, since
P (Ac ∩ B c) = P ((A ∪ B)c) = 1 − P (A ∪ B)
=1 − [P (A) + P (B) − P (A ∩ B)]
=1 − P (A) − P (B) + P (A)P (B)
=P (Ac ) − P (B)[1 − P (A)]
=P (Ac ) − P (B)P (Ac )
=P (Ac )[1 − P (B)]
=P (Ac )P (B c ).
Theorem 1.2 (Total probability). Suppose that the events A1, A2, . . . , An form a partition of a sample space S and let B be another event in S, then
P (B) = ∑_{i=1}^{n} P (Ai) P (B | Ai).
Proof. The event B can be written, as a union of mutually exclusive events, see Fig. (1.5), in the form
B = (B ∩ A1) ∪ (B ∩ A2) ∪ · · · ∪ (B ∩ An).
∴ P (B) = P (B ∩ A1) + P (B ∩ A2) + · · · + P (B ∩ An) = ∑_{i=1}^{n} P (B ∩ Ai).
But, since P (B ∩ Ai) = P (Ai) P (B | Ai), then
P (B) = ∑_{i=1}^{n} P (Ai) P (B | Ai).
black. The answer at this question leads us to the following
theorem.
Theorem 1.3 (Bayes' formula). If the events A1, A2, . . . , An form a partition of S and B is another event in S, then
P (Ai | B) = P (Ai) P (B | Ai) / ∑_{j=1}^{n} P (Aj) P (B | Aj), i = 1, . . . , n.
Proof.
P (Ai | B) = P (Ai ∩ B)/P (B) = P (Ai) P (B | Ai)/P (B)
= P (Ai) P (B | Ai) / ∑_{j=1}^{n} P (Aj) P (B | Aj).
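Bayes' formula translates directly into code; a sketch with a hypothetical partition of three events (the priors and likelihoods below are illustrative only):

```python
def bayes(priors, likelihoods):
    """Posterior P(Ai | B) from priors P(Ai) and likelihoods P(B | Ai)."""
    joint = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joint)  # total probability P(B), by Theorem 1.2
    return [j / total for j in joint]

# Hypothetical numbers: P(Ai) = 0.5, 0.3, 0.2 and P(B | Ai) = 0.1, 0.4, 0.5.
post = bayes([0.5, 0.3, 0.2], [0.1, 0.4, 0.5])
print([round(p, 3) for p in post])  # [0.185, 0.444, 0.37]
```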
P (E | Ac) = 0.002, and hence
P (A | E) = P (A)P (E | A) / [P (A)P (E | A) + P (Ac)P (E | Ac)]
= (0.001 × 0.999) / (0.001 × 0.999 + 0.999 × 0.002)
= 0.333.
Example 1.18. A box contains 10 balls of which 3 are black
and 7 are white. The following game is played: At each trial a
ball is selected at random, its color is noted, and it is replaced
along with two additional balls of the same color. What is the
probability that a black ball is selected in each of the first three
trials?
Let Bi denote the event that a black ball is selected on the ith trial. By the multiplication rule,
P (B1 ∩ B2 ∩ B3) = P (B1)P (B2 | B1)P (B3 | B1 ∩ B2) = (3/10) × (5/12) × (7/14) = 1/16.
Example 1.20. Three boxes contain balls. The first contains 10 white and 5 black balls, the second contains 7 white and 8 black balls and the third contains 5 white and 10 black balls. One box is chosen at random and then two balls are drawn, without replacement, from this box; they turn out to have the same color. What is the probability that they come from the second box?
Let Ai denote the event of drawing box i, i = 1, 2, 3, and B
be the event of drawing two balls of the same color. Since the
box is chosen at random, then
P (A1 ) = P (A2 ) = P (A3 ) = 1/3, and hence
P (B) = P (A1)P (B | A1) + P (A2)P (B | A2) + P (A3)P (B | A3)
= (1/3)[(C_2^10 + C_2^5)/C_2^15] + (1/3)[(C_2^7 + C_2^8)/C_2^15] + (1/3)[(C_2^5 + C_2^10)/C_2^15]
= (1/3)[(10 × 9 + 5 × 4)/(15 × 14)] + (1/3)[(7 × 6 + 8 × 7)/(15 × 14)] + (1/3)[(5 × 4 + 10 × 9)/(15 × 14)]
= 0.504.
∴ P (A2 | B) = P (A2)P (B | A2)/P (B) = 0.156/0.504 = 0.31.
Example 1.21. In a multiple-choice test, assume that there are five choices available for each question. Let p be the probability that a student knows the answer and q = 1 − p the probability that the student guesses. Assume also that the probability that the student gets the right answer given that he guesses is 0.2. If the student got the right answer, what is the probability that the student indeed knew the right answer?
Let A denote the event that the student got the right answer and B denote the event that the student knew the right answer. Using Bayes' formula, we have
P (B | A) = P (A | B)P (B) / [P (A | B)P (B) + P (A | B c)P (B c)] = (1 × p)/(1 × p + 0.2 × q).
Exercises I
In the following, suppose that A, B and C are events in a
sample space S.
1. If P (A) = 0.2, P (B) = 0.4, P (A∪B) = 0.5. Determine
(i) P (A ∩ B), (ii) P (Ac ∩ B), (iii) P (A ∩ B c ),
(iv)P (Ac ∩ B c ), (v) P (A | B), (vi) P (B c | Ac ).
2. Show that
P (A | B) − P (A | B c) = [P (A ∩ B) − P (A)P (B)] / [P (B)P (B c)].
10. A bond issue for the construction of a new public library is
before the voters. A poll showed that 85% of those with a
college education favored the construction of a new library,
but only 20% of those not having a college education did
so. Suppose that 90% of the voting population do not have
a college education. What is the probability that a voter
selected at random who favors the bond issue will be one
with a college education? [.32]
11. A certain cancer diagnostic test is 95% accurate on those
that do have cancer, and 90% accurate on those that do not
have cancer. If 1/2% of the population actually does have
cancer, compute the probability that a particular individual
has cancer if the test finds that he has cancer.
12. In Polya’s urn scheme, an urn initially contains r red balls
and b black balls. At each trial a ball is selected at random,
its color is noted and it is replaced along with c additional
balls of the same color. What is the probability that one
obtains a red ball in each of first three trials?
13. Three machines A, B and C produce respectively 60%, 30%
and 10% of the total number of items of a factory. The
percentages of defective output of these machines are respec-
tively 2%, 3% and 4%. An item is selected at random and
is found defective. Find the probability that the item was
produced by machine A or B? [21/25]
Chapter 2
RANDOM VARIABLES,
PROBABILITY FUNCTIONS
AND EXPECTATIONS
a random variable.
i.e. X : S → R; X(s) = x.
RX = {x : X(s) = x; s ∈ S}.
then
S ={HH, HT, T H, T T }
={s1 , s2 , s3 , s4 }
0, Y (s4 ) ≡ Y (T T ) = 2, then
Similarly,
P (X = 0) = P {s ∈ S : X(s) = 0} = P {T T } = 1/4.
P (Y = 2) = P {s ∈ S : Y (s) = 2} = P {HH, T T } = 1/2.
(i) p(x) ≥ 0 ∀ x ∈ RX,
(ii) ∑_{x} p(x) = 1.
{0, 1, 2} and RY = {0, 2} and the functions P (X = x) ≡ p(x)
satisfied.
satisfied
(i) f (x) ≥ 0, ∀x ∈ RX,
(ii) ∫_{−∞}^{∞} f (x) dx = 1.
Remark 2.2. Condition (i), given above, means that the curve
of f (x) lies entirely above the x-axis, but condition (ii) means
equal 1.
Remark 2.3. The relation between the probability of an event
ability functions
(i) f (x) = 1 − |1 − x|, 0 < x < 2; 0, otherwise.
(ii) f (x) = (1/(σ√(2π))) exp{−(1/2)((x − µ)/σ)²}, −∞ < x < ∞, (−∞ < µ < ∞, σ > 0).
(iii) f (x) = e^{−2} 2^x/x!, x = 0, 1, 2, . . .; 0, otherwise.
For (i), note that
|1 − x| = 1 − x, x < 1; 0, x = 1; x − 1, x > 1.
Then we can write f (x) in the form
f (x) = x, 0 < x < 1; 2 − x, 1 ≤ x < 2.
Using the substitution y = z²/2, so that dz = (2y)^{−1/2} dy, and hence
I = (2/√(2π)) ∫_0^∞ (2y)^{−1/2} e^{−y} dy = (1/√π) ∫_0^∞ y^{−1/2} e^{−y} dy = Γ(1/2)/√π = 1,
where we have used
∫_0^∞ x^{n−1} exp(−x/β) dx = Γ(n) β^n and Γ(1/2) = √π.
(iii) We know that ∑_{n=0}^{∞} x^n/n! = e^x, then
∑_{x=0}^{∞} e^{−2} 2^x/x! = e^{−2} ∑_{x=0}^{∞} 2^x/x! = e^{−2} · e² = 1.
Definition 2.5 (Variance). Let X be a random variable having
of g(X) is defined by
E[g(X)] = ∫_{−∞}^{∞} g(x)f (x) dx, X : Continuous; ∑_i g(xi)p(xi), X : Discrete.
Special Cases
X, i.e. the first moment of X about the origin is itself the
expected value of X.
E[g(X)] = E[(X − µX)^r] = µr
= ∫_{−∞}^{∞} (x − µX)^r f (x) dx, X : Cont.; ∑_i (xi − µX)^r p(xi), X : Disc.,
variance of X.
the values of the random variable X are centered, but the vari-
of gravity.
Definition 2.7 (Standard deviation). The standard deviation
√
of a random variable X, denoted by σX , is defined as + V [X].
V [c1 g1 (X1 )+c2 g2 (X2 )] = c21 V [g1 (X1 )]+c22 V [g2 (X2 )]; X1 and
X2 are independent.
Proof.
= E[X 2 ] − (E[X])2 .
i.e.
γ1 = µ3/σ³.
The quantity (mean − median)/(standard deviation) provides an alternative measure of skewness.
its center.
A man selected 3 items from this box, find the expected number of
x 0 1 2
The random variable here has the hypergeometric distribution
The reader may be left (ii) until he knows the binomial dis-
one toss occurs if a head occurs on the first toss, two tosses occur if the first is a tail and the second is a head, and three tosses occur if the first two are tails and the third is a head. Four tosses occur if either TTTH or TTTT occurs. So
p(1) = P {H} = 1/2, p(2) = P {TH} = 1/4,
p(3) = P {TTH} = 1/8,
p(4) = P {TTTH} + P {TTTT} = 1/16 + 1/16 = 1/8,
then
E = 1 × (1/2) + 2 × (1/4) + 3 × (1/8) + 4 × (1/8) = 15/8.
Sometimes we denote the probability mass function p(x) by
f (x).
Exercises II
3. (a) Find the value of the constant C for the following func-
(b) If X has the pdf given in (a), find the cdf, F (x), and
then calculate
i. P (X > 3).
to be probability functions
−8 −4
(b) f (x) = C ; x = 0, 1, 2, 3, 4, 5, 6.
x 6−x
(c) f (x) = C e^x/[2(1 + e^x)²], −∞ < x < ∞.
(d) f (x) = 1/(π√(1 − x²)), |x| < 1; 0, otherwise.
7. Let X be a random variable having pdf f (x) = (x + 2)/18, −2 < x < 4, zero elsewhere. Find E[X], E[(X + 2)³], and E[6X − 2(X + 2)³].
defective one is obtained. Find the expected number of
10. A fair coin is tossed until a head appears, Let X denote the
follows
µr = E[(X − αβ)^r] = ∫_0^∞ (x − αβ)^r [x^{α−1} e^{−x/β}/(Γ(α)β^α)] dx.
(a) Prove that µ_{r+1} = β² [α r µ_{r−1} + dµr/dβ], r = 1, 2, . . .
(b) Use the fact that µ0 = 1, µ1 = 0 and the differen-
Chapter 3
SOME IMPORTANT
DISCRETE DISTRIBUTIONS
vidual trial, then this random experiment is called a binomial
probability q = 1 − p.
2. The experiment is repeated n independent trials.
One can now notice that the probability given by the binomial
placement.
Proof.
E[X] = ∑_{x=0}^{n} x p(x)
= ∑_{x=0}^{n} x [n!/(x!(n − x)!)] p^x q^{n−x}, q = 1 − p
= ∑_{x=0}^{n} [n(n − 1)!/(x(x − 1)!(n − x)!)] x p p^{x−1} q^{n−x}
= np ∑_{x=1}^{n} [(n − 1)!/((x − 1)!(n − x)!)] p^{x−1} q^{n−x}
= np(p + q)^{n−1} = np.
E[X²] = ∑_{x=0}^{n} x² p(x)
= ∑_{x=0}^{n} [x(x − 1) + x] p(x)
= ∑_{x=0}^{n} x(x − 1) p(x) + E[X]
= ∑_{x=2}^{n} x(x − 1) [n(n − 1)(n − 2)!/(x(x − 1)(x − 2)!(n − x)!)] p² p^{x−2} q^{n−x} + np
= n(n − 1)p² ∑_{x=2}^{n} [(n − 2)!/((x − 2)!(n − x)!)] p^{x−2} q^{n−x} + np
= n(n − 1)p² + np,
and hence
V [X] = E[X²] − (E[X])² = n(n − 1)p² + np − n²p² = np(1 − p) = npq.
Now,
mX(t) = E[e^{tX}] = ∑_{x=0}^{n} e^{tx} p(x)
= ∑_{x=0}^{n} e^{tx} C_x^n p^x q^{n−x}
= ∑_{x=0}^{n} C_x^n (pe^t)^x q^{n−x}
= (pe^t + q)^n.
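The closed forms E[X] = np and V[X] = npq just derived can be checked numerically from the pmf; a sketch for the b(15, 0.4) case used in the next example:

```python
from math import comb, exp

n, p = 15, 0.4
q = 1 - p
pmf = [comb(n, x) * p**x * q**(n - x) for x in range(n + 1)]

mean = sum(x * f for x, f in enumerate(pmf))
var = sum(x * x * f for x, f in enumerate(pmf)) - mean**2

def mgf(t):
    # Closed form derived above: m_X(t) = (p e^t + q)^n.
    return (p * exp(t) + q) ** n

print(round(mean, 6), round(var, 6), round(mgf(0.0), 6))  # 6.0 3.6 1.0
```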
Example 3.1. The probability that a patient recovers from a
Let X ∼ b(15, 0.4) denote the number of people that survive.
Then
(a)
P (X ≥ 10) = 1 − P (X < 10) = 1 − ∑_{x=0}^{9} C_x^15 (0.4)^x (0.6)^{15−x} = 1 − 0.9662 = 0.0338.
(b)
P (3 ≤ X ≤ 8) = ∑_{x=3}^{8} C_x^15 (0.4)^x (0.6)^{15−x}.
(c)
Let X ∼ b(5, 0.25) denote the number of escaping pheasants; then
P (X ≥ 3) = ∑_{x=3}^{5} C_x^5 (0.25)^x (0.75)^{5−x}.
P (µ − 2σ < X < µ + 2σ) = ∑_{x=1}^{5} C_x^9 (1/3)^x (2/3)^{9−x}.
The moment generating function ((2/3) + (1/3)e^t)^9 is the moment generating function of the binomial distribution with parameters n = 9 and p = 1/3, then µ ≡ E[X] = np = 3, σ ≡ √V [X] = √(npq) = √2, and hence
P (µ − 2σ < X < µ + 2σ) = P (3 − 2√2 < X < 3 + 2√2).
If x = r is the only mode, then it must satisfy P (X = r + 1) < P (X = r) and P (X = r − 1) < P (X = r).
It follows that (n − r)p < (r + 1)q.
Similarly,
C_{r−1}^n p^{r−1} q^{n−r+1} / (C_r^n p^r q^{n−r}) < 1 ⇒ rq/((n − r + 1)p) < 1 ⇒ r(1 − p) < (n − r + 1)p.
It follows that
1) = 5/9. Find P (Y ≥ 1)
3.2 Poisson Distribution
this route in a certain week? What is the probability of occurring
two weeks.
(i)
P (X = 0) = e^{−5} 5^0/0! = e^{−5},
(ii)
P (X ≤ 4) = ∑_{x=0}^{4} e^{−5} 5^x/x! = 0.4405.
For two weeks the parameter becomes λ = 10, and hence
P (X > 2) = 1 − P (X ≤ 2) = 1 − ∑_{x=0}^{2} e^{−10} 10^x/x! = 0.9972.
Suppose that X is a random variable having the Poisson distribution with parameter λ = 220/200 = 1.1, representing the number of misprints in a given page; then
(i) P (X = 0) = e^{−1.1} (1.1)^0/0! = e^{−1.1},
(ii)
P (X ≥ 2) = 1 − P (X < 2) = 1 − [P (X = 0) + P (X = 1)]
= 1 − (e^{−1.1} + e^{−1.1} (1.1)^1/1!) = 0.301.
Theorem 3.2. If the random variable X has the Poisson distribution with parameter λ, then E[X] = λ, V [X] = λ and
m(t) = e^{−λ(1−e^t)}.
Proof.
E[X] = ∑_{x=0}^{∞} x p(x) = ∑_{x=0}^{∞} x e^{−λ} λ^x/x!
= ∑_{x=1}^{∞} x e^{−λ} λ λ^{x−1}/(x(x − 1)!)
= λ ∑_{x=1}^{∞} e^{−λ} λ^{x−1}/(x − 1)! = λ.
E[X²] = ∑_{x=0}^{∞} x² p(x)
= ∑_{x=0}^{∞} [x(x − 1) + x] e^{−λ} λ^x/x!
= ∑_{x=0}^{∞} x(x − 1) e^{−λ} λ^x/x! + E[X]
= ∑_{x=2}^{∞} x(x − 1) e^{−λ} λ² λ^{x−2}/(x(x − 1)(x − 2)!) + λ
= λ² ∑_{x=2}^{∞} e^{−λ} λ^{x−2}/(x − 2)! + λ
= λ² + λ.
Since V [X] = E[X²] − (E[X])², then
V [X] = λ² + λ − λ² = λ.
m(t) = ∑_{x=0}^{∞} e^{tx} p(x) = ∑_{x=0}^{∞} e^{tx} e^{−λ} λ^x/x! = e^{−λ} ∑_{x=0}^{∞} (λe^t)^x/x!
= e^{−λ} e^{λe^t} = e^{−λ(1−e^t)}.
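The identities E[X] = V[X] = λ can be confirmed numerically by truncating the series; a sketch using λ = 1.1, as in the misprints example:

```python
from math import exp, factorial

lam = 1.1

def pmf(x):
    # Poisson pmf p(x) = e^{-lam} lam^x / x!.
    return exp(-lam) * lam**x / factorial(x)

# Truncate the infinite sums; the tail beyond x = 60 is negligible here.
mean = sum(x * pmf(x) for x in range(60))
var = sum(x * x * pmf(x) for x in range(60)) - mean**2

p_at_least_2 = 1 - (pmf(0) + pmf(1))
print(round(mean, 6), round(var, 6), round(p_at_least_2, 3))  # 1.1 1.1 0.301
```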
to zero. Hence if these two conditions hold, the Poisson dis-
without replacement.
P (X = x) ≡ p(x) = C_x^{N1} C_{n−x}^{N−N1} / C_n^N, x = 0, 1, . . . , n; 0, otherwise,
where N is a positive integer, N1 is a nonnegative integer with N1 ≤ N, and n ≤ N. Moreover,
E[X] = n (N1/N) and V [X] = n (N1/N) ((N − N1)/N) ((N − n)/(N − 1)).
(N − n)
ance of the hypergeometric distribution is times the
(N − 1)
variance of the binomial distribution.
A man selected 3 items from this box. Find the distribution and the expected number of defective items.
P (X = x) = C_x^2 C_{3−x}^6 / C_3^8, x = 0, 1, 2; 0, otherwise.
We know before that E[X] = n N1/N = (3 × 2)/8 = 3/4.
N 8 4
If the draw occurred with replacement, then X would have the binomial distribution with n = 3, p = N1/N = 2/8, and hence the distribution of X takes the form
P (X = x) = C_x^3 (2/8)^x (6/8)^{3−x}, x = 0, 1, 2, 3; 0, otherwise.
In this case, E[X] = np = 3 × (2/8) = 3/4.
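The agreement of the two means (3/4 in both the with- and without-replacement cases) can be reproduced with a short computation:

```python
from fractions import Fraction
from math import comb

N, N1, n = 8, 2, 3  # 8 items, 2 defective, sample of 3 (as in the example)

# Hypergeometric pmf and its mean.
hyper = [Fraction(comb(N1, x) * comb(N - N1, n - x), comb(N, n)) for x in range(3)]
mean_hyper = sum(x * p for x, p in enumerate(hyper))

# Binomial pmf with p = N1/N and its mean.
p = Fraction(N1, N)
binom = [comb(n, x) * p**x * (1 - p) ** (n - x) for x in range(n + 1)]
mean_binom = sum(x * q for x, q in enumerate(binom))

print(mean_hyper, mean_binom)  # 3/4 3/4
```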
8 4
Number (i) of this example was solved before in Example
(2.4), but (ii) was left to be solved after studying the binomial
distribution.
tribution with p = N1 /N
Remark 3.4. The last theorem tells us that for very large N
Example 3.9. A large box contains 150 white mice and 35 gray
Let X ∼ H(5, 150, 185) denote the number of white mice
on white telephones is
Example 3.11. Two dice are thrown 100 times and the number
P (X ≥ 3) = 1 − P (X ≤ 2)
= 1 − ∑_{x=0}^{2} C_x^100 (1/9)^x (8/9)^{100−x}
= 0.9993.
Exercises III
deviation is 3.
deviation is 2.
deviation is 4.
what is P [X = 1 or 2]?
7. Find the mode of Poisson distribution with parameter λ.
these questions.
can be approximated by a Poisson model. What is the
bacteria?[1 − 5e−4 ]
Chapter 4
SOME IMPORTANT
CONTINUOUS
DISTRIBUTIONS
f (x) = 1/(b − a), −∞ < a ≤ x ≤ b < ∞; 0, otherwise.
Theorem 4.1. If X is uniformly distributed over [a, b], then
E[X] = (a + b)/2, V [X] = (b − a)²/12 and mX(t) = (e^{bt} − e^{at})/((b − a)t).
Proof.
E[X] = ∫_{−∞}^{∞} x f (x) dx = ∫_a^b [x/(b − a)] dx = [x²/(2(b − a))]_a^b
= (b² − a²)/(2(b − a)) = (b + a)/2.
E[X²] = ∫_a^b [x²/(b − a)] dx = (b³ − a³)/(3(b − a)) = (b² + ab + a²)/3,
V [X] = E[X²] − (E[X])²
= (b² + ab + a²)/3 − (a + b)²/4
= (4b² + 4ab + 4a² − 3a² − 6ab − 3b²)/12
= (b² − 2ab + a²)/12 = (b − a)²/12.
mX(t) = E[e^{tX}] = ∫_a^b [e^{tx}/(b − a)] dx = [e^{tx}/((b − a)t)]_a^b = (e^{bt} − e^{at})/((b − a)t).
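A quick numerical check of the uniform moments, using a midpoint Riemann sum over hypothetical endpoints a = −1, b = 3:

```python
# Hypothetical endpoints a < b for a numerical check of Theorem 4.1.
a, b = -1.0, 3.0
n = 100_000
dx = (b - a) / n

# Midpoint Riemann sums for E[X] and E[X^2] under f(x) = 1/(b - a).
mean = sum((a + (k + 0.5) * dx) * dx / (b - a) for k in range(n))
ex2 = sum((a + (k + 0.5) * dx) ** 2 * dx / (b - a) for k in range(n))
var = ex2 - mean ** 2

print(round(mean, 3), round(var, 3))  # 1.0 1.333, i.e. (a+b)/2 and (b-a)^2/12
```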
Example 4.1. Suppose X is a continuous random variable with
P [X < 0]?
then
E[X] = (a + b)/2 = 1, (4.1)
V [X] = (b − a)²/12 = 4/3. (4.2)
From (4.1) we can write a = 2 − b and then by substituting in
The first choice for a and b is rejected since a < b, thus the
σ, if its density function is given by
f (x) = (1/(σ√(2π))) exp[−(1/2)((x − µ)/σ)²], −∞ < x < ∞, (4.3)
If the random variable X is normally distributed with mean µ
3. The normal curve approaches the horizontal x-axes as x →
±∞
4. The total area under the curve and above the x-axes is
equal to 1.
variable Z with mean zero and variance one, using the transformation
Z = (X − µ)/σ,
where E[Z] = (µ − µ)/σ = 0 and V [Z] = V [X − µ]/σ² = σ²/σ² = 1.
E[X] = µ, V [X] = σ 2 .
Proof.
I ≡ E[X] = (1/(σ√(2π))) ∫_{−∞}^{∞} x exp[−(1/2)((x − µ)/σ)²] dx.
Using the transformation z = (x − µ)/σ ⇒ x = σz + µ ⇒ dx = σ dz, then
I = (1/(σ√(2π))) ∫_{−∞}^{∞} (σz + µ) exp[−z²/2] σ dz
= σ ∫_{−∞}^{∞} z (e^{−z²/2}/√(2π)) dz + µ ∫_{−∞}^{∞} (e^{−z²/2}/√(2π)) dz
= µ,
since ∫_{−∞}^{∞} z e^{−z²/2} dz = 0 (the integrand is odd), and ∫_{−∞}^{∞} (e^{−z²/2}/√(2π)) dz = 1 (Z ∼ N (0, 1)).
(i)
P (X < 15) = P ((X − 18)/2.5 < (15 − 18)/2.5)
= P (Z < −1.2)
= Φ(−1.2) = 0.1151,
(ii)
P (X < k) = 0.2578 ⇒ P (Z < (k − 18)/2.5) = 0.2578
⇒ (k − 18)/2.5 = −0.65 ⇒ k = 16.375,
(iii)
P (17 < X < 21) = P ((17 − 18)/2.5 < Z < (21 − 18)/2.5)
= P (−0.4 < Z < 1.2)
= Φ(1.2) − Φ(−0.4),
(iv)
P (X > k) = 0.1539 ⇒ P (Z > (k − 18)/2.5) = 0.1539
⇔ P (Z < (18 − k)/2.5) = 0.1539
⇒ (18 − k)/2.5 = −1.02
⇒ k = 20.55.
Example 4.3. If a set of grades on a statistic examination are
the lowest 10% of the students are given F s, (b) the highest B
P (88 < X < 94) = P (1.2 < Z < 2.4) = Φ(2.4) − Φ(1.2) =
4.2.1 Normal approximation to the binomial
If X ∼ b(n, p), then the distribution of
Z = (X − np)/√(npq),
as n → ∞, is the standardized normal distribution N (0, 1).
With the continuity correction, binomial probabilities are approximated according to the following:
1. P (X = c) = P (c − 0.5 ≤ X ≤ c + 0.5)
= P ((c − 0.5 − np)/√(npq) ≤ Z ≤ (c + 0.5 − np)/√(npq)),
2. P (a ≤ X ≤ b) = P ((a − 0.5 − np)/√(npq) ≤ Z ≤ (b + 0.5 − np)/√(npq)),
3. P (a < X < b), P (a < X ≤ b) and P (a ≤ X < b) should first be rewritten in the form (2).
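The continuity-correction recipe can be packaged as a small helper; Φ is computed here from math.erf via the standard identity Φ(z) = (1 + erf(z/√2))/2:

```python
from math import erf, sqrt

def phi(z):
    # Standard normal CDF via the error function identity.
    return 0.5 * (1 + erf(z / sqrt(2)))

def binom_range_approx(a, b, n, p):
    # Normal approximation of P(a <= X <= b) with the continuity correction.
    mu, sigma = n * p, sqrt(n * p * (1 - p))
    return phi((b + 0.5 - mu) / sigma) - phi((a - 0.5 - mu) / sigma)

# P(X >= 75) for X ~ b(100, 0.85), as in Example 4.5 below.
print(round(binom_range_approx(75, 100, 100, 0.85), 4))  # 0.9984
```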
Example 4.5. A drug manufacturer claims that a certain drug
cured, what is the probability that the claim will be accepted when
then
P (X ≥ 75) = ∑_{x=75}^{100} C_x^100 (0.85)^x (0.15)^{100−x},
but we notice that the number of individuals is large, so it is convenient to use the normal approximation; therefore
P (X ≥ 75) = P (Z ≥ (75 − 0.5 − (100)(0.85))/√((100)(0.85)(0.15)))
= 0.9984.
unaccepted. Find also the mean and standard deviation of the
accepted pills.
pills, then
P (X ≥ 2) = 1 − [P (X = 1) + P (X = 0)].
The mean and standard deviation of the accepted pills are equal, respectively, to n(1 − p) = 200(0.95) = 190 and √(200(0.95)(0.05)) = 3.08.
if its density function is given by
f (x) = (1/(Γ(n)β^n)) x^{n−1} e^{−x/β}, x ≥ 0, (n, β > 0); 0, otherwise, (4.4)
where Γ(·) is the gamma function.
If X has the gamma distribution with parameters n and β, then E[X] = nβ and V [X] = nβ².
Special Cases
1. Exponential distribution.
If in (4.4) n = 1, then
f (x) = (1/β) e^{−x/β}, x ≥ 0, (β > 0); 0, otherwise,
2. Chi-square distribution
m, denoted by χ2 (m)
Exercises IV
[.4516]
(c) α = 0.01.
9. Show that the graph of a pdf N (µ, σ²) has points of inflection at x = µ − σ and x = µ + σ.
(a) µ_{2r+2} = σ² µ_{2r} + σ³ dµ_{2r}/dσ.
(b) µ′_{r+2} = 2µ µ′_{r+1} + (σ² − µ²) µ′_r + σ³ dµ′_r/dσ.
14. The total time, T , taken to complete a certain job is a
gamma random variable with pdf f (t) = (β^α/Γ(α)) t^{α−1} e^{−βt}; t ≥
0, zero elsewhere, where α = 4 and β = 1 hours. What
[.265]
Chapter 5
SAMPLING THEORY
lated by sampling.
5.1 Population and Samples
way that every possible sample of size n has the same probability
2. µ′_r = (1/n) ∑_{i=1}^{n} X_i^r [rth sample moment about 0].
3. µr = (1/n) ∑_{i=1}^{n} (Xi − X̄)^r [rth sample moment about X̄].
Definition 5.4 (Sample variance). If X1, X2, . . . , Xn represent a random sample of size n, then
S² = ∑_{i=1}^{n} (Xi − X̄)²/(n − 1)
is defined to be the sample variance.
sample.
the statistic.
of the statistic.
1. When σ is known.
Let X1, X2, . . . , Xn be a random sample of size n drawn from a population with mean µ and variance σ², then
σ²_X̄ = (σ²/n) × (N − n)/(N − 1), if the population is finite of size N and the sampling is without replacement;
σ²_X̄ = σ²/n, if the population is infinite or the sampling is with replacement,
and
Z = (X̄ − µ)/(σ/√n) ∼ N (0, 1).
2. When σ is unknown.
Z = (X̄ − µ)/(S/√n) ∼ N (0, 1).
T = (X̄ − µ)/(S/√n) ∼ t_ν,
of Means
variance σ12 , and the second with mean µ2 and variance σ22 . Let
n2 drawn from the second population such that the values of X̄1
are independent of the values of X̄2, then
µ_{X̄1±X̄2} = µ1 ± µ2 and σ²_{X̄1±X̄2} = σ1²/n1 + σ2²/n2.
If independent samples of sizes n1 and n2 are drawn from two large populations with means µ1 and µ2, respectively, then
Z = ((X̄1 − X̄2) − (µ1 − µ2)) / √(σ1²/n1 + σ2²/n2) ∼ N (0, 1).
Example 5.2. If the uric acid values in normal adult males are
µ = 5.7, σ = 1, n=9
(a)
P (X̄ > 6) = P ((X̄ − µ)/(σ/√n) > (6 − µ)/(σ/√n))
= P (Z > 0.9) = 1 − P (Z ≤ 0.9)
= 1 − 0.8159 = 0.1841.
(b)
the above population. Find the mean and variance of the sam-
pling distribution?
4!
The number of drawn samples is equal to C24 = = 6.
2! · 2!
The possible samples and their means are
(1, 3) → 2, (1, 5) → 3, (1, 7) → 4, (3, 5) → 4, (3, 7) → 5, (5, 7) → 6,
so the sampling distribution of X̄ is
x̄i : 2 3 4 5 6
fi : 1 1 2 1 1
with ∑ fi = 6, ∑ x̄i fi = 24 and ∑ x̄i² fi = 106. Hence
E[X̄] = ∑ x̄i fi / ∑ fi = 24/6 = 4,
σ²_X̄ = ∑ x̄i² fi / ∑ fi − (∑ x̄i fi / ∑ fi)² = 106/6 − 16 = 5/3.
We can note that
µ = (1 + 3 + 5 + 7)/4 = 4 = µ_X̄,
and
(σ²/n) × (N − n)/(N − 1) = (5/2) × (2/3) = 5/3 = σ²_X̄.
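The enumeration in this example can be automated; a sketch that rebuilds the sampling distribution of X̄ for the population {1, 3, 5, 7}:

```python
from fractions import Fraction
from itertools import combinations

population = [1, 3, 5, 7]
N, n = len(population), 2

# All C(4, 2) = 6 equally likely samples drawn without replacement.
samples = list(combinations(population, n))
means = [Fraction(sum(s), n) for s in samples]

mu_xbar = sum(means) / len(means)
var_xbar = sum((m - mu_xbar) ** 2 for m in means) / len(means)

mu = Fraction(sum(population), N)
sigma2 = sum((x - mu) ** 2 for x in population) / N

print(mu_xbar, var_xbar)                    # 4 5/3
print(sigma2 / n * Fraction(N - n, N - 1))  # 5/3, the finite-population formula
```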
and that for a second type of client the average home visit is 30
randomly visits 35 clients from the first and 40 for the second
group, what is the probability that the average length of home
µ1 = 45, µ2 = 30, σ1² = 15, σ2² = 20, n1 = 35, n2 = 40.
We don’t know here whether the two populations are normal or
not. But, since n1 > 30 and n2 > 30, then the difference between
= 1 − P (Z < 1.23)
= 1 − 0.8907 = 0.1093.
is the probability that a sample mean X̄ will fall in the interval
P (µX̄ − 1.9σX̄ < X̄ < µX̄ − 0.4σX̄ ) = P (−1.9 < Z < −0.4)
= 0.3446 − 0.0287
= 0.3159.
Sampling distribution of the variance (S²)
χ² = (n − 1)s²/σ².
The distribution of X² = (n − 1)S²/σ² is referred to as the chi-square distribution with ν = n − 1 degrees of freedom.
Example 5.6. Find the probability that a random sample of size
(a)
P (S² > 9.1) = P ((n − 1)S²/σ² > (n − 1)(9.1)/σ²)
= P (X² > (24 × 9.1)/6)
= P (X² > 36.4) = 0.05.
(b)
P (3.462 < S² < 10.745) = P ((24 × 3.462)/6 < X² < (24 × 10.745)/6)
= P (13.848 < X² < 42.98)
portion
can say that, in this example, p = 0.08. If a random sample
from which the sample was drawn, the true proportion who feel
this sample?
n = 75, p = 0.55.
P (P̂ < 35/75) = P ((P̂ − p)/√(p(1 − p)/n) < ((35/75) − p)/√(p(1 − p)/n))
= P (Z < ((35/75) − 0.55)/√(0.55 × 0.45/75))
Exercises V
1. If each observation in a sample is multiplied by k, show that
replacement.
2
(ii) Verify that µX̄ = µ and σX̄ = σ 2 /n
(iii) Between what two values would you expect the middle
(i) The expected mean and standard deviation of the sam-
69 inclusive.
and 10.745.
Chapter 6
ESTIMATIONS
tion.
6.1 Methods of Estimation
ters.
∵ X̄ = (X1 + X2 + · · · + Xn)/n
∴ E[X̄] = E[X1/n] + E[X2/n] + · · · + E[Xn/n] = µ/n + µ/n + · · · + µ/n = µ.
Also, V [X̄] = E[(X̄ − µ)²] = σ²/n² + σ²/n² + · · · + σ²/n² = n(σ²/n²) = σ²/n.
Now,
∑_{i=1}^{n} (Xi − µ)² = ∑_{i=1}^{n} [(Xi − X̄) + (X̄ − µ)]²
= ∑_{i=1}^{n} (Xi − X̄)² + 2(X̄ − µ) ∑_{i=1}^{n} (Xi − X̄) + n(X̄ − µ)²
= (n − 1) [∑_{i=1}^{n} (Xi − X̄)²/(n − 1)] + 0 + n(X̄ − µ)²
= (n − 1)S² + n(X̄ − µ)².
Taking expectations of both sides,
nσ² = (n − 1)E[S²] + nE[(X̄ − µ)²] = (n − 1)E[S²] + σ²,
so that
(n − 1)σ² = (n − 1)E[S²]
∴ E[S²] = σ²,
6.2 Confidence Intervals
of the interval.
ter θ such that the distribution of this quantity does not depend
P (Q1 ≤ Q(Θ̂, θ) ≤ Q2 ) = 1 − α.
then
Q1 ≤ Q(Θ̂, θ) ≤ Q2 ⇐⇒ T1 ≤ θ ≤ T2 .
where T1 and T2 are called the lower and upper limits, respec-
6.2.1 Confidence interval for the population mean (µ)
[σ known]
Z = (X̄ − µ)/(σ/√n) ≡ Q(X̄, µ),
P (−z_{α/2} < Z < z_{α/2}) = P (−z_{α/2} < (X̄ − µ)/(σ/√n) < z_{α/2}) = 1 − α
⇒ P (−z_{α/2} σ/√n < X̄ − µ < z_{α/2} σ/√n) = 1 − α
⇒ P (X̄ − z_{α/2} σ/√n < µ < X̄ + z_{α/2} σ/√n) = 1 − α.
Theorem 6.1. A (1 − α)100% confidence interval for µ, based on a sample of size n with mean X̄ from a population with known variance σ², is given by
X̄ − z_{α/2} σ/√n < µ < X̄ + z_{α/2} σ/√n, (6.1)
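Interval (6.1) is a one-liner in code; a sketch with illustrative values x̄ = 90, σ = 10, n = 49 and z = 1.645 (the tabulated z_{α/2} for 90% confidence):

```python
from math import sqrt

def z_confidence_interval(xbar, sigma, n, z):
    # (1 - alpha)100% CI for mu with sigma known, per (6.1);
    # z is the tabulated critical value z_{alpha/2}.
    half = z * sigma / sqrt(n)
    return xbar - half, xbar + half

lo, hi = z_confidence_interval(90, 10, 49, 1.645)
print(round(lo, 2), round(hi, 2))  # 87.65 92.35
```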
6.2.2 Confidence interval for the population mean (µ)
[σ unknown]
from a normal population with variance 100, find 90%, 95% and
X̄ = 90, σ² = 100, n = 49.
A (1 − α)100% confidence interval for µ is
X̄ − z_{α/2} σ/√n < µ < X̄ + z_{α/2} σ/√n.
For 90% confidence, z_{0.05} = 1.645, so
90 − 1.645 × (10/7) < µ < 90 + 1.645 × (10/7).
Now, the 95% and 99% confidence intervals for µ are given, respectively, by
90 − 1.96 × (10/7) < µ < 90 + 1.96 × (10/7),
90 − 2.6 × (10/7) < µ < 90 + 2.6 × (10/7).
σ 2 = 9, e=1
sample with mean Ȳ and variance S22 taken from a normal pop-
ulation with mean µ2 and variance σ22 , then we have the following
cases:
2. Confidence interval for µ1 −µ2 when σ12 and σ22 are unknown.
60) and we cannot assume that σ1² = σ2², then
(X̄ − Ȳ) − z_{α/2} √(S1²/n1 + S2²/n2) < µ1 − µ2 < (X̄ − Ȳ) + z_{α/2} √(S1²/n1 + S2²/n2)
is an approximate (1 − α)100% confidence interval for µ1 − µ2.
known).
then given by
Exercises VI
Chapter 7
TESTS OF HYPOTHESES
hypothesis, denoted by H1 .
of two tails, one in left corresponds to θ < θ0 and the other one
significance.
7.1 Tests Concerning Means, Variances and Proportions

The steps for testing a hypothesis concerning a parameter θ may be summarized as follows:

1. State the null hypothesis H0: θ = θ0.
2. Choose an appropriate alternative hypothesis H1: θ < θ0, θ > θ0 or θ ≠ θ0.
3. Choose a level of significance α, usually 0.05 or 0.01.
4. Select the appropriate test statistic and establish the critical region.
5. Compute the value of the test statistic from a random sample of size n.
6. Conclusion: reject H0 if the computed value falls in the critical region; otherwise, accept H0.
Test the hypothesis that µ = 20 against the alternative that µ ≠ 20, for a random sample of size n = 50 with mean x̄ = 19.8 from a population with standard deviation σ = 0.5, at the 0.01 level of significance.

1. H0: µ = 20.
2. H1: µ ≠ 20.
3. α = 0.01.
4. The test statistic is z = (x̄ − µ)/(σ/√n), so the critical region is z < −zα/2 or z > zα/2, i.e. |z| > z0.005 = 2.575.
5. Computation: x̄ = 19.8, n = 50,
   zc = (x̄ − µ)/(σ/√n) = (19.8 − 20)/(0.5/√50) = −2.828.
6. Conclusion: Reject H0, since zc < −2.575, and conclude that the population mean is not 20.
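The six steps above can be checked in a few lines of Python (a sketch, not part of the text):

```python
from statistics import NormalDist

# Two-tailed z-test: H0: mu = 20 vs H1: mu != 20, sigma known.
mu0, xbar, sigma, n, alpha = 20.0, 19.8, 0.5, 50, 0.01

z_crit = NormalDist().inv_cdf(1 - alpha / 2)   # z_{0.005}, about 2.576
zc = (xbar - mu0) / (sigma / n ** 0.5)
reject = abs(zc) > z_crit
print(round(zc, 3), reject)                    # -2.828 True
```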
Example. Test the hypothesis that the average content of containers is 10 ounces if the contents of a random sample of 10 containers are …, 9.8, 9.9, 10.4, 10.3 and 9.8 ounces. Use a 0.01 level of significance and assume that the distribution of contents is normal.

Solution. Here n = 10, µ = 10,
x̄ = (1/10) ∑_{i=1}^{10} xi = 100.6/10 = 10.06,

s² = (1/(10 × 9)) [10 ∑_{i=1}^{10} xi² − (∑_{i=1}^{10} xi)²] ⟹ s = 0.245.
1. H0: µ = 10.
2. H1: µ ≠ 10.
3. α = 0.01.
4. The test statistic is t = (x̄ − µ)/(s/√n), so the critical region is t < −t(α/2,9) or t > t(α/2,9), i.e. |t| > t(0.005,9) = 3.250.
5. Computation: tc = (10.06 − 10)/(0.245/√10) = 0.774.
6. Conclusion: Accept H0, since −3.250 < tc < 3.250; the average content may well be 10 ounces.
1. H0: µ = 50.
2. H1: µ < 50.
3. α = 0.05.
4. The test statistic is t = (x̄ − µ)/(s/√n), so the critical region is t < −t(α,n−1) = −t(0.05,11) = −1.796.
5. Computation: tc = (x̄ − µ)/(s/√n) = (42 − 50)/(11.9/√12) = −2.33.
6. Conclusion: Reject H0, since tc < −1.796, and conclude that µ < 50.
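A quick numerical check of this one-sided t-test; since the standard library has no t quantiles, the critical value t(0.05,11) = 1.796 is taken from the tables, as in the text:

```python
import math

# One-sided t-test: H0: mu = 50 vs H1: mu < 50.
mu0, xbar, s, n = 50.0, 42.0, 11.9, 12
t_crit = 1.796                          # t_{0.05, 11}, from tables
tc = (xbar - mu0) / (s / math.sqrt(n))
reject = tc < -t_crit
print(round(tc, 2), reject)             # -2.33 True
```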
1. H0: σ² = 2500.
2. H1: σ² ≠ 2500.
3. α = 0.05.
4. The test statistic to be used is χ² = (n − 1)S²/σ², with ν = n − 1 = 14 degrees of freedom.
The critical region is χ² < χ²(0.975,14) = 5.629 or χ² > χ²(0.025,14) = 26.119.
5. Computations:
χ²c = (n − 1)s²/σ² = (14 × 1225)/2500 = 6.86.
6. Conclusion: Accept H0, since 5.629 < χ²c < 26.119.
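The computation can be verified directly (critical values from the chi-square table, as in the text):

```python
# Chi-square test for a variance: H0: sigma^2 = 2500 vs H1: sigma^2 != 2500,
# with n = 15 and s^2 = 1225.
n, s_sq, sigma0_sq = 15, 1225.0, 2500.0
chi2_c = (n - 1) * s_sq / sigma0_sq
accept = 5.629 < chi2_c < 26.119   # chi^2_{0.975,14} and chi^2_{0.025,14}
print(chi2_c, accept)              # 6.86 True
```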
1. H0: µ1 − µ2 = 0.
2. H1: µ1 − µ2 ≠ 0.
3. α = 0.05.
4. The test statistic is t = [(X̄ − Ȳ) − 0]/(Sp √(1/n1 + 1/n2)), with ν = n1 + n2 − 2 = 35 degrees of freedom; the critical region is t < −t(0.025,35) = −2.0301 or t > 2.0301.
5. Computations:
tc = [(120 − 96) − 0]/(38.08 √(1/15 + 1/22)) = 24/12.75 = 1.88.
6. Conclusion: Accept H0, since −2.0301 < tc < 2.0301.
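A sketch of the pooled two-sample t computation, using the figures of this example (t(0.025,35) = 2.0301 is the tabulated value):

```python
import math

# Pooled two-sample t-test: H0: mu1 - mu2 = 0 vs two-sided H1.
xbar, ybar, sp, n1, n2 = 120.0, 96.0, 38.08, 15, 22
tc = (xbar - ybar) / (sp * math.sqrt(1 / n1 + 1 / n2))
accept = -2.0301 < tc < 2.0301     # t_{0.025, 35} from tables
print(round(tc, 2), accept)        # 1.88 True
```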
1. H0: p = 0.9, i.e. the claim is correct.
2. H1: p < 0.9.
3. α = 0.01.
4. The test statistic to be used is Z = (P̂ − p)/√(p(1 − p)/n).
The critical region is Z < −zα ⟹ Z < −2.33.
5. Computations:
zc = (0.8 − 0.9)/√((0.9)(0.1)/200) = −4.71.
6. Conclusion: Reject H0, since zc lies in the critical (rejection) region.
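Checking the computation numerically:

```python
import math

# One-sample proportion z-test: H0: p = 0.9 vs H1: p < 0.9,
# with phat = 0.8 and n = 200; critical value -z_{0.01} = -2.33.
p0, phat, n = 0.9, 0.8, 200
zc = (phat - p0) / math.sqrt(p0 * (1 - p0) / n)
reject = zc < -2.33
print(round(zc, 2), reject)   # -4.71 True
```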
When testing H0: p1 = p2, a pooled estimate of the common proportion p = p1 = p2 is used in computing σ̂(P̂1−P̂2); here

p̂1 = 78/100, p̂2 = 90/100, p̄ = (90 + 78)/(100 + 100) = 0.84.

1. H0: p2 − p1 ≤ 0.
2. H1: p2 − p1 > 0.
3. α = 0.05.
4. The test statistic is Z = (P̂2 − P̂1)/√(p̄(1 − p̄)(1/n1 + 1/n2)); the critical region is Z > z0.05 = 1.645.
5. Computations:
zc = (0.90 − 0.78)/√(0.84 × 0.16 × (1/100 + 1/100)) = 2.31.
6. Conclusion: Reject H0, since zc > 1.645. So, these data suggest that the new treatment is more effective.
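The two-proportion computation can be sketched as:

```python
import math

# Two-proportion z-test: H0: p2 - p1 <= 0 vs H1: p2 - p1 > 0,
# using the pooled estimate pbar = (90 + 78) / 200 = 0.84.
p1_hat, p2_hat, n1, n2 = 78 / 100, 90 / 100, 100, 100
pbar = (78 + 90) / (n1 + n2)
zc = (p2_hat - p1_hat) / math.sqrt(pbar * (1 - pbar) * (1 / n1 + 1 / n2))
reject = zc > 1.645                  # z_{0.05}
print(round(zc, 2), reject)          # 2.31 True
```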
A test of the agreement between the observed and the expected frequencies is based on the quantity

χ² = ∑_{i=1}^{k} (oi − ei)²/ei,

where oi and ei are the observed and expected frequencies of the i-th class. The critical region falls in the right tail of the chi-square distribution with ν = k − 1 degrees of freedom.

Remark 7.1.
If any expected frequency ei is less than 5, it is best to combine classes so that each ei is somewhat larger than 5; this keeps the chi-square approximation reliable.

Example. The grades of 100 students are distributed as follows:

Grade  A   B   C   D   F
f      14  18  32  20  16

If we test the hypothesis that the grades are uniformly distributed over the n = 100 students, then the expected value for each grade is ei = np = 100 × (1/5) = 20, where p = 1/5:

Grade  A   B   C   D   F   Total
oi     14  18  32  20  16  100
ei     20  20  20  20  20  100

1. H0: The distribution of grades is uniform.
2. H1: The distribution of grades is not uniform.
3. α = 0.05.
4. The test statistic χ² = ∑_{i=1}^{5} (oi − ei)²/ei is used; it has a chi-square distribution with ν = 5 − 1 = 4 degrees of freedom.
The critical region is χ² > χ²(0.05,4) = 9.488.
5. Computations:
χ²c = ∑_{i=1}^{5} (oi − ei)²/ei = (36 + 4 + 144 + 0 + 16)/20 = 200/20 = 10.
6. Conclusion: Reject H0, since χ²c > 9.488; the grades are not uniformly distributed.
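The goodness-of-fit computation in a few lines:

```python
# Chi-square goodness-of-fit for the grade data: a uniform H0
# gives e_i = 20 for each of the 5 grades; chi^2_{0.05,4} = 9.488.
observed = [14, 18, 32, 20, 16]
expected = [20] * 5
chi2_c = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
reject = chi2_c > 9.488
print(chi2_c, reject)   # 10.0 True
```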
Example. A die is rolled 24 times and the following results are obtained:

face  1  2  3  4  5  6
oi    6  5  2  3  0  8
ei    4  4  4  4  4  4

Is it a fair die? Use 0.05 level of significance.
Solution. Since each ei = 4 < 5, we combine adjacent classes so that the condition on ei will be satisfied, as shown in the following table:

            1 or 2  3 or 4  5 or 6  Total
oi            11      5       8      24
ei             8      8       8      24
(oi − ei)²     9      9       0      18

1. H0: The die is fair.
2. H1: The die is not fair.
3. α = 0.05.
4. We use χ² = ∑_{i=1}^{3} (oi − ei)²/ei in the calculations, with ν = 3 − 1 = 2 degrees of freedom.
The critical region is χ² > χ²(0.05,2) = 5.991.
5. Computations:
χ²c = ∑_{i=1}^{3} (oi − ei)²/ei = (9 + 9 + 0)/8 = 18/8 = 2.25.
6. Conclusion: Accept H0, since χ²c < 5.991; the die may be regarded as fair.
A test of independence classifies the data according to two factors, each with several levels. Consider, for example, the factor A classified into n levels and the factor B classified into m levels, and let oij denote the observed frequency for level i of A and level j of B. The row, column and grand totals are

Ri = ∑_{j=1}^{m} oij, i = 1, . . . , n,

Cj = ∑_{i=1}^{n} oij, j = 1, . . . , m, and

N = ∑_{i=1}^{n} Ri = ∑_{j=1}^{m} Cj.

A\B     B1   B2   . . .  Bj   . . .  Bm   Total
A1      o11  o12  . . .  o1j  . . .  o1m  R1
⋮
An      on1  on2  . . .  onj  . . .  onm  Rn
Total   C1   C2   . . .  Cj   . . .  Cm   N

If the null hypothesis is satisfied (the two factors are independent), the expected frequencies are

eij = (Ri × Cj)/N, and then χ² = ∑_{i=1}^{n} ∑_{j=1}^{m} (oij − eij)²/eij,

with ν = (n − 1)(m − 1) degrees of freedom.
Example. The following table classifies 30 individuals according to gender and the number of hours they study during a week:

                 Male  Female  Total
Over 25 hours      5     9      14
Under 25 hours     9     7      16
Total             14    16      30

Test, at the 0.01 level of significance, whether the two factors are independent.

1. H0: The two factors are independent.
2. H1: The two factors are not independent.
3. α = 0.01.
4. We use χ² = ∑i ∑j (oij − eij)²/eij, with ν = (2 − 1)(2 − 1) = 1 degree of freedom, where the critical region is χ² > χ²(0.01,1) = 6.635.
5. Computations:
e11 = (14 × 14)/30 = 6.5,  e12 = (14 × 16)/30 = 7.5,
e21 = (16 × 14)/30 = 7.5,  e22 = (16 × 16)/30 = 8.5.
Since the table is 2 × 2, with only one degree of freedom, the continuity correction is applied:

χ²c = ∑_{i=1}^{2} ∑_{j=1}^{2} (|oij − eij| − 0.5)²/eij
    = (|5 − 6.5| − 0.5)²/6.5 + (|9 − 7.5| − 0.5)²/7.5 + (|9 − 7.5| − 0.5)²/7.5 + (|7 − 8.5| − 0.5)²/8.5
    = 1/6.5 + 1/7.5 + 1/7.5 + 1/8.5
    = 0.538.

6. Conclusion: Accept H0, since χ²c < 6.635; the two factors are independent.
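A numerical check of the corrected statistic; note it uses the exact expected frequencies (e.g. 14 × 14/30 = 6.53) rather than the rounded values above, so the result differs slightly from 0.538:

```python
# Continuity-corrected (Yates) chi-square for the 2 x 2 table above.
observed = [[5, 9], [9, 7]]
row = [sum(r) for r in observed]            # [14, 16]
col = [sum(c) for c in zip(*observed)]      # [14, 16]
N = sum(row)                                # 30
chi2_c = sum(
    (abs(observed[i][j] - row[i] * col[j] / N) - 0.5) ** 2 / (row[i] * col[j] / N)
    for i in range(2) for j in range(2)
)
print(round(chi2_c, 3))   # close to the text's 0.538
```

Either way the statistic is far below 6.635, so the conclusion is unchanged.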
Example. The following data were collected in a study of the relation between hypertension and smoking habits on 180 individuals:

                 Nonsmokers  Moderate Smokers  Heavy Smokers  Total
Hypertension         21             36               30         87
No hypertension      48             26               19         93
Total                69             62               49        180

Test the hypothesis that the presence or absence of hypertension is independent of smoking habits. Use a 0.05 level of significance.

1. H0: The presence or absence of hypertension is independent of smoking habits.
2. H1: They are not independent.
3. α = 0.05.
4. We use χ² = ∑i ∑j (oij − eij)²/eij, with ν = (2 − 1)(3 − 1) = 2 degrees of freedom, where the critical region is χ² > χ²(0.05,2) = 5.991.
5. Computations:
e11 = (87 × 69)/180 = 33.35,  e21 = (93 × 69)/180 = 35.65,
e12 = (87 × 62)/180 = 29.96,  e22 = (93 × 62)/180 = 32.03,
e13 = (87 × 49)/180 = 23.68,  e23 = (93 × 49)/180 = 25.31,
so that
χ²c = ∑i ∑j (oij − eij)²/eij = 14.46.
6. Conclusion: Reject H0, since χ²c > 5.991, and hence the hypothesis of independence is rejected: hypertension and smoking habits are related.
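The whole 2 × 3 computation can be reproduced in a few lines:

```python
# Chi-square test of independence for the hypertension / smoking table.
observed = [[21, 36, 30], [48, 26, 19]]
row = [sum(r) for r in observed]            # [87, 93]
col = [sum(c) for c in zip(*observed)]      # [69, 62, 49]
N = sum(row)                                # 180
chi2_c = sum(
    (observed[i][j] - row[i] * col[j] / N) ** 2 / (row[i] * col[j] / N)
    for i in range(2) for j in range(3)
)
reject = chi2_c > 5.991                     # chi^2_{0.05,2}
print(round(chi2_c, 2), reject)             # 14.46 True
```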
Exercises VII

1. Test the hypothesis that the average weight of containers … 10.1, 9.8, 9.9, 10.4, 10.3, and 9.8 ounces. Use a 0.01 level of significance and assume that the distribution of weights is normal.
brand B?
x  1   2   3   4   5   6
f  28  36  30  36  23  27
Test the hypothesis that the proportion of defectives produced by workers was the same for the day, evening, or midnight shift. The following data were produced:

Shift      Day  Evening  Midnight
Defective   45    55       70

Use a … level of significance.