CH9 Part I
CHAPTER SEVEN
In this section we look at some important special discrete random variables, namely the
Bernoulli, the binomial, the geometric, the negative binomial, the hypergeometric, the Poisson
and the multinomial distribution. These are all random variables that occur frequently in
practice.
A brief summary is given in the table below. The first four variables we look at all deal with
independent trials which have two possible outcomes – success or failure. It might help to think
of these as arising when we throw a coin. The hypergeometric arises when we sample without
replacement. Here again we are looking at trials which have two possible outcomes, but the
trials are not independent. The Poisson random variable arises from counting the number of
events that occur in a certain length of time or space.
Page 1 of 18
Introduction to probability theory (Stat 276) Department of Statistics (AAU)
Hypergeometric: trials with two possible outcomes – success or failure – and a constant probability of success on each trial, but the trials are not independent.
Poisson: counts the number of events occurring in time or space; this can be used to model, for example, customers arriving at a shop.
Definition (Bernoulli Trial): A Bernoulli trial is an experiment with only two possible outcomes, where one is designated as success and the other as failure.
Example: The experiment of tossing a coin can be considered as a Bernoulli trial because the
possible outcomes of the experiment are only two, namely Head and Tail. Hence, one can define
the success to be getting a head and the failure to be getting a tail.
Definition (Bernoulli Distribution): A random variable X is said to have a Bernoulli distribution if the possible values of X are 0 (failure) and 1 (success) with probabilities P(X = 1) = θ and P(X = 0) = 1 − θ.
Range space: R_X = {0, 1}
Parameter: θ ∈ (0, 1).
Notation: X ~ Ber(θ) or X ~ Bernoulli(θ).
We say X is a Bernoulli random variable if X can only take the values 0 and 1. Suppose P(X = 1) = θ (so P(X = 0) = 1 − θ); then we say that X has a Bernoulli distribution with parameter θ, written X ~ Ber(θ) or X ~ Bernoulli(θ). The possible parameter values are θ ∈ (0, 1).
The probability mass function of a Bernoulli random variable is given by:
P(X = x) = θ^x (1 − θ)^(1−x); x = 0, 1
so that P(X = 1) = θ and P(X = 0) = 1 − θ.
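The pmf above is easy to evaluate directly. A minimal Python sketch (the helper name `bernoulli_pmf` is ours, not from any library):

```python
# Minimal sketch of the Bernoulli pmf: P(X = x) = theta^x * (1 - theta)^(1 - x).
def bernoulli_pmf(x, theta):
    if x not in (0, 1):        # X can only take the values 0 and 1
        return 0.0
    return theta**x * (1 - theta)**(1 - x)

# For a fair coin (theta = 1/2), both outcomes are equally likely.
print(bernoulli_pmf(1, 0.5))   # 0.5
print(bernoulli_pmf(0, 0.3))   # P(failure) = 1 - theta = 0.7
```

Note that plugging x = 1 and x = 0 into the single formula recovers the two probabilities θ and 1 − θ.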
The Bernoulli random variable is the simplest of all random variables. It is used when looking at a Bernoulli trial – this is a trial which may have one of two possible outcomes. We will label these outcomes as success or failure, though it is also possible to label them in other ways such as heads or tails, true or false, one or zero. If θ is the probability of success and if we let X equal one if the trial is a success and zero otherwise, then X ~ Ber(θ). Examples are:
- I throw a coin once and let X be the number of heads. Then X ~ Ber(θ) where θ is the probability of a head. (If the coin is fair then X ~ Ber(1/2).)
- I throw a die once and let X be the number of sixes. Then X ~ Ber(θ) where θ is the probability of getting a six. (If the die is fair then X ~ Ber(1/6).)
- A patient has an operation. Suppose that X equals 1 if the patient dies and 0 if the patient survives. Then X ~ Ber(θ) where θ is the probability of dying. (Note that we are regarding "dying" as being a success. "Success" does not have to refer to a desired outcome.)
- I choose an adult at random from all the adults in Ethiopia and let X equal 1 if I have chosen a man and 0 if I have chosen a woman. Then X ~ Ber(θ) where θ is the proportion of adults in Ethiopia who are male.
Consider a particular element of the sample space of an experiment satisfying the condition that X = x (i.e. x successes). One such outcome could arise, for instance, if the first x repetitions of the experiment resulted in success while the last n − x repetitions resulted in failure. That is, the experiment has x successes and n − x failures. But exactly the same probability would be associated with any other outcome for which X = x. The total number of such outcomes equals
n! / (x!(n − x)!) = C(n, x)
Accordingly, the probability of getting x successes in n trials would be C(n, x) θ^x (1 − θ)^(n−x).
With this background, we now state a theorem that corresponds to the binomial distribution.
We shall now derive the probability mass function of a Bin(n, θ) random variable. First let us look at a simple case with n = 3. In this case we can use the following sample space
S = {SSS, SSF, SFS, SFF, FSS, FSF, FFS, FFF}.
Now let us calculate the probabilities connected with this sample space. For instance, to calculate the probability of getting two successes, note that the event {X = 2} can be written as the union of three mutually exclusive events, so
P(X = 2) = P(SSF ∪ SFS ∪ FSS) = P(SSF) + P(SFS) + P(FSS)
Using the independence assumption we see that P(SSF) = θθ(1 − θ), P(SFS) = θ(1 − θ)θ and P(FSS) = (1 − θ)θθ. Putting this all together we obtain
P(X = 2) = θθ(1 − θ) + θ(1 − θ)θ + (1 − θ)θθ = 3θ^2 (1 − θ)
Theorem (Binomial Distribution): Suppose we have a sequence of n independent Bernoulli trials with P(Success) = θ and P(Failure) = 1 − θ, and assume that the probability of success remains the same from trial to trial. Let X be the number of successes. Then the expression
P(X = x) = C(n, x) θ^x (1 − θ)^(n−x) if x = 0, 1, 2, ..., n, and 0 otherwise,
is called a binomial distribution with parameters n and θ, and it is written as X ~ Bin(n, θ).
Remark: When n = 1, the binomial distribution becomes the Bernoulli distribution.
The binomial distribution is a legitimate probability distribution because
i) P(X = x) ≥ 0 ∀x
ii) Σ_{x=0}^{n} P(X = x) = Σ_{x=0}^{n} C(n, x) θ^x (1 − θ)^(n−x) = (θ + 1 − θ)^n = 1
In other words, a random variable is said to have a binomial distribution if the experiment resulting in the random variable satisfies the following four conditions.
- The same experiment is repeated a finite number of times, say n times,
- The probability of success remains the same from trial to trial,
- Each trial is independent of the others,
- Each trial is a Bernoulli trial.
Example 1: What is the probability of obtaining 3 heads in four tosses of a fair coin?
Solution:
Let X be the random variable that stands for the number of heads observed in the four tosses. Clearly, the set of possible values of X is R_X = {0, 1, 2, 3, 4}, which is the range space of the experiment. Moreover, the experiment is repeated n = 4 times, the probability of success (in our case Head) is fixed and it is 1/2, each trial is independent of the others, and finally each trial is a Bernoulli trial as it has only two outcomes, Head and Tail. Therefore, the defined random variable X has a binomial distribution because it satisfies the four conditions stated above. Thus,
P[X = 3] = C(n, x) θ^x (1 − θ)^(n−x) = C(4, 3) (1/2)^3 (1/2)^(4−3) = 4/16 = 1/4
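This kind of binomial calculation can be checked with a few lines of Python using the standard-library `math.comb` (the helper name `binom_pmf` is ours):

```python
from math import comb

# Binomial pmf: P(X = x) = C(n, x) * theta^x * (1 - theta)^(n - x).
def binom_pmf(x, n, theta):
    return comb(n, x) * theta**x * (1 - theta)**(n - x)

# Example 1: probability of 3 heads in 4 tosses of a fair coin.
print(binom_pmf(3, 4, 0.5))  # 0.25
```

The same helper reproduces any of the binomial examples in this section by changing n, θ, and x.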
Example 2: The probability that a seed will germinate is 0.4. Determine the probability that out of 5 seeds a) none, b) exactly 2, c) at least one will germinate.
Solution
Let X be the random variable that corresponds to the number of seeds germinated out of the five seeds under consideration. Clearly, X has a binomial distribution with n = 5 and θ = 0.4. That is, X ~ Bin(5, 0.4).
a) The probability that none will germinate is
P[X = 0] = C(5, 0) (0.4)^0 (0.6)^5 = (0.6)^5 = 0.07776
b) The probability that exactly 2 will germinate is
P[X = 2] = C(5, 2) (0.4)^2 (0.6)^(5−2) = (10)(0.16)(0.216) = 0.3456
c) At least one will germinate means X ≥ 1, i.e., X = 1, 2, 3, 4, or 5. In other words, we need the probability that the number of germinating seeds is not zero.
P(X ≥ 1) = 1 − P(X = 0) = 1 − 0.07776 = 0.92224
Example 3: A new medication gives 5% of the users an undesirable reaction. If a sample of 13 users receive the medication, find the probability of a) zero undesirable reactions, b) one undesirable reaction.
Solution:
Let X be the number of persons having an undesirable reaction. With the assumption of independence, we can see that X is a random variable distributed according to a binomial distribution with parameters n = 13 and θ = 0.05.
a) P[X = 0] = C(13, 0) (0.05)^0 (0.95)^13 = (0.95)^13 = 0.513
b) P[X = 1] = C(13, 1) (0.05)^1 (0.95)^12 = 0.351
Additional examples:
- I throw a coin five times and let X be the number of heads. If the results of the throws are independent of each other then X ~ Bin(5, θ) where θ is the probability of a head. (If the coin is fair then X ~ Bin(5, 1/2).)
- I throw a die seven times and let X be the number of sixes. If the results of the throws are independent of each other then X ~ Bin(7, θ) where θ is the probability of getting a six. (If the die is fair then X ~ Bin(7, 1/6).)
- 12 patients have an operation. Let X equal the number of patients who die. If the operations are independent of each other and if each has the same probability of resulting in a death, θ, then X ~ Bin(12, θ).
- A bag contains 20 balls – 8 are red and 12 are green – and I sample three balls at random. If sampling is with replacement and X equals the number of times I choose a red ball then X ~ Bin(3, 0.4).
It is an exercise to show that (to be proved in class):
E(X) = nθ and Var(X) = nθ(1 − θ)
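Before the formal proof, the mean and variance formulas can be checked numerically by summing over the pmf. A small Python sketch (the parameter choices n = 12, θ = 0.3 are arbitrary illustrations):

```python
from math import comb

def binom_pmf(x, n, theta):
    return comb(n, x) * theta**x * (1 - theta)**(n - x)

n, theta = 12, 0.3  # arbitrary illustrative parameters

# Compute E(X) and Var(X) directly from the definition of expectation.
mean = sum(x * binom_pmf(x, n, theta) for x in range(n + 1))
var = sum((x - mean)**2 * binom_pmf(x, n, theta) for x in range(n + 1))

print(mean, n * theta)                # both 3.6
print(var, n * theta * (1 - theta))   # both 2.52
```

The direct sums agree with nθ and nθ(1 − θ), as the exercise claims.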
INTRODUCTION
The geometric and Pascal distributions are closely related, in that one can be obtained from the other. In particular, the Pascal distribution is a more general form of the geometric distribution. The geometric distribution is well known for its application in waiting-line problems.
The core question here is: "How long should we wait for the first success to occur?"
The geometric distribution arises when one needs the probability of encountering a number of failures before achieving the first success in successive Bernoulli trials with a known probability of success θ.
Definition (Geometric Distribution): A random variable X is defined to have a geometric distribution if X represents the number of Bernoulli trials up to and including that in which the first success occurs. The probability distribution of X is then given by:
P(X = x) = θ(1 − θ)^(x−1) if x = 1, 2, 3, ...
P(X = x) = 0 elsewhere
- The mean of the geometric distribution is E(X) = 1/θ.
- The variance of the geometric distribution is Var(X) = (1 − θ)/θ^2.
- The moment generating function of the geometric distribution is M(t) = θe^t / (1 − (1 − θ)e^t).
Example: A laboratory technician successively tests the blood samples of patients for HIV. It is assumed that the probability that a patient is HIV positive is 0.3. What is the probability that the laboratory technician encounters HIV for the first time in the 10th blood sample?
Solution:
Clearly, the problem pertains to the geometric distribution. We are given x = 10 and θ = 0.3. Note that x = 10 is the number of trials we have to conduct until we get the first success; if the first success is found on the 10th trial, the implication is that we have encountered a total of 9 failures.
Thus, P(X = x) = θ(1 − θ)^(x−1) ⇔ P(X = 10) = (0.3)(1 − 0.3)^9 = 0.0121
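This calculation is easy to reproduce in Python (the helper name `geom_pmf` is ours):

```python
# Geometric pmf (number of trials up to and including the first success):
# P(X = x) = theta * (1 - theta)^(x - 1), x = 1, 2, 3, ...
def geom_pmf(x, theta):
    return theta * (1 - theta)**(x - 1) if x >= 1 else 0.0

# First HIV-positive result found on the 10th blood sample, theta = 0.3.
print(round(geom_pmf(10, 0.3), 4))  # 0.0121
```

Changing x to 1 reproduces Exercise 1 below: the first sample is positive with probability θ = 0.3.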
Exercise 1: For the above example, find the probability of detecting HIV in the first blood sample.
Solution: P(X = x) = θ(1 − θ)^(x−1) ⇔ P(X = 1) = (0.3)(1 − 0.3)^0 = 0.3
Exercise 2: For any random variable X with a geometric distribution, on which trial do we get the highest probability of encountering success for the first time?
Solution: The maximum probability of encountering the first success occurs at the first trial, x = 1.
The geometric distribution is used in Markov chain models, particularly meteorological models used to forecast weather cycles and precipitation amounts. A random variable X that has a geometric distribution is often referred to as a discrete waiting-time random variable. It represents how long (in terms of the number of trials) one has to wait for a success.
The core question here is: "How long should we wait for the kth success to occur?"
The Pascal distribution, also called the negative binomial distribution, is a generalization of the geometric distribution. In the geometric distribution we were interested in the occurrence of the 1st success. When it comes to the Pascal distribution we will be interested in the occurrence of the kth success.
Range space: X ∈ { k, k+1, k+2, …}
Notation: X~NegBin(k, θ).
Suppose we have a sequence of independent Bernoulli trials, each with a probability of success equal to θ, and let X be the number of the trial that results in the k-th success. Then X is said to have a negative binomial distribution with parameters k and θ. Here k must be a positive integer and θ ∈ (0, 1).
Example: We may be interested in the probability that the tenth child exposed to a contagious disease will be the third to catch it, the probability that the fifth person to hear a rumor will be the first one to believe it, or the probability that a burglar will be caught for the second time on his eighth job.
If the kth success is to occur on the xth trial, there must be k − 1 successes in the first x − 1 trials, and the probability for this is
P(k − 1; x − 1; θ) = C(x − 1, k − 1) θ^(k−1) (1 − θ)^(x−k)
The probability of a success on the xth trial is θ, and the probability that the kth success occurs on the xth trial is, therefore,
θ · P(k − 1; x − 1; θ) = C(x − 1, k − 1) θ^k (1 − θ)^(x−k); x = k, k + 1, k + 2, ...
Definition (Negative Binomial): A random variable X has a negative binomial distribution, and it is referred to as a negative binomial random variable, if and only if its probability distribution is given by
P(k; x; θ) = C(x − 1, k − 1) θ^k (1 − θ)^(x−k); x = k, k + 1, k + 2, ...
Example: If the probability is 0.40 that a child exposed to a certain contagious disease will catch it, what is the probability that the tenth child exposed to the disease will be the third to catch it?
Solution:
P(3; 10; 0.40) = C(10 − 1, 3 − 1) (0.4)^3 (1 − 0.4)^(10−3) = C(9, 2)(0.4)^3(0.6)^7 = 0.0645
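The negative binomial pmf is again a one-liner in Python; this sketch (helper name `negbin_pmf` is ours) reproduces the 0.0645 above:

```python
from math import comb

# Negative binomial pmf: P(k-th success occurs on trial x)
#   = C(x - 1, k - 1) * theta^k * (1 - theta)^(x - k), x = k, k+1, ...
def negbin_pmf(x, k, theta):
    return comb(x - 1, k - 1) * theta**k * (1 - theta)**(x - k)

# Third child to catch the disease is the tenth one exposed, theta = 0.4.
print(round(negbin_pmf(10, 3, 0.4), 4))  # 0.0645
```

Setting k = 1 recovers the geometric pmf, illustrating that the geometric distribution is the special case of the Pascal distribution.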
Exercise 2: A laboratory technician successively tests the blood sample of patients for HIV. It is
assumed that the probability of a patient to be HIV positive is 0.3. What is the probability that
the laboratory technician detects or encounters the 5th HIV in the 10th blood sample?
- The mean of the Pascal distribution is E(X) = k/θ.
- The variance of the Pascal distribution is Var(X) = (k/θ)(1/θ − 1) = k(1 − θ)/θ^2.
- The moment generating function of the Pascal distribution is M_X(t) = [θe^t / (1 − (1 − θ)e^t)]^k.
INTRODUCTION
Recall that one of the assumptions of the binomial distribution is that the probability of success, θ, remains constant. However, there are cases where this assumption does not hold. In the case of dependent trials the probability of success, θ, is not stationary, but changes from trial to trial. The proper distribution applicable here is the hypergeometric distribution.
For example, there may be ten balls in a bag of which four are red and six are black. Three balls
are taken out of the bag at random without replacement. The probability of the number of red
balls in the three is distributed hypergeometrically.
Definition (Hypergeometric Distribution): Suppose n objects are drawn at random, without replacement, from a collection of N objects of which M are classified as successes. Let X be the random variable that stands for the number of objects in the sample that are successes. Then the random variable X has a hypergeometric distribution with p.d.f.
P(X = x) = C(M, x) C(N − M, n − x) / C(N, n), for max(0, n − (N − M)) ≤ x ≤ min(n, M)
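The hypergeometric pmf translates directly into Python with `math.comb`; this sketch (helper name `hypergeom_pmf` is ours) uses the bag example from the introduction, N = 10 balls of which M = 4 are red, drawing n = 3:

```python
from math import comb

# Hypergeometric pmf: P(X = x) = C(M, x) * C(N - M, n - x) / C(N, n).
def hypergeom_pmf(x, N, M, n):
    # Outside the support the probability is zero.
    if x < max(0, n - (N - M)) or x > min(n, M):
        return 0.0
    return comb(M, x) * comb(N - M, n - x) / comb(N, n)

# Probability of exactly one red ball among the three drawn.
print(hypergeom_pmf(1, 10, 4, 3))  # 0.5
```

The same numbers appear in the worked example below (selecting workers for a refresher course), since it is the same N = 10, M = 4, n = 3 setting.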
Assumptions: The hypergeometric distribution is applicable under the following conditions:
- There is a finite population of N items, of which M are classified as successes and N − M as failures,
- A sample of n items is drawn at random without replacement,
- Consequently, the trials are not independent and the probability of success changes from trial to trial.
Example: Three workers are selected at random out of ten for a refresher course. Assuming that four of the ten had earlier been sent to such courses, find the probability that exactly 1 of the three had previously taken such courses.
Solution:
In this problem the success is to find a person who had taken the course.
Let X be the random variable that denotes the number of individuals who had taken the course.
P(X = x) = C(M, x) C(N − M, n − x) / C(N, n) ⇔ P(X = 1) = C(4, 1) C(6, 2) / C(10, 3) = (4)(15)/120 = 0.5
Now consider the probability that at least 2 of the three had previously taken such courses:
P(X ≥ 2) = P(X = 2) + P(X = 3)
P(X = 2) = C(4, 2) C(6, 1) / C(10, 3) = (6)(6)/120 = 0.3
P(X = 3) = C(4, 3) C(6, 0) / C(10, 3) = 4/120 = 0.033
⇔ P(X ≥ 2) = 0.3 + 0.033 = 0.333
- The mean of the hypergeometric distribution is E(X) = nM/N.
- The variance of the hypergeometric distribution is Var(X) = nM(N − M)(N − n) / (N^2(N − 1)).
- The moment generating function of the hypergeometric distribution has no simple form and is not useful in practice.
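The mean and variance formulas can be verified by enumerating the support, just as for the binomial. A short Python sketch using the same N = 10, M = 4, n = 3 setting as the examples above (helper name is ours):

```python
from math import comb

def hypergeom_pmf(x, N, M, n):
    return comb(M, x) * comb(N - M, n - x) / comb(N, n)

N, M, n = 10, 4, 3
support = range(max(0, n - (N - M)), min(n, M) + 1)

# Direct computation of E(X) and Var(X) over the support.
mean = sum(x * hypergeom_pmf(x, N, M, n) for x in support)
var = sum((x - mean)**2 * hypergeom_pmf(x, N, M, n) for x in support)

print(mean, n * M / N)                                        # both 1.2
print(var, n * M * (N - M) * (N - n) / (N**2 * (N - 1)))      # both 0.56
```

The enumeration matches nM/N and nM(N − M)(N − n)/(N^2(N − 1)) exactly.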
Exercise 1: Out of 100 applicants to Addis Ababa University (Statistics Department), 80 are from Addis Ababa. If 15 applicants are selected at random, find the probability that 10 of them are from Addis Ababa. Ans: 0.1008
Exercise 2: A box contains 3 red and 7 white marbles. If 2 marbles are chosen at random,
without replacement, find the probability that 1 of them is red. Ans: 0.4667
INTRODUCTION
Named after the French mathematician Siméon Poisson, Poisson probabilities are useful when there is a large number of independent trials with a small probability of success on a single trial and the events occur over a period of time. It can also be used when a density of items is distributed over a given area or volume.
Experience indicates that the Poisson P.d.f may be used in a number of applications with quite
satisfactory results. For example, the number of automobile accidents in some unit time is often
assumed to be a random variable, which has a Poisson distribution. Likewise, the number of
insurance claims in some unit of time is also assumed to be a random variable with Poisson
distribution.
Definition (Poisson Distribution): A random variable X is defined to have a Poisson distribution if the probability distribution of X is given by
P(X = x) = e^(−λ) λ^x / x! if x = 0, 1, 2, 3, ..., where λ > 0
P(X = x) = 0 elsewhere
A random variable that has a p.d.f. of the above form is said to have a Poisson distribution, and any such P(x) is called a Poisson p.d.f.
Lambda (λ) in the formula is the mean number of occurrences. If you're approximating a binomial probability using the Poisson, then lambda is the same as np.
In 1837 Poisson published the derivation of the distribution that bears his name. He approached the distribution by considering limiting forms of the binomial distribution. The Poisson distribution may also arise for events occurring "at random and independently" in time.
Assumptions: The Poisson distribution is derived under three assumptions, listed below in the discussion of the Poisson process.
Note that:
If X represents the number of occurrences of some event during a time interval of length t, then λ = αt, where α = λ/t represents the expected rate at which events occur. If X represents the number of occurrences of some event within a specified volume V, then λ = αV, where α = λ/V represents the expected density at which events appear.
Example: If there are 500 customers per eight-hour day in a checkout line, what is the
probability that there will be exactly 3 in line during any five-minute period?
Solution:
The expected value during any one five-minute period is λ = 500/96 ≈ 5.2083. The 96 is because there are 96 five-minute periods in eight hours. So, you expect about 5.2 customers in 5 minutes and want to know the probability of getting exactly 3.
P(X = 3; λ = 500/96) = e^(−500/96) (500/96)^3 / 3! = 0.1288
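The Poisson pmf is also straightforward to evaluate with the standard library (the helper name `poisson_pmf` is ours):

```python
from math import exp, factorial

# Poisson pmf: P(X = x) = e^(-lam) * lam^x / x!.
def poisson_pmf(x, lam):
    return exp(-lam) * lam**x / factorial(x)

# Checkout line: lam = 500/96 customers expected per five-minute period.
print(round(poisson_pmf(3, 500 / 96), 4))  # 0.1288
```

The same function handles the remaining Poisson examples and exercises in this section by changing λ and x.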
Example: Suppose that X has a Poisson distribution with mean 2. Find P(1 ≤ X).
Solution:
The p.d.f. of X is P(X = x) = λ^x e^(−λ) / x! = 2^x e^(−2) / x! if x = 0, 1, 2, ..., and 0 otherwise.
P(1 ≤ X) = P(X ≥ 1) = 1 − P(X = 0) = 1 − 2^0 e^(−2)/0! = 1 − e^(−2) = 0.8647
A Poisson process is one in which the intervals between successive occurrences have
independent, identical, exponential distribution and consequently the number of occurrences in a
specified time interval has a Poisson distribution.
a) Events that occur in one time interval are independent of those occurring in any other
non-overlapping time interval.
b) The probability of the event occurring is proportional to the length of the time interval.
c) The probability that two or more events occur in a very small time interval is so small
that it can be neglected.
Examples of random variables that can often be modeled by a Poisson distribution:
- The number of telephone calls coming into a switchboard during a fixed period of time, like an hour,
- The number of deaths in motor accidents every month in a large city,
- The number of typing errors per page,
- The number of defectives in a manufactured article,
- The number of customers entering a restaurant every minute.
Generally, if a process leads to the Poisson distribution, that process is called the Poisson
process.
Remark: The Poisson distribution is usually applied when calculating the probability of rare events, such as the probability that 10 pages are missing from a book of 100 pages.
Example: If the probability that an individual suffers a bad reaction from an injection of a given serum is 0.001, determine the probability that out of 2000 individuals
a) exactly 3 will suffer a bad reaction, b) more than 2 individuals will suffer a bad reaction.
Solution:
Let X be a random variable that stands for the number of individuals who suffer a bad reaction.
λ = np = 2000(0.001) = 2
⇔ P(X = x) = λ^x e^(−λ) / x! = 2^x e^(−2) / x! if x = 0, 1, 2, ..., and 0 elsewhere.
a) P(X = 3) = 2^3 e^(−2)/3! = 0.180447
b) P(X > 2) = 1 − P(X ≤ 2) = 1 − P(X = 0) − P(X = 1) − P(X = 2)
= 1 − 2^0 e^(−2)/0! − 2^1 e^(−2)/1! − 2^2 e^(−2)/2! = 1 − 1/e^2 − 2/e^2 − 2/e^2 = 0.3233
Exercise: On average a ship arrives every second day at a certain port. What is the probability that three or more ships will arrive on a randomly selected day?
Solution: A ship arrives on average once every two days. Hence we have λ = 1/2 = 0.5, which is the average number of arrivals per day.
P(X = x) = λ^x e^(−λ) / x!
The required probability is P(X ≥ 3).
P(X ≥ 3) = 1 − P(X = 0) − P(X = 1) − P(X = 2)
= 1 − (0.5)^0 e^(−0.5)/0! − (0.5)^1 e^(−0.5)/1! − (0.5)^2 e^(−0.5)/2!
= 1 − 1/e^0.5 − 1/(2e^0.5) − 1/(8e^0.5) = 0.0144
Exercise: On average 12 people per hour enter the consulting clinic of a medical practitioner. What is the probability that exactly three people will enter the clinic during a 10-minute period?
Solution:
λ = (10 × 12)/60 = 2
P(X = 3) = 2^3 e^(−2)/3! = 0.1804
7.6 MULTINOMIAL DISTRIBUTION
INTRODUCTION
Some trials have more than two possible outcomes. For example, the outcome for a driver in an
auto accident might be recorded using the categories “uninjured,” “injury not requiring
hospitalization,” “injury requiring hospitalization,” “fatality.” When the trials are independent
with the same category probabilities for each trial, the distribution of counts in the various
categories is the multinomial.
If n independent trials each result in one of k categories with probabilities π1, π2, ..., πk (with Σπi = 1), and ni counts the outcomes in category i (so Σni = n), then
P(n1, n2, ..., nk) = n! / (n1! n2! ··· nk!) × π1^n1 π2^n2 ··· πk^nk
The binomial distribution is the special case with k = 2 categories. The marginal distribution of the count in any particular category is binomial. For category i, the count ni has mean nπi and standard deviation √(nπi(1 − πi)). Most methods for categorical data assume the binomial distribution for a count in a single category and the multinomial distribution for a set of counts in several categories.
Example: In a given city on Saturday night, Channel 12 has 50% of the viewing audience,
Channel 10 has 30% of the viewing audience and Channel 3 has 20% of the viewing audience.
Find the probability that among eight television viewers in that city, randomly chosen on a
Saturday night, five will be watching Channel 12, two will be watching Channel 10, and one will
be watching Channel 3.
P(5, 2, 1) = 8! / (5! 2! 1!) × (0.5)^5 (0.3)^2 (0.2)^1 = 0.0945
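The multinomial pmf can be sketched in a few lines of Python (the helper name `multinomial_pmf` is ours); it reproduces the 0.0945 above:

```python
from math import factorial

# Multinomial pmf: P(n1,...,nk) = n!/(n1!...nk!) * prod(pi_i^(n_i)).
def multinomial_pmf(counts, probs):
    n = sum(counts)
    coef = factorial(n)
    for c in counts:
        coef //= factorial(c)   # multinomial coefficient, computed exactly
    p = 1.0
    for c, pi in zip(counts, probs):
        p *= pi**c
    return coef * p

# TV example: 8 viewers; channel shares 0.5, 0.3, 0.2; counts (5, 2, 1).
print(round(multinomial_pmf([5, 2, 1], [0.5, 0.3, 0.2]), 4))  # 0.0945
```

With k = 2 categories the same function collapses to the binomial pmf, matching the remark that the binomial is the special case.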
Exercise: Genotypes AA, Aa, and aa occur with probabilities π1, π2, π3. For n = 3 independent observations, the observed frequencies are (n1, n2, n3).
a) Explain how you can determine n3 from knowing n1 and n2. Thus, the multinomial distribution of (n1, n2, n3) is actually two-dimensional.
b) Show the set of all possible observations (n1, n2, n3) with n = 3.
c) Suppose (π1, π2, π3) = (0.25, 0.50, 0.25). Find the multinomial probability that (n1, n2, n3) = (1, 2, 0).
d) Refer to (c). What probability distribution does n1 alone have? Specify the values of the sample size index and parameter for that distribution.