
Chapter 6
Standard Discrete Probability Distributions : Finite Sample Space
Jacob Bernoulli (1654-1705) was a mathematician born in Switzerland. Bernoulli discovered the constant 'e' by studying a question about compound interest. Jacob Bernoulli's most original work was published eight years after his death. The work is still a work of the greatest significance in the theory of probability. Many examples are given on how much one would expect to win playing various games of chance. The term 'Bernoulli trial' resulted from this work. The Bernoulli distribution is named after Jacob Bernoulli.
[Photograph: Jacob Bernoulli]
Contents
6.1 Introduction
6.2 Degenerate Distribution (One Point Distribution)
6.3 Discrete Uniform Distribution
6.4 Moments
6.5 M.G.F. of Discrete Uniform Distribution
6.6 Distribution of Sum of Two Discrete Uniform Random Variables
6.7 Bernoulli Distribution
6.8 Moments of Bernoulli Distribution
6.9 Distribution of Sum of Independent and Identically Distributed Bernoulli Random Variables
6.10 Binomial Distribution
6.11 Applications of Binomial Distribution
6.12 Moments of Binomial Distribution
6.13 M.G.F. of Binomial Distribution
6.14 Recurrence Relation
6.15 Fitting of Binomial Distribution
6.16 Mode of Binomial Distribution
6.17 Recurrence Relation between Moments of Binomial Random Variable
6.18 Nature of the Binomial Distribution
6.19 Additive Property
6.20 Conditional Distribution of X given X + Y = n
6.21 Model Sampling from Binomial Distribution
6.22 Hypergeometric Distribution
6.23 Applications of Hypergeometric Distribution
6.24 Binomial Approximation of Hypergeometric Distribution
6.25 Mean and Variance
6.26 Alternative Form of Hypergeometric Distribution
Key Words :
Discrete uniform distribution, Bernoulli trials, Binomial distribution, Hypergeometric distribution.
Objectives :
Understand the need of standard probability distributions as models.
Understand the specific situations for the use of these models.
Learn the probability distributions and compute probabilities.
Learn interrelations among the different probability distributions.
6.1 Introduction
In the previous chapters, we have discussed the general theory of univariate and bivariate probability distributions. For a discrete r.v. we saw how the p.m.f. can be derived using the underlying probability structure on the sample space of a random experiment. Nonetheless, many a time the variable of interest is observed to follow a specific pattern which can be described by a standard probability distribution whose p.m.f. can be expressed in a mathematical form. These probability distributions can be applied to a variety of real situations which possess some common features. Hence, these are also called 'probability models'.
In this chapter we shall study a few important standard probability distributions, viz. the discrete uniform, Bernoulli, binomial and hypergeometric distributions.
Need of a probability model : After collection of data, we prepare a frequency distribution. The frequency distribution and the histogram give an idea about the variation pattern. The main aim of studying a sample is to draw inference about the corresponding population. In order to draw inference, the first step is to fit a probability model. To decide the model, we consider the histogram as well as certain other characteristic properties. Here one has to observe some patterns; the following table suggests a very approximate way of guessing the probability model.
Pattern / Nature | Approximate model
All bars have almost equal height. | Uniform distribution
Symmetric, positively skewed or negatively skewed; finite range; frequencies increase and then drop slowly. | Binomial distribution
Skewed distribution; sudden increase in the heights of bars and relatively slow decrease in the heights of bars. | Poisson distribution
Sudden decrease in the heights of bars. | Geometric distribution
Skewed distribution; sudden increase in the heights of bars, but very slow decrease in the heights of bars. | Negative binomial distribution
6.2 Degenerate Distribution (One Point
Consider the following situation : Suppose a coin (as in the movie Sholay) has heads on both its sides. Then whenever you toss the coin, it is going to show up heads. Thus, we say 'Head' will turn up with probability 1. Such a distribution is called a degenerate distribution or one point distribution.
Definition : Let X be a discrete random variable. X is said to follow a degenerate distribution if its p.m.f. is given by
P(X = k) = 1 for some fixed k in R
and P(X = x) = 0 for all other x, x ≠ k.
The distribution is also termed a 'one point distribution'. Basically, the variable takes only one value and is not a variable in the true sense. The degenerate distribution is localized to a single point.
Mean and variance : Let X follow a degenerate distribution at X = k. Then
Mean of X = E(X) = k and Var(X) = 0.
Proof : X takes the single value k with P(X = k) = 1. Then
E(X) = Σ x P(x) = k P(k) = k
Var(X) = E(X^2) - [E(X)]^2 = k^2 P(k) - k^2 = k^2 - k^2 = 0
Note : The use of the degenerate distribution in probability theory is that it can be viewed as the limiting distribution of many common distributions in which the scale parameter tends to zero, so that the distribution function concentrates onto a single point.
6.3 Discrete Uniform Distribution
Consider the following situation : Suppose a class contains 50 students having roll numbers from 1 to 50. The class representative is to be selected at random. Therefore, a roll number is selected randomly from 1 to 50. Thus, if X denotes the roll number selected, then since all numbers are equally likely, the p.m.f. of X is given by
P(x) = 1/50 ; x = 1, 2, ..., 50
= 0 ; otherwise.
Such a distribution is called a discrete uniform distribution.
Definition : Let X be a discrete r.v. taking values 1, 2, ..., n. X is said to follow a discrete uniform distribution if its p.m.f. is given by
P(X = x) = 1/n ; x = 1, 2, ..., n
= 0 ; otherwise.
'n' is called the 'parameter' of the distribution. By 'parameter(s)' of a distribution, we mean the constants in the p.m.f. Whenever the parameter value is known, the distribution is known completely, so that probabilities of all events as well as quantities such as the mean and variance can be computed. For different values of the parameter, we get different probability distributions of the same kind. The above distribution is given the name 'uniform distribution' because it treats all the values of the variable uniformly. Thus, the discrete uniform distribution is applied whenever all values of the r.v. are equally likely. We give below some such situations.
1. The day of the week on which a person's birthday falls. It may be Sunday, Monday, ..., or Saturday, each with equal probability. Thus, giving codes to the days as Sun = 1, Mon = 2, ..., Sat = 7, X follows a discrete uniform distribution with n = 7.
2. Let X denote the number on the face of an unbiased die when it is rolled.
P(x) = 1/6 ; x = 1, 2, ..., 6
= 0 ; otherwise.
3. A computer generates a digit randomly from 0 to 9.
P(x) = 1/10 ; x = 0, 1, ..., 9
= 0 ; otherwise.
The figure given below shows the bar diagram of a discrete uniform distribution with parameter n; each of the n bars has the same height 1/n.
Fig. 6.1
6.4 Moments
Let X follow a discrete uniform distribution with p.m.f.
P(x) = 1/n ; x = 1, 2, ..., n
= 0 ; otherwise
μ₁' = Mean = E(X) = Σ_{x=1}^{n} x P(X = x) = (1/n) Σ_{x=1}^{n} x = n(n + 1)/(2n) = (n + 1)/2
μ₂' = E(X^2) = Σ_{x=1}^{n} x^2 P(x) = (1/n) Σ_{x=1}^{n} x^2 = (n + 1)(2n + 1)/6
Hence,
μ₂ = Variance = E(X^2) - [E(X)]^2 = (n + 1)(2n + 1)/6 - (n + 1)^2/4 = (n^2 - 1)/12
S.D.(X) = √((n^2 - 1)/12)
μ₃' = E(X^3) = Σ_{x=1}^{n} x^3 P(x) = (1/n) Σ_{x=1}^{n} x^3 = n(n + 1)^2/4
μ₃ = μ₃' - 3μ₂'μ₁' + 2μ₁'^3 = n(n + 1)^2/4 - (n + 1)^2(2n + 1)/4 + (n + 1)^3/4 = [(n + 1)^2/4][n - (2n + 1) + (n + 1)] = 0
The distribution is symmetric, since μ₃ = 0.
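As an editorial aside (not part of the original text), the short Python sketch below checks the formulas E(X) = (n + 1)/2 and Var(X) = (n^2 - 1)/12 by computing the moments directly from the p.m.f. for an assumed parameter value n = 10.

# Check (illustrative only): moments of the discrete uniform distribution on {1, ..., n}
n = 10                                        # assumed parameter value
values = range(1, n + 1)
mean = sum(x * (1 / n) for x in values)       # E(X) = sum of x P(x)
second = sum(x**2 * (1 / n) for x in values)  # E(X^2)
variance = second - mean**2
print(round(mean, 6), (n + 1) / 2)            # both give 5.5
print(round(variance, 6), (n**2 - 1) / 12)    # both give 8.25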
6.5 M.G.F. of Discrete Uniform Distribution
Suppose X follows a discrete uniform distribution over {1, 2, ..., n}. The M.G.F. of X is
M_X(t) = e^t (e^{nt} - 1) / [n (e^t - 1)], t ≠ 0.
Proof : M_X(t) = E(e^{tX}) = Σ_{x=1}^{n} e^{tx} P(x) = (1/n)(e^t + e^{2t} + ... + e^{nt}) = (e^t/n)(1 + e^t + ... + e^{(n-1)t}) = e^t (e^{nt} - 1) / [n (e^t - 1)].
6.6 Distribution of Sum of Two Discrete Uniform Random Variables
The sum of two independent discrete uniform r.v.s is not discrete uniform. This can be seen from the following example.

Example 6.1 : Let X and Y be independent discrete uniform r.v.s with parameter n. Obtain the distribution of Z = X + Y.
Solution : The p.m.f.s of X and Y are given by
P_X(X = x) = 1/n ; x = 1, 2, ..., n
= 0 ; otherwise
and P_Y(Y = y) = 1/n ; y = 1, 2, ..., n
= 0 ; otherwise
Define Z = X + Y. Obviously Z takes values from 2 to 2n.
Consider P(Z = 2) = P(X = 1, Y = 1) = P_X(1) P_Y(1) = 1/n^2, since X and Y are independent.
P(Z = 3) = P(X = 1, Y = 2) + P(X = 2, Y = 1) = 2/n^2, and so on until
P(Z = n + 1) = n/n^2.
For P(Z = n + 2), the following (n - 1) combinations are possible:
X : 2, 3, ..., n
Y : n, n - 1, ..., 2
Hence, P(Z = n + 2) = (n - 1)/n^2, and so on.
Lastly, P(Z = 2n) = P(X = n, Y = n) = 1/n^2.
Hence, the probability distribution of Z = X + Y is given by
z    :   2      3     ...   n + 1     n + 2    ...    2n
P(z) : 1/n^2  2/n^2   ...   n/n^2   (n-1)/n^2  ...   1/n^2
This is not the discrete uniform distribution. The distribution is called the triangular distribution because its bar diagram resembles a triangular shape.
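The triangular shape can be confirmed by brute-force enumeration. The following Python sketch is an illustration added here (it is not from the text) and assumes the arbitrary value n = 4.

# Illustration: distribution of Z = X + Y for independent discrete uniform X and Y
n = 4                                          # assumed parameter value
prob_z = {}                                    # probability distribution of Z = X + Y
for x in range(1, n + 1):
    for y in range(1, n + 1):
        # each pair (x, y) has probability 1/n^2 by independence
        prob_z[x + y] = prob_z.get(x + y, 0.0) + 1 / n**2
for z in sorted(prob_z):
    print(z, round(prob_z[z], 4))
# the probabilities rise from 1/16 at z = 2 to 4/16 at z = n + 1 = 5
# and fall back to 1/16 at z = 2n = 8: the triangular shape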
6.7 Bernoulli Distribution
Consider an experiment of tossing a coin.
Define X = 1 if head turns up
= 0 if tail turns up
Let 'p' denote the probability of getting 'head' and 'q' denote the probability of getting 'tail'. Thus 0 < p < 1 and q = 1 - p. If the coin is unbiased, then p = q = 1/2. The p.m.f. can be expressed in the following mathematical form:
P(x) = p^x q^(1-x) ; x = 0, 1
= 0 ; otherwise
0 < p < 1, p + q = 1.
This distribution is known as a Bernoulli distribution with parameter 'p'. It is named after the Swiss mathematician James (Jacob) Bernoulli, whose work on it was published in 1713. The distribution is applied whenever the experiment results in only two outcomes. One of the outcomes is termed a 'success' and is coded as '1' (i.e. X = 1). The other outcome, called 'failure', is coded as '0'. Such an experiment is also called a 'Bernoulli trial'. Following are some real life situations where the Bernoulli distribution is used.
1. Sex of a new born child is recorded in a hospital. Male = 1, Female = 0.
2. Items in a consignment are classified as 'defective' or 'non-defective'.
3. Seeds are sown; germination of a seed is termed a success.
4. A student appears for an examination. He passes or fails.
If p = q = 1/2, then P(x) = 1/2, x = 0, 1, which can be treated as a discrete uniform distribution.
6.8 Moments of Bernoulli Distribution
Let X follow a Bernoulli distribution with parameter 'p'. Therefore, its p.m.f. is given by
P(x) = p^x q^(1-x) ; x = 0, 1
= 0 ; otherwise
Mean = E(X) = Σ_{x=0}^{1} x P(x) = Σ_{x=0}^{1} x p^x q^(1-x) = p
μ₂' = E(X^2) = Σ_{x=0}^{1} x^2 P(x) = Σ_{x=0}^{1} x^2 p^x q^(1-x) = p
μ₂ = Var(X) = E(X^2) - [E(X)]^2 = p - p^2 = p(1 - p) = pq
Similarly, μ₃' = Σ_{x=0}^{1} x^3 P(x) = p
μ₃ = μ₃' - 3μ₂'μ₁' + 2μ₁'^3 = p - 3p^2 + 2p^3 = pq(q - p)
Observe that μ₃ = 0 if q = p = 1/2. Hence the distribution is symmetric if p = 1/2.
In general, μ_r' = Σ x^r P(x) = 0^r × q + 1^r × p = p.
Thus all raw moments are equal to p.
M.G.F. of Bernoulli Distribution :
If X → Bernoulli(p), then
M_X(t) = Σ_{x=0}^{1} e^{tx} P(x) = P(0) + e^t P(1) = q + p e^t
Illustration 1 : If the M.G.F. of a r.v. X is of the form q + p e^t, identify the distribution of X.
Solution : Observe that the form of the M.G.F. is
M_X(t) = q + p e^t, where p + q = 1,
which is the M.G.F. of a Bernoulli r.v. with probability of success p. Hence, by the uniqueness property of the M.G.F., X → Bernoulli(p).
6.9 Distribution of Sum of Independent and Identically Distributed Bernoulli Random Variables
Let Y_i, i = 1, 2, ..., n be n independent Bernoulli r.v.s with parameter 'p'. That is,
P[Y_i = 1] = p and P[Y_i = 0] = q, i = 1, 2, ..., n.
Define X = Σ_{i=1}^{n} Y_i. Note that X counts the number of 1's, i.e. 'successes', in n independent Bernoulli trials.
In order to derive P[X = x], we have to calculate the probability of 'x' successes in n trials. Consider a particular sequence of x successes and the remaining (n - x) failures, as follows:
1 0 1 1 0 0 ... 1
Here 1 occurs x times and 0 occurs (n - x) times. Due to independence, the probability of such a sequence is given by
p·p···p × q·q···q = p^x q^(n-x)   (p appears x times, q appears (n - x) times)
However, the successes (1's) can occupy any x places out of the n places in the sequence, in nCx ways. Therefore, using the addition principle, we get
P[X = x] = nCx p^x q^(n-x) ; x = 0, 1, ..., n
= 0 ; otherwise
This result leads us to the famous binomial distribution, which we discuss in the next section.
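The derivation can be illustrated by simulation. The sketch below is an added example (not from the text); the values n = 5, p = 0.3, the seed and the number of repetitions are arbitrary choices.

# Illustration: the sum of n i.i.d. Bernoulli(p) r.v.s behaves like B(n, p)
import random
from math import comb

random.seed(1)
n, p, reps = 5, 0.3, 100_000
counts = [0] * (n + 1)
for _ in range(reps):
    x = sum(1 for _ in range(n) if random.random() < p)   # number of successes
    counts[x] += 1
for x in range(n + 1):
    exact = comb(n, x) * p**x * (1 - p)**(n - x)           # nCx p^x q^(n-x)
    print(x, round(counts[x] / reps, 4), round(exact, 4))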
6.10 Binomial Distribution (April 2012)
Definition : A discrete r.v. X taking values 0, 1, 2, ..., n is said to follow a binomial distribution with parameters n and p if its p.m.f. is given by
P[X = x] = P(x) = nCx p^x q^(n-x) ; x = 0, 1, ..., n, 0 < p < 1, q = 1 - p
= 0 ; otherwise
Notation : X → B(n, p). The values of P(x) for various values of n and p are available in statistical tables.
Remark : Note that Σ_{x=0}^{n} P(x) = Σ_{x=0}^{n} nCx p^x q^(n-x) = (p + q)^n = 1.
The probabilities are the terms in the binomial expansion of (p + q)^n, hence the name 'binomial distribution'.
6.11 Applications of Binomial Distribution
The binomial distribution is applied widely due to its relation with the Bernoulli distribution. We have seen in Section 6.9 that a sum of independent, identically distributed Bernoulli r.v.s follows a binomial distribution. In other words, if n independent Bernoulli trials are performed, the number of successes follows a binomial distribution. For instance, if a coin with 'p' as the probability of 'head' (success) is tossed n times, independently, then the number of 'heads' follows a binomial distribution with parameters n and p. Thus, the number of successes in 'n' independent Bernoulli trials follows a binomial distribution with parameters n and p, where p denotes the probability of success in a single trial.
The following conditions should be satisfied for the application of the binomial distribution:
(i) The random experiment should be a Bernoulli trial. That is, it should result in either of the two possible distinct outcomes. One of them is termed a 'success' and the other a 'failure'.
(ii) The Bernoulli trial is performed repeatedly a fixed number of times, say 'n'.
(iii) All the trials are independent. The outcome of a trial is not affected by the earlier outcomes and does not affect the future outcomes.
(iv) The probability of success in any trial is 'p' and is constant for each trial. The probability of failure is q = 1 - p.
Remark : 1. If X → B(n, p), then Y = n - X is the number of failures. Hence, interchanging the roles of successes and failures, we get
Y = n - X ; Y → B(n, q)
2. The binomial distribution is easily applied in the case of SRSWR. To see this, consider a bag containing 4 red and 5 black balls. Suppose 3 balls are drawn from the bag using simple random sampling with replacement (SRSWR). Thus at every draw the probability of a red ball remains 4/9, as the ball drawn is replaced. Also, the draws are made independently. Hence, the number of red balls in the sample will follow a binomial distribution with parameters n = 3 and p = 4/9.
Following are some real life examples of binomial random variables:
1. Number of defective items in a lot of n items produced by a machine.
2. Number of male births out of n births in a hospital.
3. Number of correct answers in a multiple choice test.
4. Number of seeds germinated in a row of n planted seeds.
5. Number of rainy days in a month.
6. Number of recaptured fish in a sample of 'n' fishes.
In all the above situations, p, the probability of success, is assumed to be constant.
Example 6.2 : A coin with p = 1/3 as the probability of head is tossed 6 times. Find the probability of getting (i) 4 heads, (ii) at least 2 heads, (iii) at most 1 head, (iv) 4 tails, (v) at least one tail, (vi) all tails.
Solution : Let X denote the number of heads obtained in 6 tosses of the coin.
∴ X → B(n = 6, p = 1/3) and its p.m.f. is given by
P(x) = 6Cx (1/3)^x (2/3)^(6-x) ; x = 0, 1, ..., 6
= 0 ; otherwise
(i) P[X = 4] = P(4) = 6C4 (1/3)^4 (2/3)^2 = 0.0823
(ii) P(X ≥ 2) = 1 - P(X < 2) = 1 - [P(X = 0) + P(X = 1)] = 1 - [(2/3)^6 + 6(1/3)(2/3)^5] = 1 - 0.3512 = 0.6488
(iii) P(X ≤ 1) = P(0) + P(1) = 0.3512 (see (ii))
(iv) Getting 4 tails is equivalent to getting 2 heads.
Required probability = P(X = 2) = 6C2 (1/3)^2 (2/3)^4 = 0.3292
(v) Getting at least one tail means that only the event 'getting all 6 heads' should not happen. Therefore the required probability is given by
1 - P(X = 6) = 1 - (1/3)^6 = 0.9986
(vi) Getting 'all tails' is the same as getting no 'head'.
Probability = P(X = 0) = q^6 = (2/3)^6 = 0.0878


Binomial probabilities using MS-Excel :
We can use Microsoft Excel functions to compute individual or cumulative probabilities of the binomial distribution. For example, in the above example we want to compute P(X = 4).
In Excel, go to a cell and click fx on the menu bar. It gives the function category; choose 'Statistical'. It then gives the function name; choose BINOMDIST. It will show the following dialog box.
Enter the value for Number_s as 4, since we want P(X = 4). Enter Trials as 6; it is the parameter n. Enter Probability_s as 0.33333; it is the parameter p. Since we want to compute an individual probability, enter the value 'FALSE' for Cumulative. Click OK.
Fig. 6.2 : The BINOMDIST Function Arguments dialog box (Number_s = 4, Trials = 6, Probability_s = 0.33333, Cumulative = FALSE); the formula result is 0.082302058. Cumulative is a logical value: for the cumulative distribution function use TRUE; for the probability mass function use FALSE.
This will give the answer as 0.082302058. Alternatively, you can just go to any cell and type =BINOMDIST(4, 6, 0.3333, FALSE), which will give you the same answer. If you want to compute the cumulative probability, just enter 'TRUE' for Cumulative. For example, to compute (iii) P(X ≤ 1) we use =BINOMDIST(1, 6, 0.3333, TRUE), which will return P(X ≤ 1) = 0.3512.
Example 6.3 : In the quality control department of a rubber manufacturing company, 10 tubes are randomly selected from each day's production of rubber tubes. The day's production lot is rejected if more than 1 of the 10 tubes is found to be defective; otherwise the lot is approved. Find the probability of rejection of a day's production lot if the true proportion of defectives in the lot is 0.3.
Solution : Suppose X denotes the number of defective tubes in the 10 randomly selected tubes.
X → B(n = 10, p = 0.3)
The production lot is accepted if not more than one tube (i.e. at most 1 tube) is found defective.
P(Accepting the lot) = P(X = 0) + P(X = 1) = q^10 + 10 p q^9
= (0.7)^10 + 10 (0.3)(0.7)^9 = (0.7)^9 [0.7 + 3] = 0.1493
P(Rejection of the lot) = 1 - 0.1493 = 0.8507
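For readers who prefer not to use Excel, the same calculations can be done with a few lines of Python. The sketch below is an editorial addition; it reproduces Example 6.3 with a hypothetical helper binom_pmf and uses only the standard library.

# Alternative to =BINOMDIST(...): acceptance probability of Example 6.3
from math import comb

def binom_pmf(x, n, p):
    # P(X = x) for X following B(n, p)
    return comb(n, x) * p**x * (1 - p)**(n - x)

n, p = 10, 0.3
p_accept = binom_pmf(0, n, p) + binom_pmf(1, n, p)   # P(X <= 1)
print(round(p_accept, 4))        # 0.1493, probability of accepting the lot
print(round(1 - p_accept, 4))    # 0.8507, probability of rejecting the lot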

Example 6.4 : A r.v. X → B(n = 6, p). Find p if 9 P(X = 4) = P(X = 2).
Solution : P(X = x) = 6Cx p^x q^(6-x) ; x = 0, 1, ..., 6. Here n = 6.
9 P(4) = P(2)
9 × 6C4 p^4 q^2 = 6C2 p^2 q^4
9 p^2 = q^2   (since 6C4 = 6C2)
9 p^2 = (1 - p)^2 = 1 - 2p + p^2
8 p^2 + 2p - 1 = 0
(4p - 1)(2p + 1) = 0
p = 1/4 or p = -1/2
The value p = -1/2 is inadmissible. Hence, the answer is p = 1/4.
Example 6.5 : Suppose we denote by b(x; n, p) the p.m.f. nCx p^x q^(n-x) ; x = 0, 1, ..., n. Then prove the following identities:
(i) b(x; n, p) = b(n - x; n, q)
(ii) b(x; n + 1, p) = p b(x - 1; n, p) + q b(x; n, p).
Solution : (i) We know that nCx = nC(n-x). Hence,
b(x; n, p) = nCx p^x q^(n-x) = nC(n-x) q^(n-x) p^(n-(n-x)) = b(n - x; n, q)
(ii) Also, since (n+1)Cx = nCx + nC(x-1), we get
b(x; n + 1, p) = (n+1)Cx p^x q^(n+1-x) = [nCx + nC(x-1)] p^x q^(n+1-x)
= q × nCx p^x q^(n-x) + p × nC(x-1) p^(x-1) q^(n-(x-1))
= q b(x; n, p) + p b(x - 1; n, p)
6.12 Moments of Binomial Distribution
Let X → B(n, p). The p.m.f. is given by
P(x) = nCx p^x q^(n-x) ; x = 0, 1, ..., n
= 0 ; otherwise
For the binomial distribution, computation of factorial moments is easier than that of raw or central moments.
Consider μ₁' = mean = Σ_{x=0}^{n} x P(x)
= Σ_{x=0}^{n} x [n!/(x!(n - x)!)] p^x q^(n-x)
= Σ_{x=1}^{n} [n!/((x - 1)!(n - x)!)] p^x q^(n-x)
= np Σ_{x=1}^{n} [(n - 1)!/((x - 1)!((n - 1) - (x - 1))!)] p^(x-1) q^(n-x)
= np (p + q)^(n-1)   (using the binomial expansion)
= np
Hence, E(X) = mean = np.
Now, μ_(2) = E[X(X - 1)] = Σ_{x=0}^{n} x(x - 1) nCx p^x q^(n-x)
= Σ_{x=2}^{n} [x(x - 1) n!/(x!(n - x)!)] p^x q^(n-x)
= n(n - 1) p^2 Σ_{x=2}^{n} (n-2)C(x-2) p^(x-2) q^((n-2)-(x-2))
= n(n - 1) p^2 (p + q)^(n-2)
= n(n - 1) p^2
Hence, μ₂' = E(X^2) = E[X(X - 1)] + E(X) = μ_(2) + μ₁' = n(n - 1) p^2 + np.
Therefore, Var(X) = μ₂ = μ₂' - (μ₁')^2 = n(n - 1) p^2 + np - n^2 p^2 = npq
S.D.(X) = √(npq)
Note that mean = np > npq = variance.
Similarly, μ_(3) = E[X(X - 1)(X - 2)] = n(n - 1)(n - 2) p^3
and μ_(4) = E[X(X - 1)(X - 2)(X - 3)] = n(n - 1)(n - 2)(n - 3) p^4
Further, μ₃' = n(n - 1)(n - 2) p^3 + 3n(n - 1) p^2 + np
μ₃ = μ₃' - 3μ₂'μ₁' + 2μ₁'^3 = npq(q - p)
Hence, μ₄ = μ₄' - 4μ₃'μ₁' + 6μ₂'μ₁'^2 - 3μ₁'^4 = npq[1 + 3(n - 2)pq].
6.13 M.G.F. of Binomial Distribution
If X → B(n, p), then M_X(t) = (q + p e^t)^n.
Proof : M_X(t) = E(e^{tX}) = Σ_{x=0}^{n} e^{tx} P(X = x)
= Σ_{x=0}^{n} e^{tx} nCx p^x q^(n-x)
= Σ_{x=0}^{n} nCx (p e^t)^x q^(n-x)
= (q + p e^t)^n
Example 6.6 : If the m.g.f. of a r.v. X is M_X(t) = (0.4 + 0.6 e^t)^10, identify the distribution of X.
Solution : Since the m.g.f. matches the form (q + p e^t)^n, by the uniqueness property of the m.g.f., X follows a binomial distribution with parameters n = 10 and p = 0.6.
Mean and variance of the binomial distribution using the m.g.f. :
We know that the first derivative of M_X(t) w.r.t. t, evaluated at t = 0, returns the value of the mean.
(d/dt) M_X(t) = n (q + p e^t)^(n-1) p e^t
Evaluating this at t = 0, we get
μ₁' = n (p + q)^(n-1) p = np, since p + q = 1.
Now, (d/dt) M_X(t) = n (q + p e^t)^(n-1) p e^t. Therefore,
(d^2/dt^2) M_X(t) = n(n - 1)(q + p e^t)^(n-2) (p e^t)^2 + n (q + p e^t)^(n-1) p e^t
Accordingly, μ₂' = n(n - 1) p^2 + np and
Var(X) = μ₂ = μ₂' - (μ₁')^2 = n(n - 1) p^2 + np - n^2 p^2 = npq
C.g.f. of Binomial Distribution :
If X → B(n, p), then the cumulant generating function (c.g.f.) of X is
K_X(t) = log(M_X(t)) = n log(q + p e^t)
6.14 Recurrence Relation
The binomial p.m.f. is given by
P(x) = nCx p^x q^(n-x) ; x = 0, 1, ..., n
Hence, in order to calculate the probabilities one must evaluate nCx, which is a tedious job, especially when n and x are large. There is a chain relation between the successive probabilities, using which the calculations become easy. This relation is called the recurrence relation.
When X → B(n, p), observe that
P(x + 1) = nC(x+1) p^(x+1) q^(n-x-1) ; x = 0, 1, ..., n - 1
and P(x) = nCx p^x q^(n-x) ; x = 0, 1, ..., n
∴ P(x + 1)/P(x) = [nC(x+1)/nCx] (p/q) = [(n - x)/(x + 1)] (p/q)
i.e. P(x + 1) = [(n - x)/(x + 1)] (p/q) P(x), x = 0, 1, ..., n - 1   ... (i)
Relation (i) is called the recurrence relation. Using this, we get
P(1) = n (p/q) P(0), where P(0) = q^n
P(2) = [(n - 1)/2] (p/q) P(1), etc.
Remark : The recurrence relation between probabilities is used while fitting the binomial distribution to a given data.
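To illustrate how the recurrence relation is used in practice, the following Python sketch (an added example, with n = 6 and p = 1/3 chosen to match Example 6.2) builds all the binomial probabilities from P(0) = q^n without evaluating a single binomial coefficient.

# Illustration: binomial probabilities via the recurrence relation
n, p = 6, 1 / 3                      # assumed values (same as Example 6.2)
q = 1 - p
probs = [q**n]                       # P(0) = q^n
for x in range(n):
    # P(x + 1) = ((n - x) / (x + 1)) * (p / q) * P(x)
    probs.append((n - x) / (x + 1) * (p / q) * probs[x])
for x, pr in enumerate(probs):
    print(x, round(pr, 4))
print(round(sum(probs), 4))          # the probabilities add up to 1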

6.15 Fitting of Binomial Distribution
Suppose we have a frequency distribution (x_i, f_i) concerning a variable X which takes values 0, 1, 2, ..., n. We feel that the assumptions of the binomial distribution as given in Section 6.11 are satisfied. Accordingly, we would like to use the binomial distribution as a 'model' for the data. Using the binomial distribution as a model, we can determine probabilities of various events regarding the variable. This is known as fitting of a binomial distribution to the given data.
Fitting of a distribution to a data means estimating the parameters of the distribution on the basis of the data and computing probabilities and expected frequencies.
Following are the steps involved in fitting a binomial distribution to the frequency distribution (x_i, f_i), i = 1, ..., k.
Step 1 : Parameter n is known. It is taken as the last value of x_i with positive frequency. The parameter p is estimated by equating the mean of the binomial distribution (np) with the data mean. Hence,
p̂ = x̄/n, where x̄ = (Σ f_i x_i)/N and N = Σ f_i
[p̂ means the estimate of p, q̂ means the estimate of q, etc.]
Step 2 : Since the p.m.f. of X is
P(x) = nCx p^x q^(n-x) ; x = 0, 1, ..., n
= 0 ; otherwise,
P(0) = q^n is estimated by (q̂)^n.
Step 3 : The recurrence relation (see Section 6.14) is used to compute the further probabilities:
P(x + 1) = [(n - x)/(x + 1)] (p̂/q̂) P(x), x = 0, 1, ..., n - 1
Step 4 : Expected frequencies (E_x) are calculated as
E_x = N P(x)
If the observed frequencies (f_i) are quite close to the expected frequencies (E_x), the binomial model used is satisfactory. To ascertain this, a test called the 'Chi-square test' is employed, which is beyond the scope of this book.
The following example illustrates the above stated fitting procedure.
Example 6.7 : The following data give the number of seeds germinated in 100 rows of 5 seeds each. Fit a binomial distribution to the data and calculate the expected frequencies.
x :  0    1    2    3    4    5
f : 10   20   30   15   15   10
Solution : Let X = number of seeds germinated in a row of 5 seeds. Assuming that seeds germinate independently of each other and that the probability of germination of a seed (p) is constant for all seeds, we say X → B(n, p), where n = 5.
Step 1 : Now, x̄ = (Σ f_i x_i)/N = 235/100 = 2.35, N = 100.
p̂ = x̄/n = 2.35/5 = 0.47, ∴ q̂ = 0.53 and p̂/q̂ = 0.8868.
Step 2 : P(0) = P[X = 0] = (q̂)^n = (0.53)^5 = 0.0418.
Steps 3 and 4 : Using the recurrence relation,
x    (n - x)/(x + 1)    P(x)       E_x = N P(x)
0          5            0.0418         4.18
1          2            0.1853        18.53
2          1            0.3287        32.87
3          0.5          0.2915        29.15
4          0.2          0.1293        12.93
5          -           *0.0234         2.34
Total                   1.0000       100
*The last probability is obtained so that the total probability is 1.
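The four fitting steps can be mechanised as below. This Python sketch is an added illustration that reproduces the hand calculation of Example 6.7; note that the text rounds the last probability so that the total is exactly 1, whereas the sketch reports the raw recurrence values.

# Illustration: fitting a binomial distribution to the data of Example 6.7
x_vals = [0, 1, 2, 3, 4, 5]
freqs  = [10, 20, 30, 15, 15, 10]        # observed frequencies, N = 100
N = sum(freqs)
n = max(x_vals)                          # Step 1: n is the largest observed value
mean = sum(x * f for x, f in zip(x_vals, freqs)) / N    # sample mean = 2.35
p_hat = mean / n                         # equating np with the sample mean gives 0.47
q_hat = 1 - p_hat
probs = [q_hat**n]                       # Step 2: P(0) = q_hat^n
for x in range(n):                       # Step 3: recurrence relation
    probs.append((n - x) / (x + 1) * (p_hat / q_hat) * probs[x])
expected = [N * pr for pr in probs]      # Step 4: expected frequencies
for x, (f, e) in enumerate(zip(freqs, expected)):
    print(x, f, round(e, 2))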
6.16 Mode of Binomial Distribution (April 2012)
The mode of a distribution (variable) is that value of the variable for which the p.m.f. attains its maximum. In other words, if M is the mode, then the p.m.f. increases till M and decreases thereafter. Obviously, if the p.m.f. is increasing, then the ratio P(x)/P(x - 1) should be > 1, and vice-versa. We know that for X → B(n, p),
P(x + 1)/P(x) = [(n - x)/(x + 1)] (p/q), x = 0, 1, ..., n - 1   ... (1)
In what follows,
P(x)/P(x - 1) = [(n - (x - 1))/x] (p/q) = 1 + [(n + 1)p - x]/(xq)   ... (2)
Hence, from (2) we observe that
P(x)/P(x - 1) > 1 if x < (n + 1)p
< 1 if x > (n + 1)p
= 1 if x = (n + 1)p
Thus, the p.m.f. increases up to (n + 1)p and then decreases.
Case (i) : (n + 1)p is not an integer.
If (n + 1)p is not an integer, then the mode M = [(n + 1)p], where by [a] we mean the integral part of a. That is, [2.3] = 2, [3.8] = 3, etc.
Case (ii) : (n + 1)p is an integer, and the r.v. X takes this value.
In this case, P(x)/P(x - 1) = 1, where x = (n + 1)p.
∴ P(x) = P(x - 1) and both probabilities are maximum. Hence, the mode is not unique: both (n + 1)p and (n + 1)p - 1 are modes. The distribution is called bimodal. For instance, if X → B(7, 1/2), then both 3 and 4 are modes. On the other hand, if X → B(8, 1/2), then (n + 1)p = 9/2 = 4.5. Hence mode = 4.
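A short check of the mode rule (an added sketch with arbitrarily chosen test cases) compares (n + 1)p with a direct search over the p.m.f.

# Illustration: the mode rule for B(n, p)
from math import comb

def pmf(x, n, p):
    return comb(n, x) * p**x * (1 - p)**(n - x)

for n, p in [(7, 0.5), (8, 0.5), (10, 0.3)]:       # assumed test cases
    probs = [pmf(x, n, p) for x in range(n + 1)]
    modes = [x for x, pr in enumerate(probs) if abs(pr - max(probs)) < 1e-12]
    print(n, p, (n + 1) * p, modes)
# (7, 0.5): (n + 1)p = 4 is an integer, so both 3 and 4 are modes
# (8, 0.5): (n + 1)p = 4.5, so the mode is 4
# (10, 0.3): (n + 1)p = 3.3, so the mode is 3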
6.17 Recurrence Relation between Moments of Binomial Random Variable
Illustration 2 : Let X → B(n, p). Let μ'_r and μ_r denote the r-th raw moment and the r-th central moment respectively. Then
(i) μ'_(r+1) = pq (dμ'_r/dp) + np μ'_r, r = 0, 1, 2, ...
(ii) μ_(r+1) = pq [dμ_r/dp + nr μ_(r-1)], r = 1, 2, ...
Solution : (i) We know that
μ'_r = E(X^r) = Σ_{x=0}^{n} x^r P(x) = Σ_{x=0}^{n} x^r nCx p^x q^(n-x)   ... (1)
Differentiating both sides of (1) w.r.t. p (noting that q = 1 - p), we get
dμ'_r/dp = Σ_{x=0}^{n} x^r nCx [x p^(x-1) q^(n-x) - (n - x) p^x q^(n-x-1)]
= (1/p) Σ_{x=0}^{n} x^(r+1) P(x) - (1/q) Σ_{x=0}^{n} x^r (n - x) P(x)
Multiplying both sides by pq, we get
pq (dμ'_r/dp) = q μ'_(r+1) - p [n μ'_r - μ'_(r+1)]
= μ'_(r+1) - np μ'_r   (since p + q = 1)
∴ μ'_(r+1) = pq (dμ'_r/dp) + np μ'_r
(ii) Since μ_r is the r-th central moment of X,
μ_r = E(X - np)^r = Σ_{x=0}^{n} (x - np)^r nCx p^x q^(n-x), r = 0, 1, 2, ...   ... (2)
Note that the r.h.s. involves a product of three functions of p. Differentiating both sides of (2) w.r.t. p, we get
dμ_r/dp = Σ_{x=0}^{n} nCx [-nr (x - np)^(r-1) p^x q^(n-x) + x (x - np)^r p^(x-1) q^(n-x) - (n - x)(x - np)^r p^x q^(n-x-1)]
Multiplying both sides by pq, we get
pq (dμ_r/dp) = -npq r μ_(r-1) + q Σ (x - np)^r x P(x) - p Σ (x - np)^r (n - x) P(x)
= -npq r μ_(r-1) + Σ (x - np)^r P(x) (qx - pn + px)
= -npq r μ_(r-1) + Σ (x - np)^(r+1) P(x)
= -npq r μ_(r-1) + μ_(r+1)
∴ μ_(r+1) = pq (dμ_r/dp) + npq r μ_(r-1) = pq [dμ_r/dp + nr μ_(r-1)], r = 1, 2, ...
Deductions : Using these relations, it becomes easy to compute higher order moments. To see this, consider the recurrence relation for central moments:
μ_(r+1) = pq [dμ_r/dp + nr μ_(r-1)], r = 1, 2, ...
Now μ₀ = 1 and μ₁ = 0. Taking r = 1, we get
μ₂ = pq [dμ₁/dp + n μ₀] = npq
Further, taking r = 2,
μ₃ = pq [dμ₂/dp + 2n μ₁] = pq [n(q - p)] = npq(q - p)
and taking r = 3,
μ₄ = pq [dμ₃/dp + 3n μ₂] = pq [n(1 - 6pq) + 3n^2 pq] = npq [1 + 3(n - 2)pq], etc.

6.18 Nature of the Binomial Distribution
The coefficient of skewness is
γ₁ = μ₃/μ₂^(3/2) = npq(q - p)/(npq)^(3/2) = (q - p)/√(npq)
Thus, if q > p, then γ₁ > 0 and the distribution is positively skewed. On the other hand, if q < p, then γ₁ < 0 and the distribution is negatively skewed. Note that
q > p implies 1 - p > p, i.e. p < 1/2, and q < p implies p > 1/2.
Fig. 6.3 : p < 1/2, positively skewed (bar diagram of P(x))
Therefore, when p < 1/2, the distribution is positively skewed (Fig. 6.3) and when p > 1/2, the distribution is negatively skewed (Fig. 6.5). If p = q = 1/2, then γ₁ = 0 and the distribution is symmetric (Fig. 6.4).
Fig. 6.4 : p = 1/2, symmetric    Fig. 6.5 : p > 1/2, negatively skewed
The coefficient of kurtosis is given by
γ₂ = μ₄/μ₂^2 - 3 = npq[1 + 3(n - 2)pq]/(n^2 p^2 q^2) - 3 = (1 - 6pq)/(npq)
γ₂ > 0 if 6pq < 1 : leptokurtic
γ₂ < 0 if 6pq > 1 : platykurtic
γ₂ = 0 if 6pq = 1 : mesokurtic.
Example 6.8 : Let X → B(n, p).
(i) Comment on the following: E(X) = 7 and Var(X) = 12.
(ii) E(X) = 4 and s.d.(X) = √3. What are the values of n, p and q?
(iii) μ₁' = 5 and μ₃ = 0.5. Find Var(X).
(iv) If n = 10 and E(X) = 5, find p, γ₁ and γ₂. Comment on the nature of the distribution.
Solution : (i) np = 7, npq = 12 would give q = 12/7 > 1, which is impossible.
The statement is false.
(ii) np = 4, √(npq) = √3, i.e. npq = 3
∴ q = 3/4, p = 1/4 and n = 16.
(iii) np = 5, npq(q - p) = 0.5
∴ q(q - p) = 0.5/5 = 0.1
With p = 1 - q this gives 20q^2 - 10q - 1 = 0, so q = 0.585
Var(X) = npq = 5 × 0.585 = 2.925
(iv) n = 10, np = 5, so p = 1/2, q = 1/2 and γ₁ = 0
γ₂ = (1 - 6pq)/(npq) = (1 - 1.5)/2.5 = -0.2
The distribution is symmetric and platykurtic.
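The two coefficients are easy to tabulate. The sketch below is an added illustration; the (n, p) pairs are arbitrary and the first one reproduces part (iv) of Example 6.8.

# Illustration: skewness and kurtosis coefficients of B(n, p)
from math import sqrt

def binomial_shape(n, p):
    q = 1 - p
    gamma1 = (q - p) / sqrt(n * p * q)       # coefficient of skewness
    gamma2 = (1 - 6 * p * q) / (n * p * q)   # coefficient of (excess) kurtosis
    return gamma1, gamma2

for n, p in [(10, 0.5), (10, 0.3), (10, 0.7)]:   # assumed test cases
    g1, g2 = binomial_shape(n, p)
    print(n, p, round(g1, 3), round(g2, 3))
# (10, 0.5): gamma1 = 0, gamma2 = -0.2, i.e. symmetric and platykurtic
# (10, 0.3): gamma1 > 0, positively skewed
# (10, 0.7): gamma1 < 0, negatively skewed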
6.19 Additive Property
Theorem : Let X → B(n₁, p), Y → B(n₂, p), and let X and Y be independent. Then Z = X + Y → B(n₁ + n₂, p).
Proof : Let P₁(x) and P₂(y) denote the p.m.f.s of X and Y respectively.
P(Z = z) = Σ_{x=0}^{z} P(X = x) P(Y = z - x)
= Σ_{x=0}^{z} P₁(x) P₂(z - x)
= Σ_{x=0}^{z} n₁Cx p^x q^(n₁-x) × n₂C(z-x) p^(z-x) q^(n₂-z+x)
= p^z q^(n₁+n₂-z) Σ_{x=0}^{z} n₁Cx × n₂C(z-x)
= (n₁+n₂)Cz p^z q^(n₁+n₂-z), z = 0, 1, ..., n₁ + n₂
(using the identity Σ_x n₁Cx × n₂C(z-x) = (n₁+n₂)Cz)
which is the p.m.f. of B(n₁ + n₂, p).
∴ Z = X + Y → B(n₁ + n₂, p)
Remark : Note that in order to get the above property, 'p' must be the same for both the distributions.
Alternative method : Using the M.G.F.
The M.G.F. of X is given by M_X(t) = (q + p e^t)^n₁.
The M.G.F. of Y is given by M_Y(t) = (q + p e^t)^n₂.
Let Z = X + Y. Then, since X and Y are independent,
M_Z(t) = M_X(t) M_Y(t) = (q + p e^t)^n₁ (q + p e^t)^n₂ = (q + p e^t)^(n₁+n₂)
which is the M.G.F. of the B(n₁ + n₂, p) distribution. By the uniqueness property of the M.G.F.,
Z → B(n₁ + n₂, p)
follows binomial distribution with parameters n and P. Find
Example 6.9: Suppose X
probability distribution of Y=n-X.
X is,
Solution :X’B (n, p), hence M.G.F. of
Mx(t) = (q +pe)"
M, (0) = Ele=E(e Mxe ()
= e"Ee=e"
= e (q +pe
= [e (q +pe )]=(p + qey
= M.G.F. of B(n, q)
B (n, g).
Hence, by uniqueness property we conclude that n- Xfollows
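The additive property of Section 6.19 can also be verified numerically. The following sketch is an added illustration (the values n1 = 3, n2 = 4 and p = 0.4 are arbitrary): it convolves the two binomial p.m.f.s and compares the result with the p.m.f. of B(n1 + n2, p).

# Illustration: B(n1, p) + B(n2, p) gives B(n1 + n2, p)
from math import comb

def binom_pmf(x, n, p):
    return comb(n, x) * p**x * (1 - p)**(n - x)

n1, n2, p = 3, 4, 0.4                      # assumed values
for z in range(n1 + n2 + 1):
    # convolution: P(Z = z) = sum over x of P(X = x) P(Y = z - x)
    conv = sum(binom_pmf(x, n1, p) * binom_pmf(z - x, n2, p)
               for x in range(max(0, z - n2), min(n1, z) + 1))
    direct = binom_pmf(z, n1 + n2, p)      # p.m.f. of B(n1 + n2, p)
    print(z, round(conv, 6), round(direct, 6))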
6.20 Conditional Distribution of X given X + Y = n (April 2012)
Theorem : Let X → B(n₁, p) and Y → B(n₂, p), with X and Y independent. Then the conditional distribution of X given X + Y = n is
P[X = x | X + Y = n] = n₁Cx × n₂C(n-x) / (n₁+n₂)Cn, x = 0, 1, ..., n
Proof : Using the additive property, we get X + Y → B(n₁ + n₂, p). Consider
P[X = x | X + Y = n] = P[X = x, X + Y = n] / P[X + Y = n]
= P[X = x, Y = n - x] / P[X + Y = n]
= P[X = x] P[Y = n - x] / P[X + Y = n]   (since X and Y are independent)
= [n₁Cx p^x q^(n₁-x) × n₂C(n-x) p^(n-x) q^(n₂-n+x)] / [(n₁+n₂)Cn p^n q^(n₁+n₂-n)]
= n₁Cx × n₂C(n-x) / (n₁+n₂)Cn, x = 0, 1, ..., n
which is the p.m.f. of the hypergeometric distribution.
6.21 Model Sampling from Binomial Distribution
In many studies the statistician is interested in studying the behaviour of a characteristic by simulating observations on it. Such studies are helpful as they mimic the natural phenomenon.
For example, if a quality engineer knows beforehand that the probability of a defective article in a batch of, say, 100 is 0.02, then he would like to generate fictitious batches, each of size 100, which will have defect proportions around 0.02. This is enabled by generating observations from a binomial distribution with parameters n = 100 and p = 0.02. The generated samples will give him the number of defective articles in the batches. These data can be further analysed statistically.
The following procedure describes how to obtain a model (random) sample of size N from a B(n, p) distribution using MS-Excel.
Step 1 : Using MS-Excel, obtain the cumulative probabilities for X = 0, 1, 2, ..., n. The command for getting the cumulative probability for, say, X = x is =BINOMDIST(x, n, p, TRUE).
Step 2 : Select a random number 'y' between 0 and 1. The following command in MS-Excel can be used: =RAND().
Step 3 : Search the random number 'y' in the column of cumulative probabilities. Find the cumulative probability which is just bigger than or equal to y. In other words, if we denote the cumulative probabilities by C_i, then C_(i-1) < y ≤ C_i for some i. Consider the X value corresponding to the larger cumulative probability. That is, the value of X corresponding to C_i is an observation selected in the sample.
Step 4 : Repeat the procedure N times. You will get a random sample (X_1, X_2, ..., X_N) from B(n, p).
Example 6.10 : Draw a model sample of size 10 from a binomial distribution with parameters n = 8 and p = 0.4.
Table 6.1 : Model Sampling from B(n = 8, p = 0.4)
x    P(X ≤ x)        Sr. No.   Random Number   Random Sample
0    0.01679616      1         0.538939128     3
1    0.10637568      2         0.710906892     4
2    0.31539456      3         0.873977369     5
3    0.59408640      4         0.418837163     3
4    0.82632960      5         0.717595904     4
5    0.95019264      6         0.981714158     6
6    0.99148032      7         0.904301866     5
7    0.99934464      8         0.699300478     4
8    1.00000000      9         0.428971576     3
                     10        0.685586378     4
6.22 Hypergeometric Distribution (April 2012)


We have noted earlier that the binomial distribution is applied whenever we draw a random sample with replacement. This is because, in sampling with replacement, the probability of getting a 'success', p, remains the same at every draw. Also, the successive draws remain independent. Thus, the assumptions of the binomial experiment are satisfied. Now, consider the following situation.
A bag contains 4 red and 5 black balls. Suppose 3 balls are drawn at random from this bag without replacement and we are interested in the number of red balls drawn. Clearly, at the first draw, the probability of getting a red ball is 4/9. Now, suppose a red ball is selected at the first draw. Because it would be kept aside, the probability of getting a red ball at the second draw would be 3/8. Thus p does not remain constant. Also, the successive draws are not independent: the probability of getting a red ball in the second draw depends on which ball