
Introduction to Probability and Statistics
Course ID: MA2203

Lecture-6
Course Teacher: Dr. Manas Ranjan Tripathy

Department of Mathematics
National Institute of Technology, Rourkela
Syllabus

Mid-Semester: Axioms of probability (motivation, various types of definitions), some basic results on probability (Boole's and Bonferroni's inequalities), conditional probability, Bayes theorem, probability for independent events, random variables, types of random variables (discrete and continuous), cumulative distribution function, probability mass function, probability density function, mean, variance, standard deviation, moments (central and about the origin), moment generating function, special types of random variables; discrete: uniform, binomial, geometric, Poisson, hypergeometric; continuous: uniform, normal, gamma, exponential (one parameter).

After Mid-Semester: Two-dimensional random variables, joint CDF, joint PDF/PMF, marginal distribution, conditional distribution, calculating probabilities using two-dimensional random variables, sampling (with and without replacement), distribution of sample mean and variance in the case of the normal distribution, estimation: point estimation, method of moments, method of maximum likelihood, confidence intervals for mean and variance in the case of the normal distribution, testing of hypothesis (parameters of the normal distribution), goodness-of-fit chi-square test, regression and correlation analysis, rank correlation coefficient.

Moment Generating Function (MGF): Let X be a random variable defined on a sample space S. The MGF of X, denoted $M_X(t)$, is defined as

$$M_X(t) = E(e^{tX}) = \sum_j e^{t x_j}\, P(X = x_j), \quad \text{if } X \text{ is discrete},$$
$$M_X(t) = E(e^{tX}) = \int_{-\infty}^{\infty} e^{tx} f(x)\, dx, \quad \text{if } X \text{ is continuous}.$$

Expanding $e^{tX}$ as a power series,

$$M_X(t) = E(e^{tX}) = E\Big(1 + tX + \frac{t^2 X^2}{2!} + \cdots + \frac{t^r X^r}{r!} + \cdots\Big)
= 1 + t\,E(X) + \frac{t^2}{2!} E(X^2) + \cdots + \frac{t^r}{r!} E(X^r) + \cdots
= \sum_{r=0}^{\infty} \frac{t^r}{r!}\, \mu'_r,$$

where $\mu'_r = E(X^r)$ is the $r$th moment of $X$ about the origin. That is, the coefficient of $\frac{t^r}{r!}$ in $M_X(t)$ gives the $r$th moment about the origin, $\mu'_r$.

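As a small illustration (not part of the slides), the defining sum can be evaluated directly for an assumed discrete example, a fair six-sided die, and compared with a simulation-based estimate of $E(e^{tX})$. A minimal Python sketch, assuming numpy is available:

# Evaluate M_X(t) = sum_j exp(t*x_j) P(X = x_j) for a fair die (illustrative example,
# not from the slides) and compare with a Monte Carlo estimate of E(e^{tX}).
import numpy as np

values = np.arange(1, 7)          # x_j = 1, ..., 6
probs = np.full(6, 1 / 6)         # P(X = x_j) = 1/6
t = 0.5

mgf_sum = np.sum(np.exp(t * values) * probs)            # the defining sum
samples = np.random.default_rng(0).integers(1, 7, 200_000)
mgf_mc = np.mean(np.exp(t * samples))                   # simulation estimate of E(e^{tX})
print(mgf_sum, mgf_mc)                                  # the two values should be close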
Further, differentiating $M_X(t)$ with respect to $t$ $r$ times and putting $t = 0$, we get

$$\frac{d^r}{dt^r}\Big[M_X(t)\Big]_{t=0} = \Big[\mu'_r \frac{r!}{r!} + \mu'_{r+1}\, t + \mu'_{r+2}\, \frac{t^2}{2!} + \cdots\Big]_{t=0} = \mu'_r.$$

The moment generating function of X about any point $a$ is given by

$$M_X(t)\ (\text{about } X = a) = E\big(e^{t(X-a)}\big) = E\Big[1 + t(X - a) + \frac{t^2}{2!}(X - a)^2 + \cdots + \frac{t^r}{r!}(X - a)^r + \cdots\Big].$$

Since it generates the moments, it is known as the moment generating function.

The moment generating function, if it exists, is unique. That is, for a given random variable X we can find its moment generating function uniquely, and vice versa. In other words, the moment generating function characterizes the distribution function.

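A minimal sketch (not from the slides) of the derivative property, using sympy and assuming an Exponential(1) random variable, whose MGF is $1/(1 - t)$ for $t < 1$:

# Differentiate the MGF r times and set t = 0; this recovers the r-th raw moment.
# For Exp(1), E(X^r) = r!.
import sympy as sp

t = sp.symbols('t')
M = 1 / (1 - t)                               # MGF of Exp(1), valid for t < 1
for r in range(1, 5):
    moment = sp.diff(M, t, r).subs(t, 0)      # d^r/dt^r M_X(t) at t = 0
    print(r, moment)                          # prints r! = 1, 2, 6, 24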
Relation between moments about the origin ($\mu'_r$) and central moments ($\mu_r$):

$$\mu_k = E(X - \mu)^k = C(k,0)\,\mu'_k - C(k,1)\,\mu\,\mu'_{k-1} + C(k,2)\,\mu^2\,\mu'_{k-2} - \cdots + (-1)^k \mu^k,$$

$$\mu'_k = E(X^k) = E(X - \mu + \mu)^k = C(k,0)\,\mu_k + C(k,1)\,\mu\,\mu_{k-1} + C(k,2)\,\mu^2\,\mu_{k-2} + \cdots + \mu^k.$$
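As a quick numerical sanity check (not part of the slides) of the first relation, here is a short Python sketch for an assumed small discrete distribution; the support and probabilities are purely illustrative:

# Check mu_k = sum_{j=0}^{k} (-1)^j C(k, j) mu^j mu'_{k-j} against the direct
# definition mu_k = E(X - mu)^k for a toy distribution.
from math import comb

xs = [0, 1, 2, 3]              # assumed support
ps = [0.1, 0.2, 0.3, 0.4]      # assumed probabilities (sum to 1)

raw = lambda r: sum(p * x**r for x, p in zip(xs, ps))   # mu'_r = E(X^r)
mu = raw(1)                                             # mean

for k in range(2, 5):
    direct = sum(p * (x - mu)**k for x, p in zip(xs, ps))
    via_raw = sum((-1)**j * comb(k, j) * mu**j * raw(k - j) for j in range(k + 1))
    print(k, round(direct, 12), round(via_raw, 12))     # the two columns agree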
Exercises: (i) Let the random variable X have the pmf $P(X = r) = q^{r-1} p$, $r = 1, 2, 3, \ldots$, where $0 < p, q < 1$ and $p + q = 1$. Find the MGF of X and hence its mean and variance.

Ans:

$$M_X(t) = E(e^{tX}) = \sum_{r=1}^{\infty} e^{tr} q^{r-1} p
= \frac{p}{q} \sum_{r=1}^{\infty} (qe^t)^r
= \frac{p}{q}\, qe^t \sum_{r=1}^{\infty} (qe^t)^{r-1}
= p e^t \big[1 + qe^t + (qe^t)^2 + \cdots\big] = \frac{p e^t}{1 - qe^t},$$

valid for $qe^t < 1$.
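To finish exercise (i), a short symbolic sketch (not from the slides) differentiates the MGF just obtained; it assumes sympy and yields mean $1/p$ and variance $q/p^2$:

# Mean and variance of the geometric distribution from M_X(t) = p e^t / (1 - q e^t).
import sympy as sp

t, p = sp.symbols('t p', positive=True)
q = 1 - p
M = p * sp.exp(t) / (1 - q * sp.exp(t))

mu1 = sp.simplify(sp.diff(M, t, 1).subs(t, 0))    # E(X)
mu2 = sp.simplify(sp.diff(M, t, 2).subs(t, 0))    # E(X^2)
print(mu1)                                        # 1/p
print(sp.simplify(mu2 - mu1**2))                  # (1 - p)/p**2, i.e. q/p**2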
(ii) The probability density function of the random variable X is given by

$$f(x) = \frac{1}{2\theta} \exp\Big(-\frac{|x - \theta|}{\theta}\Big), \quad -\infty < x < \infty.$$

Find the moment generating function of X, and hence the mean and variance.

Ans:

$$M_X(t) = E(e^{tX}) = \int_{-\infty}^{\infty} \frac{1}{2\theta} \exp\Big(-\frac{|x - \theta|}{\theta}\Big) e^{tx}\, dx$$
$$= \int_{-\infty}^{\theta} \frac{1}{2\theta} \exp\Big(-\frac{\theta - x}{\theta}\Big) e^{tx}\, dx + \int_{\theta}^{\infty} \frac{1}{2\theta} \exp\Big(-\frac{x - \theta}{\theta}\Big) e^{tx}\, dx$$
$$= \frac{1}{2\theta e} \int_{-\infty}^{\theta} \exp\Big\{x\Big(t + \frac{1}{\theta}\Big)\Big\}\, dx + \frac{e}{2\theta} \int_{\theta}^{\infty} \exp\Big\{-x\Big(\frac{1}{\theta} - t\Big)\Big\}\, dx$$
$$= \frac{e^{\theta t}}{2(\theta t + 1)} + \frac{e^{\theta t}}{2(1 - \theta t)} = \frac{e^{\theta t}}{1 - \theta^2 t^2} = e^{\theta t}\,(1 - \theta^2 t^2)^{-1}, \quad \text{for } |\theta t| < 1.$$
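A quick numerical check (not from the slides) of the closed form just derived, using scipy's quad with assumed sample values $\theta = 2$, $t = 0.3$ (so that $|\theta t| < 1$):

# Compare the integral definition of M_X(t) with exp(theta*t)/(1 - theta^2 t^2).
import numpy as np
from scipy.integrate import quad

theta, t = 2.0, 0.3                                    # illustrative values, |t| < 1/theta
f = lambda x: np.exp(-abs(x - theta) / theta) / (2 * theta)
numeric, _ = quad(lambda x: np.exp(t * x) * f(x), -np.inf, np.inf)
closed_form = np.exp(theta * t) / (1 - theta**2 * t**2)
print(numeric, closed_form)                            # the two values should agree closely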
Hence

$$M_X(t) = e^{\theta t}(1 - \theta^2 t^2)^{-1} = \Big(1 + \theta t + \frac{\theta^2 t^2}{2!} + \cdots\Big)\big(1 + \theta^2 t^2 + \theta^4 t^4 + \cdots\big) = 1 + \theta t + \frac{3\theta^2 t^2}{2!} + \cdots$$

The mean of X is the coefficient of $t$ in $M_X(t)$, which is $\theta$; that is, $\mu'_1 = \theta$. Also $\mu'_2$ is the coefficient of $\frac{t^2}{2!}$ in $M_X(t)$, which is $3\theta^2$. Hence the variance of X is $V(X) = \mu'_2 - (\mu'_1)^2 = 3\theta^2 - \theta^2 = 2\theta^2$.
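The series step can also be checked symbolically; a minimal sketch (not from the slides), assuming sympy:

# Expand M_X(t) = exp(theta*t)/(1 - theta^2 t^2) and read off the first two moments.
import sympy as sp

t, theta = sp.symbols('t theta', positive=True)
M = sp.exp(theta * t) / (1 - theta**2 * t**2)

print(sp.series(M, t, 0, 3))                          # 1 + theta*t + 3*theta**2*t**2/2 + O(t**3)
mu1 = sp.diff(M, t, 1).subs(t, 0)                     # theta
mu2 = sp.diff(M, t, 2).subs(t, 0)                     # 3*theta**2
print(sp.simplify(mu2 - mu1**2))                      # 2*theta**2, the variance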
(iii) If the moments of a random variable X are given by $E(X^r) = 0.6$, $r = 1, 2, 3, \ldots$, show that $P(X = 0) = 0.4$, $P(X = 1) = 0.6$, $P(X \geq 2) = 0$.

Ans: The moment generating function is given by

$$M_X(t) = \sum_{r=0}^{\infty} \frac{t^r}{r!}\, \mu'_r = 1 + \sum_{r=1}^{\infty} (0.6)\,\frac{t^r}{r!}
= 0.4 + 0.6 + \sum_{r=1}^{\infty} (0.6)\,\frac{t^r}{r!}
= 0.4 + 0.6\Big[\sum_{r=0}^{\infty} \frac{t^r}{r!}\Big]
= 0.4 + 0.6\, e^t.$$

But we also have

$$M_X(t) = E(e^{tX}) = \sum_{x=0}^{\infty} e^{tx}\, P(X = x) = P(X = 0) + e^t\, P(X = 1) + \sum_{x=2}^{\infty} e^{tx}\, P(X = x).$$

Comparing these two expressions (the MGF, when it exists, is unique), we get the required result, that is, $P(X = 0) = 0.4$, $P(X = 1) = 0.6$ and $P(X \geq 2) = 0$.

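Finally, a quick check (not from the slides) that the distribution found in exercise (iii) really does have every raw moment equal to 0.6:

# For P(X=0)=0.4, P(X=1)=0.6: E(X^r) = 0*0.4 + 1*0.6 = 0.6 for every r >= 1.
probs = {0: 0.4, 1: 0.6}
for r in range(1, 6):
    moment = sum((x ** r) * p for x, p in probs.items())
    print(r, moment)     # always 0.6, since 0**r = 0 and 1**r = 1 for r >= 1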
