Mathematical Expectation and Others


Mathematical Expectation

If 𝑋 is a discrete random variable with the probability function 𝑓(𝑥), then the expected value or
the mathematical expectation of 𝑋 , 𝐸(𝑋) is defined as
𝐸(𝑋) = ∑𝑥 𝑥𝑓(𝑥).
If 𝑋 is continuous with density function 𝑓(𝑥), then

𝐸(𝑋) = ∫_{−∞}^{∞} 𝑥𝑓(𝑥)𝑑𝑥.

*If C is a constant, 𝐸(𝐶) = 𝐶.

*𝐸(𝐶 × 𝑔(𝑋)) = 𝐶 × 𝐸(𝑔(𝑋)), since 𝐶 is a constant.

*𝐸[𝑊1 (𝑋) + 𝑊2 (𝑋) + ⋯ + 𝑊𝑘 (𝑋)] = 𝐸[𝑊1 (𝑋)] + 𝐸[𝑊2 (𝑋)] + ⋯ + 𝐸[𝑊𝑘 (𝑋)]
*𝐸[𝑊(𝑋)] = ∑_x 𝑊(𝑥)𝑓(𝑥), if 𝑋 is discrete
          = ∫_{−∞}^{∞} 𝑊(𝑥)𝑓(𝑥)𝑑𝑥, if 𝑋 is continuous

*The Variance of a random variable 𝑋 is


𝑉(𝑋) = 𝐸[(𝑋 − 𝜇)2 ]
= 𝐸(𝑋 2 ) − 𝜇 2 ,
where 𝜇 is the expected value.

* Example: A discrete random variable 𝑋 has the following probability distribution:
  𝑋          :  −3    −2     0     1     2
  𝑃(𝑋) = 𝑓(𝑥) :  .10   .30   .15   .40   .05
Find 𝐸(𝑋) and 𝑉(𝑋).
Solution:

𝜇 = 𝐸(𝑋) = ∑_x 𝑥𝑓(𝑥)
  = (−3 × .10) + (−2 × .30) + (0 × .15) + (1 × .40) + (2 × .05)
  = −0.4
𝑉(𝑋) = 𝐸(𝑋 − 𝜇)² = ∑_x (𝑥 − 𝜇)² 𝑓(𝑥)
  = (−3 + .4)² × .10 + (−2 + .4)² × .30 + ⋯ + (2 + .4)² × .05
  = 2.54
Or, 𝐸(𝑋²) = ∑_x 𝑥² 𝑓(𝑥)
  = (−3)² × .10 + (−2)² × .30 + ⋯ + 2² × .05
  = 2.7

∴ 𝑉(𝑋) = 𝐸(𝑋²) − 𝜇² = 2.7 − (−.4)² = 2.54

□ Standard deviation (𝜎) is the square root of the variance, i.e.

𝜎 = √𝑉(𝑋) = √2.54 = 1.59.
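As a quick numerical check, the same 𝐸(𝑋), 𝑉(𝑋) and 𝜎 can be computed directly from the probability table. This is a minimal Python sketch (not part of the original notes; it assumes numpy is available):

```python
# Verify E(X), V(X) and the standard deviation for the table above.
import numpy as np

x = np.array([-3, -2, 0, 1, 2])
f = np.array([0.10, 0.30, 0.15, 0.40, 0.05])   # probabilities sum to 1

mu  = np.sum(x * f)          # E(X)   = -0.4
ex2 = np.sum(x**2 * f)       # E(X^2) =  2.7
var = ex2 - mu**2            # V(X)   =  2.54
sd  = np.sqrt(var)           # sigma  ≈  1.59
print(mu, ex2, var, sd)
```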

*A life insurance company in Bangladesh offers to sell a TK.25000 one-year term life insurance
policy to a 25 year-old man for a premium of TK. 2500. According to Bangladesh life table, the
probability of surviving one year for a 25 year old man is 0.97. What is the company’s expected
gain in the long run?
Solution:
The gain 𝑋 is a random variable that may take the value Tk. 2500 if the man survives, or 2500 − 25000 = −Tk. 22500 if he dies. The probability distribution of 𝑋 is therefore
𝑋    :  2500   −22500
𝑓(𝑥) :   .97     .03
So, 𝐸(𝑋) = (2500 × .97) + (−22500 × .03) = 1750.
Thus the company's expected gain in the long run is Tk. 1750.

*Find the mean (expected value) and variance of the following density function
𝑓(𝑥) = 2(1 − 𝑥), 0 < 𝑥 < 1.

Solution:
𝐸(𝑋) = ∫₀¹ 𝑥 · 2(1 − 𝑥)𝑑𝑥 = 2[∫₀¹ 𝑥 𝑑𝑥 − ∫₀¹ 𝑥² 𝑑𝑥] = 1/3
𝐸(𝑋²) = ∫₀¹ 𝑥² · 2(1 − 𝑥)𝑑𝑥 = 1/6
∴ 𝑉(𝑋) = 𝜎² = 𝐸(𝑋²) − [𝐸(𝑋)]² = 1/6 − (1/3)² = 1/18
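The two integrals can be checked numerically. A minimal sketch, assuming scipy is available (verification only):

```python
# Numerical check of E(X) = 1/3 and V(X) = 1/18 for f(x) = 2(1 - x) on (0, 1).
from scipy.integrate import quad

f = lambda x: 2 * (1 - x)                      # density on (0, 1)
ex,  _ = quad(lambda x: x * f(x),    0, 1)     # E(X)   ≈ 0.3333
ex2, _ = quad(lambda x: x**2 * f(x), 0, 1)     # E(X^2) ≈ 0.1667
print(ex, ex2, ex2 - ex**2)                    # variance ≈ 0.0556 = 1/18
```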
□ Expected value of a function of two random variables:
𝐸[𝑊(𝑋, 𝑌)] = ∑_x ∑_y 𝑊(𝑥, 𝑦)𝑓(𝑥, 𝑦), if 𝑋 and 𝑌 are discrete
           = ∬_{−∞}^{∞} 𝑊(𝑥, 𝑦)𝑓(𝑥, 𝑦)𝑑𝑥𝑑𝑦, if 𝑋 and 𝑌 are continuous

*𝐸(𝑎𝑋 + 𝑏𝑌) = 𝐸(𝑎𝑋) + 𝐸(𝑏𝑌) = 𝑎𝐸(𝑋) + 𝑏𝐸(𝑌)

*𝐸[∑𝑘𝑖=1 𝑎𝑖 𝑋𝑖 ] = ∑𝑘𝑖=1 𝑎𝑖 𝐸(𝑋𝑖 )


* If X and Y are independent, then
i. f(x,y)=f(x)f(y)
ii. E(X.Y)=E(X)E(Y)
iii. E[g(x)h(y)]=E[g(x)]E[h(y)]

* Consider the following joint distribution of X and Y and find E(X), E(Y), E(X+Y) and E(X.Y).

X\Y -3 2 4 Total
1 .1 .2 .2 .5
3 .3 .1 .1 .5
Total .4 .3 .3 1

Solution:
E(X) = ∑_x ∑_y 𝑥𝑓(𝑥, 𝑦) = ∑_x 𝑥𝑔(𝑥), where g(x) is the marginal distribution of X.

Thus,
E(X) = 𝑥1 𝑔(𝑥1 ) + 𝑥2 𝑔(𝑥2 )
=(1× 0.5) + (3 × .5)
=2
Similarly,
E(Y)=∑𝑦 𝑦ℎ(𝑦) = (−3 × .4) + (2 × .3 ) + (4 × .3) = .6

𝐸(𝑋 + 𝑌) = ∑_x ∑_y (𝑥 + 𝑦)𝑓(𝑥, 𝑦) = (1 − 3) × .1 + (1 + 2) × .2 + (1 + 4) × .2 + ⋯ + (3 + 4) × .1 = 2.6

Equivalently, the distribution of X + Y is
x + y  :  −2    0    3    5    7
f(x,y) :  .1   .3   .2   .3   .1
∴ 𝐸(𝑋 + 𝑌) = (−2 × .1) + (0 × .3) + ⋯ + (7 × .1) = 2.6
𝑂𝑟, 𝐸(𝑋 + 𝑌) = 𝐸(𝑋) + 𝐸(𝑌) = 2 + .6 = 2.6
Now,
xy : -9 -3 2 4 6 12
f(x,y): .3 .1 .2 .2 .1 .1
∴ 𝐸(𝑋𝑌) = (−9 × .3) + (−3 × .1) + ⋯ + (12 × .1) = 0
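The four expectations can also be read directly off the joint table. A minimal Python sketch (not part of the notes; the dictionary layout of f(x, y) is just one convenient representation):

```python
# Check E(X), E(Y), E(X+Y) and E(XY) for the joint distribution above.
xs, ys = [1, 3], [-3, 2, 4]
f = {(1, -3): .1, (1, 2): .2, (1, 4): .2,
     (3, -3): .3, (3, 2): .1, (3, 4): .1}

E = lambda w: sum(w(x, y) * f[(x, y)] for x in xs for y in ys)
print(E(lambda x, y: x))         # E(X)   ≈ 2.0
print(E(lambda x, y: y))         # E(Y)   ≈ 0.6
print(E(lambda x, y: x + y))     # E(X+Y) ≈ 2.6
print(E(lambda x, y: x * y))     # E(XY)  ≈ 0.0
```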

* Consider the joint density
𝑓(𝑥, 𝑦) = (1/8)(6 − 𝑥 − 𝑦),  0 < 𝑥 < 2,  2 < 𝑦 < 4.

Find E(X), E(Y), E(XY). Are X and Y independent?


Solution:
The marginal density of X is
𝑔(𝑥) = ∫₂⁴ 𝑓(𝑥, 𝑦)𝑑𝑦 = ∫₂⁴ (1/8)(6 − 𝑥 − 𝑦)𝑑𝑦 = (1/4)(3 − 𝑥), 0 < 𝑥 < 2
∴ 𝐸(𝑋) = ∫₀² 𝑥𝑔(𝑥)𝑑𝑥 = ∫₀² 𝑥 · (1/4)(3 − 𝑥)𝑑𝑥 = 5/6
Similarly, the marginal density of Y is
ℎ(𝑦) = ∫₀² 𝑓(𝑥, 𝑦)𝑑𝑥 = ∫₀² (1/8)(6 − 𝑥 − 𝑦)𝑑𝑥 = (1/4)(5 − 𝑦), 2 < 𝑦 < 4
∴ 𝐸(𝑌) = ∫₂⁴ 𝑦ℎ(𝑦)𝑑𝑦 = ∫₂⁴ 𝑦 · (1/4)(5 − 𝑦)𝑑𝑦 = 17/6
𝐸(𝑋𝑌) = ∫₀² ∫₂⁴ 𝑥𝑦𝑓(𝑥, 𝑦)𝑑𝑦𝑑𝑥 = ∫₀² ∫₂⁴ 𝑥𝑦 · (1/8)(6 − 𝑥 − 𝑦)𝑑𝑦𝑑𝑥 = 7/3
Now, E(X)E(Y) = (5/6) × (17/6) = 85/36 ≠ 𝐸(𝑋𝑌)

Therefore, X and Y are not independent.
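The three integrals above can be verified numerically with scipy's dblquad (a verification sketch only, assuming scipy is installed; dblquad integrates func(y, x) with x over the outer limits and y over the inner ones):

```python
# Check E(X) = 5/6, E(Y) = 17/6, E(XY) = 7/3 for f(x, y) = (1/8)(6 - x - y).
from scipy.integrate import dblquad

f = lambda x, y: (6 - x - y) / 8.0   # joint density on 0 < x < 2, 2 < y < 4

ex,  _ = dblquad(lambda y, x: x * f(x, y),     0, 2, lambda x: 2, lambda x: 4)
ey,  _ = dblquad(lambda y, x: y * f(x, y),     0, 2, lambda x: 2, lambda x: 4)
exy, _ = dblquad(lambda y, x: x * y * f(x, y), 0, 2, lambda x: 2, lambda x: 4)
print(ex, ey, exy)    # ≈ 0.8333 (5/6), 2.8333 (17/6), 2.3333 (7/3)
print(ex * ey)        # ≈ 2.3611 (85/36) ≠ E(XY), so X and Y are not independent
```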


Moment Generating Function
Let X be a random variable with the probability density function f(x). Then the function 𝑀𝑋 (𝑡),
called the moment generating function (MGF) of the random variable X is defined by
𝑀_X(𝑡) = 𝐸(𝑒^{𝑡𝑋}) = ∑_x 𝑒^{𝑡𝑥} 𝑓(𝑥), if 𝑋 is discrete
        = ∫_{−∞}^{∞} 𝑒^{𝑡𝑥} 𝑓(𝑥)𝑑𝑥, if 𝑋 is continuous.

*Moment about an arbitrary point:


rth moment about an arbitrary point “a”:
𝑉𝑟 = 𝐸(𝑋 − 𝑎)𝑟
Moment about the mean:
rth moment about the mean is: (corrected moment)
𝜇𝑟 = 𝐸(𝑋 − 𝜇)𝑟
𝜇2 = 𝐸(𝑋 − 𝜇)2 = 𝜎 2
Moments help to distinguish between different distributions.

Moment about origin:


𝜇_r′ = 𝐸(𝑋^r)
𝜇1′ = 𝐸(𝑋) = 𝜇
So, 1st raw moment is the mean, 2nd corrected moment is variance.

*Let's see how 𝑀_X(𝑡) works:

𝑒^x = 1 + 𝑥 + 𝑥²/2! + ⋯

𝑀_X(𝑡) = 𝐸(𝑒^{𝑡𝑋}) = 𝐸[1 + 𝑡𝑋 + (𝑡𝑋)²/2! + (𝑡𝑋)³/3! + ⋯ + (𝑡𝑋)^r/r! + ⋯]
       = 1 + 𝑡𝐸(𝑋) + (𝑡²/2!)𝐸(𝑋²) + ⋯ + (𝑡^r/r!)𝐸(𝑋^r) + ⋯
       = 1 + 𝑡𝜇₁′ + (𝑡²/2!)𝜇₂′ + ⋯ + (𝑡^r/r!)𝜇_r′ + ⋯
       = ∑_{r=0}^{∞} (𝑡^r/r!) 𝜇_r′

So, the coefficient of 𝑡^r/r! is the r-th raw moment:
coefficient of 𝑡/1!   : 𝜇₁′
coefficient of 𝑡²/2!  : 𝜇₂′
⋮
coefficient of 𝑡^r/r! : 𝜇_r′

We can also get the moments from 𝑀_X(𝑡) by differentiating it r times with respect to t and then setting t = 0, i.e.,
𝜇_r′ = [d^r/dt^r 𝑀_X(𝑡)]_{t=0}
Thus, d/dt 𝑀_X(𝑡) = 𝜇₁′ + 𝜇₂′𝑡 + 𝜇₃′𝑡²/2! + ⋯
Setting t = 0 gives d/dt 𝑀_X(𝑡)|_{t=0} = 𝜇₁′ (= 𝐸(𝑋)).
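To see the mechanism concretely, the sketch below (using sympy, an assumption, and the discrete pmf from the first example of these notes) builds 𝑀_X(𝑡) = ∑ 𝑒^{𝑡𝑥}𝑓(𝑥) and differentiates it at t = 0 to recover the raw moments:

```python
# Recover mu'_1 and mu'_2 by differentiating the MGF of the earlier discrete example.
import sympy as sp

t = sp.symbols('t')
pmf = {-3: sp.Rational(1, 10), -2: sp.Rational(3, 10), 0: sp.Rational(3, 20),
       1: sp.Rational(2, 5),   2: sp.Rational(1, 20)}

M = sum(p * sp.exp(t * x) for x, p in pmf.items())   # M_X(t) = E(e^{tX})
mu1 = sp.diff(M, t, 1).subs(t, 0)    # mu'_1 = E(X)   = -2/5
mu2 = sp.diff(M, t, 2).subs(t, 0)    # mu'_2 = E(X^2) = 27/10
print(mu1, mu2, mu2 - mu1**2)        # variance = 127/50 = 2.54, as found earlier
```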

* Let X be a random variable with MGF 𝑀_X(𝑡), and let 𝑌 = 𝑎𝑋, where a is any constant. Then
𝑀_Y(𝑡) = 𝑀_X(𝑎𝑡)
Proof: 𝑀𝑌 (𝑡) = 𝐸(𝑒 𝑌𝑡 ) = 𝐸(𝑒 𝑡𝑎𝑋 ) = 𝐸(𝑒 𝑎𝑡𝑋 ) = 𝑀𝑋 (𝑎𝑡).

* Let the MGF of 𝑋_i be 𝑀_{X_i}(𝑡), and 𝑌 = ∑_{i=1}^{n} 𝑋_i = 𝑋₁ + 𝑋₂ + ⋯ + 𝑋_n.

Then 𝑀_Y(𝑡) = ∏_{i=1}^{n} 𝑀_{X_i}(𝑡), if the 𝑋_i's are independent.

Proof:
𝑀_Y(𝑡) = 𝐸(𝑒^{𝑌𝑡}) = 𝐸(𝑒^{(𝑋₁+𝑋₂+⋯+𝑋_n)𝑡}) = 𝐸(𝑒^{𝑋₁𝑡} 𝑒^{𝑋₂𝑡} ⋯ 𝑒^{𝑋_n𝑡})
       = 𝐸(𝑒^{𝑋₁𝑡}) 𝐸(𝑒^{𝑋₂𝑡}) ⋯ 𝐸(𝑒^{𝑋_n𝑡})   [as the 𝑋_i's are independent]
       = 𝑀_{X₁}(𝑡) · 𝑀_{X₂}(𝑡) ⋯ 𝑀_{X_n}(𝑡) = ∏_{i=1}^{n} 𝑀_{X_i}(𝑡)
* 𝑓(𝑥) = 𝑒^{−𝑥}, for 𝑥 > 0
       = 0,      otherwise
Determine the MGF and hence the variance of X.
Solution:
For any real number t < 1,
𝑀_X(𝑡) = 𝐸(𝑒^{𝑡𝑋}) = ∫₀^∞ 𝑒^{𝑡𝑥} 𝑓(𝑥)𝑑𝑥
       = ∫₀^∞ 𝑒^{𝑡𝑥} 𝑒^{−𝑥}𝑑𝑥
       = ∫₀^∞ 𝑒^{−(1−𝑡)𝑥}𝑑𝑥
       = [𝑒^{−(1−𝑡)𝑥} / (−(1 − 𝑡))]₀^∞
       = 1/(1 − 𝑡),  for t < 1.

Now,
𝐸(𝑋) = d/dt 𝑀_X(𝑡)|_{t=0} = 1/(1 − 𝑡)²|_{t=0} = 1
𝐸(𝑋²) = d²/dt² 𝑀_X(𝑡)|_{t=0} = 2/(1 − 𝑡)³|_{t=0} = 2
So, 𝑉(𝑋) = 𝐸(𝑋²) − (𝐸(𝑋))² = 2 − 1² = 1
Or, expanding 𝑀_X(𝑡) in powers of t:
1/(1 − 𝑡) = 1 + 𝑡 + 𝑡² + ⋯ + 𝑡^r + ⋯ = 1 + 1 · (𝑡/1!) + 2! · (𝑡²/2!) + 3! · (𝑡³/3!) + ⋯
So, 𝐸(𝑋^r) = r!
Hence E(X) = 1 and E(X²) = 2, as before.
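Both routes can be checked in a few lines. A sketch (scipy and sympy assumed; the value t = 0.3 is arbitrary, any t < 1 works):

```python
# Verify M_X(t) = 1/(1 - t) for f(x) = e^{-x}, x > 0, and recover the moments.
from scipy.integrate import quad
import numpy as np
import sympy as sp

t_val = 0.3
mgf_num, _ = quad(lambda x: np.exp(t_val * x) * np.exp(-x), 0, np.inf)
print(mgf_num, 1 / (1 - t_val))          # both ≈ 1.4286

t = sp.symbols('t')
M = 1 / (1 - t)
ex  = sp.diff(M, t, 1).subs(t, 0)        # E(X)   = 1
ex2 = sp.diff(M, t, 2).subs(t, 0)        # E(X^2) = 2
print(ex, ex2, ex2 - ex**2)              # V(X)   = 1
```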
Cumulant Generating Function
If the logarithm of the moment generating function of a distribution can be expanded as

𝐾_X(𝑡) = ln(𝑀_X(𝑡)) = 𝑘₁𝑡 + 𝑘₂𝑡²/2! + ⋯ + 𝑘_r 𝑡^r/r! + ⋯
Then the coefficient 𝑘𝑟 is called the 𝑟 𝑡ℎ cumulant of the distribution and 𝐾𝑥 (𝑡) is called the
cumulant generating function (KGF).

The r-th cumulant 𝑘_r can be obtained by differentiating 𝐾_X(𝑡) r times with respect to t and setting t = 0. Thus,
𝑘_r = [d^r/dt^r ln(𝑀_X(𝑡))]_{t=0}

*Relationship between moments and cumulants:

𝑘₁ = 𝜇₁′
𝑘₂ = 𝜇₂′ − (𝜇₁′)² = 𝜇₂
𝑘₃ = 𝜇₃′ − 3𝜇₂′𝜇₁′ + 2(𝜇₁′)³ = 𝜇₃
𝑘₄ = ⋯ = 𝜇₄ − 3𝜇₂²

--------------
Relationship between raw moments and corrected (central) moments:
𝜇₁ = 0
𝜇₂ = 𝜇₂′ − (𝜇₁′)²
𝜇₃ = 𝜇₃′ − 3𝜇₂′𝜇₁′ + 2(𝜇₁′)³
𝜇₄ = 𝜇₄′ − 4𝜇₃′𝜇₁′ + 6𝜇₂′(𝜇₁′)² − 3(𝜇₁′)⁴
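These relations can be verified on the discrete example used earlier: expanding 𝐾_X(𝑡) = ln 𝑀_X(𝑡) and differentiating at t = 0 should give 𝑘₁ = 𝜇₁′ = −0.4 and 𝑘₂ = 𝜇₂ = 2.54. A sympy sketch (an assumption, for verification only):

```python
# Read the first cumulants of the earlier discrete pmf off K_X(t) = ln M_X(t).
import sympy as sp

t = sp.symbols('t')
pmf = {-3: sp.Rational(1, 10), -2: sp.Rational(3, 10), 0: sp.Rational(3, 20),
       1: sp.Rational(2, 5),   2: sp.Rational(1, 20)}
M = sum(p * sp.exp(t * x) for x, p in pmf.items())
K = sp.log(M)                                   # cumulant generating function

k1 = sp.simplify(sp.diff(K, t, 1).subs(t, 0))   # = mu'_1           = -2/5
k2 = sp.simplify(sp.diff(K, t, 2).subs(t, 0))   # = mu_2 (variance) = 127/50
k3 = sp.simplify(sp.diff(K, t, 3).subs(t, 0))   # = mu_3 (third central moment)
print(k1, k2, k3)
```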
Binomial Distribution
When an experiment has two possible outcomes, success and failure and the experiment is repeated
𝑛 times independently and the probability 𝑃 of success of any given trial remains constant from
trial to trial, the experiment is known as binomial experiment.

If X is a binomial random variable, then X ~ Binomial(𝑛, 𝑝), with probability mass function

f(x) = P(X = x) = nCx · 𝑝^x (1 − 𝑝)^{n−x},  𝑥 = 0, 1, 2, …, 𝑛

where,
n: number of trials
p: probability of success in a single trial

Moments of the Binomial Distribution:

𝜇₁′ = E(X) = ∑_{x=0}^{n} 𝑥𝑓(𝑥) = ∑_{x=0}^{n} 𝑥 · nCx 𝑝^x (1 − 𝑝)^{n−x} = ⋯ = 𝑛𝑝

𝜇₂′ = 𝐸(𝑋²) = ⋯ = 𝑛𝑝𝑞 + 𝑛²𝑝²,  where q = 1 − p

Example: The probability that a patient recovers from a disease is 0.9. What is the probability that exactly 5 of the next 7 patients will recover?

Solution:
We assume that the patients recover independently of one another, with p = 0.9 for each of the seven patients; so n = 7, p = 0.9.
(Each patient is a single Bernoulli trial: 1 = success (recover), 0 = failure (not recover); the total number of recoveries among the 7 patients is the binomial count.)

So, X~ Binomial (n=7, p=0.9)


P(X = x) = nCx 𝑝^x (1 − 𝑝)^{n−x}
P(X = 5) = 7C5 · (0.9)⁵ · (1 − 0.9)^{7−5} = 0.1240
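A quick check of this probability, and of the binomial moments np and np(1 − p), using scipy (an assumption, verification only):

```python
# P(X = 5) for X ~ Binomial(7, 0.9), plus the mean and variance.
from math import comb
from scipy.stats import binom

n, p = 7, 0.9
print(comb(n, 5) * p**5 * (1 - p)**2)      # direct formula ≈ 0.1240
print(binom.pmf(5, n, p))                  # same value via scipy
print(binom.mean(n, p), binom.var(n, p))   # np = 6.3, np(1 - p) = 0.63
```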

Example:
A traffic control officer reports that 75% of the trucks passing through a check post are from within
Dhaka city. What is the probability that at least 3 of the next 5 trucks are from out of the city?
Solution: Let X be the number of trucks, among the next 5, that are from outside Dhaka city. The probability of this event for any one truck is p = 1 − 0.75 = 0.25.
X~ Bin (n=5, p=0.25)

P(X ≥ 3) = ∑_{x=3}^{5} 5Cx 𝑝^x (1 − 𝑝)^{5−x}

 = 5C3 (0.25)³(0.75)² + 5C4 (0.25)⁴(0.75)¹ + 5C5 (0.25)⁵(0.75)⁰

 = 0.1035
Or, P(X ≥ 3) = 1 − P(X = 0) − P(X = 1) − P(X = 2)

 = 1 − 5C0 (0.25)⁰(0.75)⁵ − 5C1 (0.25)¹(0.75)⁴ − 5C2 (0.25)²(0.75)³

*Probability that more than 3 trucks are from out of the city:

P(X > 3) = P(X = 4) + P(X = 5)

 = 5C4 (0.25)⁴(0.75)¹ + 5C5 (0.25)⁵(0.75)⁰

 = 0.0156
*Probability that at most 2 trucks are from out of the city:
P(X ≤ 2) = P(X = 0) + P(X = 1) + P(X = 2) = 5C0 (0.25)⁰(0.75)⁵ + ⋯
*Probability that less than 2 trucks are out of the city:
P(X<2) = P(X=0) + P(X=1)
*P(1≤X<3) = P(X=1) + P(X=2)
*P(1<X<3) = P(X=2)
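All of the truck probabilities above follow the same pattern and can be checked with scipy.stats.binom (a verification sketch; scipy is an assumption):

```python
# X ~ Binomial(5, 0.25): the probabilities computed in the truck example.
from scipy.stats import binom

n, p = 5, 0.25
print(binom.sf(2, n, p))                        # P(X >= 3)     ≈ 0.1035
print(binom.pmf(4, n, p) + binom.pmf(5, n, p))  # P(X > 3)      ≈ 0.0156
print(binom.cdf(2, n, p))                       # P(X <= 2)     ≈ 0.8965
print(binom.pmf(0, n, p) + binom.pmf(1, n, p))  # P(X < 2)      ≈ 0.6328
print(binom.pmf(1, n, p) + binom.pmf(2, n, p))  # P(1 <= X < 3) ≈ 0.6592
```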
−O−
Mean of binomial random variable: E(X) = np
Variance of binomial random variable: V(X) = np(1-P)

If E(X) = 5 and V(X) = 2, then
np = 5
np(1 − p) = 2
Dividing: np(1 − p) / np = 2/5
or, 1 − p = 2/5
or, p = 1 − 2/5 = 3/5
Now, np = 5
or, n = 5/p = (5 × 5)/3 = 25/3
So X ~ Bin(n = 25/3, p = 3/5).
Poisson Distribution

Let µ be the mean of successes in a specified time or space and the random variable X is the
number of successes in a given time interval or specified region. Then X follows Poisson
distribution as
f(x) = e^{−µ} µ^x / x!,  x = 0, 1, 2, …

where e ≈ 2.718.

□ Poisson distribution is used for count data. [µ = mean = variance]

Example: The average number of calls received by a telephone operator during the 10-minute interval from 5:00 PM to 5:10 PM is 3 per day. What is the probability that the operator will receive,
i. no call
ii. exactly one call
iii. at least two calls tomorrow during the same time interval.

Soln: Let X be the random variable representing the number of calls received during the interval.
X ~ Poisson(3), so
f(x) = P(X = x) = e^{−3} 3^x / x!,  x = 0, 1, 2, …

i.   P(X = 0) = e^{−3} 3⁰ / 0! = 0.0498
ii.  P(X = 1) = e^{−3} · 3 / 1! = 0.1494
iii. P(X ≥ 2) = 1 − P(X = 0) − P(X = 1)
             = 1 − 0.0498 − 0.1494
             = 0.8008
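The three answers can be confirmed with scipy.stats.poisson (assumed available; sketch only):

```python
# X ~ Poisson(3): the call-centre probabilities.
from scipy.stats import poisson

mu = 3
print(poisson.pmf(0, mu))        # P(X = 0)  ≈ 0.0498
print(poisson.pmf(1, mu))        # P(X = 1)  ≈ 0.1494
print(1 - poisson.cdf(1, mu))    # P(X >= 2) ≈ 0.8009
```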
Mean:
E(X) = ∑_{x=0}^{∞} x · f(X = x) = ⋯ = 𝜇

Variance:
V(X) = E[X − E(X)]² = ⋯ = 𝜇
Poisson approximation to binomial distribution:
If 𝑛 becomes larger and p becomes smaller, then the Poisson distribution with
𝜇 = 𝑛𝑝 provides an approximation to the binomial distribution.

Example: The probability of breaking a glass beaker while heating in the laboratory is 0.012. If we heat
250 such glass beakers, find the probability that there will be exactly 2 breaks.

Solution:
Let X be the number of breakages.
X ~ Binomial(n = 250, p = 0.012)
Now, P(X = 2) = 250C2 (0.012)² (1 − 0.012)^{250−2}
             = 0.2245

Since n is large and p is small, we may obtain an approximate probability by using the Poisson
distribution with 𝜇 = 𝑛𝑝 = 250 × 0.012 = 3.

P(X = 2) ≈ e^{−3} · 3² / 2! = 0.2241
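Comparing the exact binomial probability with the Poisson approximation in code (scipy assumed; verification only):

```python
# Exact Binomial(250, 0.012) vs. Poisson(mu = np = 3) for P(X = 2).
from scipy.stats import binom, poisson

n, p = 250, 0.012
print(binom.pmf(2, n, p))        # exact binomial        ≈ 0.2245
print(poisson.pmf(2, n * p))     # Poisson approximation ≈ 0.2240
```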

 Find the mean and standard deviation of a Poisson variate X for which P(X=1) = P(X=2).
Solution:
Let X ~ Poisson(𝜇). Then
P(X = 1) = e^{−𝜇} 𝜇¹ / 1!
P(X = 2) = e^{−𝜇} 𝜇² / 2!
∴ e^{−𝜇} 𝜇 / 1! = e^{−𝜇} 𝜇² / 2!
⇒ 𝜇 = 2
So, the mean = 2 and the standard deviation = √𝜇 = √2 ≈ 1.41.
Normal Distribution

A random variable X is said to have a normal distribution with mean µ and variance 𝜎 2 (-∞ < µ <
∞ and 𝜎 2 > 0) if X has a continuous distribution for which the probability density function is
f(x) = (1/(σ√(2π))) e^{−(1/2)((x − µ)/σ)²},  -∞ < x < ∞

*Normal probability law:


P (µ-3σ < X < µ+3σ) = 99.73%

P (µ-2σ < X < µ+2σ) = 95.45%

P (µ-σ < X < µ+σ) = 68.27%
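These three probabilities follow from the standard normal cdf and do not depend on µ or σ. A short check (scipy assumed):

```python
# The normal probability law: P(mu - k*sigma < X < mu + k*sigma) for k = 1, 2, 3.
from scipy.stats import norm

for k in (1, 2, 3):
    prob = norm.cdf(k) - norm.cdf(-k)
    print(k, round(100 * prob, 2))     # 68.27, 95.45, 99.73 (percent)
```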

 Standard Normal Distribution:

If a random variable X has a normal distribution with mean µ and variance σ² (i.e. X ~ N(µ, σ²)), then the variable Z = (X − µ)/σ is called a standard normal variable (or Z score) and its distribution is referred to as the standard normal distribution, with density function

f(z) = (1/√(2π)) e^{−z²/2},  -∞ < z < ∞

E(Z) = E((X − µ)/σ) = (1/σ) E(X − µ) = (1/σ)[E(X) − µ]
     = (1/σ)(µ − µ)
     = 0

V(Z) = V((X − µ)/σ) = (1/σ²) V(X − µ) = (1/σ²) V(X)   [since µ is a constant]
     = (1/σ²) σ²
     = 1
Thus, Z ~ N (0,1)

The cumulative distribution function (cdf) of the standard normal variable Z is usually denoted by Φ(z). Thus

Φ(z) = ∫_{−∞}^{z} (1/√(2π)) e^{−u²/2} du

 The GPA score of 80 students of the Department of Physics of University of Dhaka in their
1st year final exam was found to follow approximately a normal distribution with mean 2.1
and standard deviation 0.6. How many of these students are expected to have a score
between 2.5 and 3.5?

Solⁿ: Let X be the GPA score of the 80 students.

X ~ N(2.1, 0.6²)

Now, P(2.5 < X < 3.5) = P((2.5 − 2.1)/0.6 < (X − µ)/σ < (3.5 − 2.1)/0.6)

 = P(0.67 < Z < 2.33)

 = P(Z < 2.33) − P(Z < 0.67)

 = 0.9901 − 0.7480 = 0.2421
Hence 24.21%, or approximately 0.2421 × 80 ≈ 19 students out of 80, are expected to score between 2.5 and 3.5.
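The same answer can be obtained directly from the normal cdf without standardising by hand (scipy assumed; small differences come from table rounding):

```python
# P(2.5 < X < 3.5) for X ~ N(2.1, 0.6^2), and the expected number of students.
from scipy.stats import norm

mu, sigma, n_students = 2.1, 0.6, 80
prob = norm.cdf(3.5, mu, sigma) - norm.cdf(2.5, mu, sigma)
print(prob)                # ≈ 0.2427 (the table rounding in the notes gives 0.2421)
print(prob * n_students)   # ≈ 19.4, i.e. about 19 students
```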

★ If X is a normal variate with mean 25 and variance 9, find K such that


i. 30% of the area under the normal curve lies to the left of K.
ii. 15% of the area under the normal curve lies to the right of K.
Solution:
X ~ N (25,9)
i.e. µ= 25, σ= 3
i. P(X<K)=0.30
=> P(Z<(𝐾 − 25)/3)= 0.30
The standard normal Table shows that
P(Z< -0.525)= 0.30
Hence
(𝐾 − 25)/3= -0.525
=> K= 23.425

ii. P(X > K) = 0.15

or, P(Z > (K − 25)/3) = 0.15
or, P(Z < (K − 25)/3) = 1 − 0.15 = 0.85
From the standard normal table, (K − 25)/3 ≈ 1.04
⇒ K ≈ 25 + 3 × 1.04 = 28.12
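Both parts amount to inverting the normal cdf, which norm.ppf does directly (scipy assumed; sketch only):

```python
# Find K for X ~ N(25, 3^2) such that 30% of the area is to its left (part i)
# and 15% is to its right (part ii).
from scipy.stats import norm

mu, sigma = 25, 3
print(norm.ppf(0.30, mu, sigma))    # part i:  K ≈ 23.43
print(norm.ppf(0.85, mu, sigma))    # part ii: K ≈ 28.11
```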

*Normal approximation to the binomial:

The normal approximation to the binomial distribution is used if np and np(1 − p) are both sufficiently large.

µ = np,  σ = √(np(1 − p))
If X ~ Bin(n, p), i.e. E(X) = np and V(X) = np(1 − p), then
Z = (X − np) / √(np(1 − p)) is approximately standard normal as n → ∞.

Continuity Correction
P(a ≤ X ≤ b) ≈ P(a − 0.5 < X′ < b + 0.5),
where X′ is the corresponding approximating normal variable.
*Let, X~Bin (n=10, P=0.5). Find P(2≤X≤4) using both
binomial distribution and normal distribution:

Solution:
Using the binomial distribution:
P(2 ≤ X ≤ 4) = ∑_{x=2}^{4} 10Cx (0.5)^x (1 − 0.5)^{10−x} = 0.3662
Using the normal approximation, with µ = np = 5 and σ² = np(1 − p) = 2.5:
P(2 ≤ X ≤ 4) ≈ P(2 − 0.5 ≤ X′ ≤ 4 + 0.5) = P(1.5 ≤ X′ ≤ 4.5)
 = P((1.5 − 5)/√2.5 ≤ Z ≤ (4.5 − 5)/√2.5)
 = P(−2.21 ≤ Z ≤ −0.316)
 = P(Z ≤ −0.316) − P(Z ≤ −2.21)
 = 0.3764 − 0.0136
 = 0.3628
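The whole comparison, exact binomial against the continuity-corrected normal approximation, in a few lines (scipy assumed; verification only):

```python
# P(2 <= X <= 4) for X ~ Binomial(10, 0.5): exact vs. normal approximation.
from math import sqrt
from scipy.stats import binom, norm

n, p = 10, 0.5
exact = binom.cdf(4, n, p) - binom.cdf(1, n, p)               # ≈ 0.3662
mu, sigma = n * p, sqrt(n * p * (1 - p))
approx = norm.cdf(4.5, mu, sigma) - norm.cdf(1.5, mu, sigma)  # ≈ 0.3625
print(exact, approx)
```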
The End
