
COSM: UNIT-II

Mathematical Expectation:
Let X be a discrete random variable that takes n values, X = {x1, x2, ..., xn}, with respective probabilities P = {p1, p2, ..., pn}. The mathematical expectation of X is given by

E(X) = Σ_{i=1}^{n} x_i p_i

Similarly, E(X²) = Σ_{i=1}^{n} x_i² p_i
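As a quick check of these formulas, here is a minimal Python sketch (the values and probabilities are illustrative choices, not from the notes) that computes E(X) and E(X²) for a small discrete distribution:

```python
# Minimal sketch: expectation of a discrete random variable.
# The values and probabilities below are illustrative, not from the notes.
values = [1, 2, 3, 4]
probs  = [0.1, 0.2, 0.3, 0.4]          # must sum to 1

E_X  = sum(x * p for x, p in zip(values, probs))        # E(X)   = sum x_i * p_i
E_X2 = sum(x**2 * p for x, p in zip(values, probs))     # E(X^2) = sum x_i^2 * p_i

print("E(X)   =", E_X)    # 3.0
print("E(X^2) =", E_X2)   # 10.0
```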

Joint Probability Mass Function:

Consider two variables X and Y, where X takes n values X = {x1, x2, ..., xn} and Y takes m values Y = {y1, y2, ..., ym}. The joint probability is denoted by p_ij = P[X = x_i ∩ Y = y_j].

For example, p12 means P[X = x1 ∩ Y = y2].

Properties:
1. Σ_{i=1}^{n} p_ij = p_j for a fixed j (the marginal probability of Y = y_j)
2. Σ_{j=1}^{m} p_ij = p_i for a fixed i (the marginal probability of X = x_i)
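A minimal Python sketch of these two marginal-sum properties, using an illustrative 2×2 joint probability table (the numbers are assumptions chosen for the example):

```python
# Minimal sketch: a joint probability mass function stored as a table,
# with the two marginal distributions from the properties above.
# The numbers are illustrative, not from the notes.
p = [[0.10, 0.20],   # row i = value x_i of X, column j = value y_j of Y
     [0.30, 0.40]]

n, m = len(p), len(p[0])

# Property 2: summing over j gives the marginal p_i = P(X = x_i)
p_X = [sum(p[i][j] for j in range(m)) for i in range(n)]
# Property 1: summing over i gives the marginal p_j = P(Y = y_j)
p_Y = [sum(p[i][j] for i in range(n)) for j in range(m)]

print(p_X)  # [0.3, 0.7]
print(p_Y)  # [0.4, 0.6]
```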

Mean and variance of linear combinations of random variables:

Mean:
1) E[aX] = a·E[X], where a ≠ 0 is a constant.
(The expectation of aX is equal to a times the expectation of X.)
Proof: Consider E[aX] = Σ_{i=1}^{n} a x_i p_i = a Σ_{i=1}^{n} x_i p_i = a·E[X]
Ex: E[2X] = 2E[X].

2) E[aX + b] = aE[X] + b, where a and b are constants.
Proof: E[aX + b] = Σ_{i=1}^{n} (a x_i + b) p_i
= a Σ_{i=1}^{n} x_i p_i + b Σ_{i=1}^{n} p_i
= a·E[X] + b   (since Σ_{i=1}^{n} p_i = 1)

3) E[X + Y] = E[X] + E[Y]
For any two discrete random variables X and Y, E[X + Y] = E[X] + E[Y].
Let X take the values {x1, x2, ..., xn}, Y take the values {y1, y2, ..., ym}, and let p_ij be the joint probability function.
Consider E[X + Y] = Σ_{i=1}^{n} Σ_{j=1}^{m} (x_i + y_j) p_ij
= Σ_{i=1}^{n} Σ_{j=1}^{m} x_i p_ij + Σ_{i=1}^{n} Σ_{j=1}^{m} y_j p_ij
= Σ_{i=1}^{n} x_i p_i + Σ_{j=1}^{m} y_j p_j   (since Σ_{j=1}^{m} p_ij = p_i and Σ_{i=1}^{n} p_ij = p_j)
∴ E[X + Y] = E[X] + E[Y]

4) E[X·Y] = E[X]·E[Y]
For any two independent discrete random variables X and Y, with X = {x1, x2, ..., xn} and Y = {y1, y2, ..., ym},
Consider E[X·Y] = Σ_{i=1}^{n} Σ_{j=1}^{m} x_i y_j p_ij
= Σ_{i=1}^{n} Σ_{j=1}^{m} x_i y_j p_i p_j   (since X and Y are independent, p_ij = p_i p_j)
= Σ_{i=1}^{n} x_i p_i · Σ_{j=1}^{m} y_j p_j
∴ E[X·Y] = E[X]·E[Y]
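Both the addition and the multiplication properties can be checked numerically. The sketch below assumes an illustrative pair of independent discrete distributions (the values and probabilities are chosen here, not taken from the notes):

```python
# Minimal sketch: numeric check of E[X+Y] = E[X]+E[Y] and, for independent
# X and Y, E[XY] = E[X]E[Y]. Values and probabilities are illustrative.
xs, px = [0, 1, 2], [0.2, 0.5, 0.3]
ys, py = [1, 3],    [0.6, 0.4]

E = lambda vals, probs: sum(v * p for v, p in zip(vals, probs))

# Independence: joint probability p_ij = p_i * q_j
E_sum  = sum((x + y) * px[i] * py[j] for i, x in enumerate(xs) for j, y in enumerate(ys))
E_prod = sum((x * y) * px[i] * py[j] for i, x in enumerate(xs) for j, y in enumerate(ys))

print(E_sum,  E(xs, px) + E(ys, py))   # both 2.9  (up to float rounding)
print(E_prod, E(xs, px) * E(ys, py))   # both 1.98 (up to float rounding)
```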


Variance:
1. For a random variable X, V[aX + b] = a² V(X), where a and b are constants.
Let Y = aX + b ......(1)
E[Y] = E[aX + b] = aE[X] + b ......(2)
(1) − (2) ⇒ Y − E[Y] = (aX + b) − (aE[X] + b)
⇒ Y − E[Y] = a(X − E[X])
Squaring on both sides: (Y − E[Y])² = a² (X − E[X])²
Taking expectation: E[(Y − E[Y])²] = a² E[(X − E[X])²]
V(Y) = a² V(X)
∴ V[aX + b] = a² V(X)
NOTE: 1. If a = 1, V[X + b] = V(X)
2. If a = 0, V(b) = 0
3. If b = 0, V[aX] = a² V(X)

2) For a continuous random variable X, E[aX + b] = aE[X] + b and V[aX + b] = a² V(X).
Proof: E[aX + b] = ∫_{−∞}^{∞} (ax + b) f(x) dx
= ∫_{−∞}^{∞} ax f(x) dx + ∫_{−∞}^{∞} b f(x) dx
= a ∫_{−∞}^{∞} x f(x) dx + b ∫_{−∞}^{∞} f(x) dx
E[aX + b] = aE[X] + b   (since ∫_{−∞}^{∞} f(x) dx = 1)
Let Y = aX + b ......(1)
E[Y] = E[aX + b] = aE[X] + b ......(2)
(1) − (2): Y − E[Y] = a(X − E[X])
E[(Y − E[Y])²] = a² E[(X − E[X])²]
Var(Y) = Var(aX + b) = a² Var(X)

3) For a random variable X and a constant k, Var[X + k] = Var(X).
Proof: Var[X + k] = E[(X + k)²] − (E[X + k])²
= E[X² + 2kX + k²] − (E[X] + k)²
= (E[X²] + 2kE[X] + k²) − ((E[X])² + 2kE[X] + k²)
= E[X²] − (E[X])²
∴ Var[X + k] = Var(X)
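A minimal numeric check of V(aX + b) = a² V(X); the distribution and the constants a, b are illustrative choices:

```python
# Minimal sketch: check V(aX + b) = a^2 V(X) on a small discrete distribution.
# a, b and the distribution are illustrative choices.
xs, ps = [0, 1, 2, 3], [0.1, 0.4, 0.3, 0.2]
a, b = 3, 5

E   = lambda vals: sum(v * p for v, p in zip(vals, ps))
var = lambda vals: E([v**2 for v in vals]) - E(vals)**2

vx = var(xs)
vy = var([a * x + b for x in xs])   # variance of Y = aX + b

print(vy, a**2 * vx)   # the two numbers agree (7.56)
```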

Covariance: Definition & Explanation

Covariance measures the relationship between two random variables and indicates whether they
move together (positive covariance) or in opposite directions (negative covariance).

Mathematically, the covariance between two variables X and Y is given by:

Cov(X, Y) = (1/n) Σ_{i=1}^{n} (X_i − X̄)(Y_i − Ȳ)
or
Cov(X, Y) = E(XY) − E(X)E(Y)

If X and Y are independent then Cov(X, Y) = 0.

Interpreting Covariance:

• Positive covariance (> 0): Both variables tend to increase or decrease together.
• Negative covariance (< 0): One variable increases while the other decreases.
• Zero covariance (= 0): No linear relationship between the variables.
Example 1: Positive Covariance

Scenario: Hours Studied vs. Exam Scores

Student Hours Studied (X) Exam Score (Y)


A 2 50
B 4 60
C 6 70
D 8 80

The covariance will be positive, indicating that as study hours increase, exam scores also tend to
increase.

Example 2: Negative Covariance

Scenario: Temperature vs. Sweater Sales

Day Temperature (X) Sweaters Sold (Y)


1 30°C 10
2 25°C 20
3 20°C 30
4 15°C 40

Here, as temperature decreases, sweater sales increase. This results in a negative covariance.

Example 3: Zero Covariance

Scenario: Shoe Size vs. Exam Score

If we collect data on shoe size and exam scores, there's no reason to expect any relationship. The
covariance would be close to zero.
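A small Python sketch applying the 1/n covariance formula above to the two example tables (hours vs. scores, temperature vs. sweater sales):

```python
# Minimal sketch: sample covariance (the 1/n formula above) applied to the
# small tables from Examples 1 and 2.
def cov(x, y):
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    return sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / n

hours, scores   = [2, 4, 6, 8], [50, 60, 70, 80]
temps, sweaters = [30, 25, 20, 15], [10, 20, 30, 40]

print(cov(hours, scores))    # 25.0  -> positive: they move together
print(cov(temps, sweaters))  # -62.5 -> negative: they move in opposite directions
```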

CHEBYSHEV'S INEQUALITY:

Statement: Suppose X is a random variable with mean m and variance σx². Then |x − m| is the measure of the amount by which the value x differs from the mean value m in either direction.
i.e., if X is a random variable with mean m and variance σx², then
P(|X − m| ≥ kσx) ≤ 1/k²   or   P(|X − m| < kσx) ≥ 1 − 1/k²

Proof: Let the random variable X have the continuous density function f(x). Then, from the definition of variance,
σx² = E[(X − E(X))²] = E[(X − m)²]
= ∫_{−∞}^{∞} (x − m)² f(x) dx
= ∫_{−∞}^{m−kσx} (x − m)² f(x) dx + ∫_{m−kσx}^{m+kσx} (x − m)² f(x) dx + ∫_{m+kσx}^{∞} (x − m)² f(x) dx

Since the middle integral is non-negative,
σx² ≥ ∫_{−∞}^{m−kσx} (x − m)² f(x) dx + ∫_{m+kσx}^{∞} (x − m)² f(x) dx ≥ 0

For the first integral, the upper bound for x gives x ≤ m − kσx, i.e. m − x ≥ kσx.
For the third integral, the lower bound for x gives x ≥ m + kσx, i.e. x − m ≥ kσx.
In both regions, (x − m)² = (m − x)² ≥ k²σx².

Therefore
σx² ≥ k²σx² [ ∫_{−∞}^{m−kσx} f(x) dx + ∫_{m+kσx}^{∞} f(x) dx ]
= k²σx² [ P(−∞ < X ≤ m − kσx) + P(m + kσx ≤ X < ∞) ]
= k²σx² [ P(X − m ≤ −kσx) + P(X − m ≥ kσx) ]
= k²σx² P(|X − m| ≥ kσx)

⇒ 1/k² ≥ P(|X − m| ≥ kσx), where k is any constant, k > 0.
Also P(|X − m| ≥ kσx) + P(|X − m| < kσx) = 1
⇒ P(|X − m| < kσx) = 1 − P(|X − m| ≥ kσx) ≥ 1 − 1/k²
Hence we have P(|X − m| < kσx) ≥ 1 − 1/k².

Problem 1: If X is the number scored in a throw of a fair die, show that Chebyshev's inequality gives P(|X − m| > 2.5) < 0.47, where m is the mean of X, while the actual probability is zero.
Sol: Let X be the random variable taking the values 1, 2, 3, 4, 5, 6, each with probability 1/6.

Hence mean = E(X) = Σ x p(x)
= (1 × 1/6) + (2 × 1/6) + (3 × 1/6) + (4 × 1/6) + (5 × 1/6) + (6 × 1/6)
= 7/2, so m = 3.5

E(X²) = Σ x² p(x)
= (1² × 1/6) + (2² × 1/6) + (3² × 1/6) + (4² × 1/6) + (5² × 1/6) + (6² × 1/6)
= (1/6)(1² + 2² + 3² + 4² + 5² + 6²) = 91/6

Since V(X) = E(X²) − (E(X))²,
V(X) = 91/6 − (7/2)² = 2.9167 = σx²

Chebyshev's inequality: P(|X − m| ≥ kσx) ≤ 1/k²  or  P(|X − m| < kσx) ≥ 1 − 1/k²
Comparing P(|X − m| > 2.5) with P(|X − m| ≥ kσx) ≤ 1/k², we have
kσx = 2.5 ⇒ k = 2.5/σx = 2.5/√2.9167 = 1.4638
k² = 2.1428, so 1/k² = 0.4667 ≈ 0.47
Substituting these values, Chebyshev's inequality gives P(|X − m| > 2.5) < 0.47.

Actual probability:
(|X − 3.5| > 2.5 means either X − 3.5 > 2.5, i.e. X > 6, or 3.5 − X > 2.5, i.e. X < 1.)
Consider P(|X − 3.5| > 2.5) = P(X < 1 or X > 6) = 0
(since when a fair die is thrown, there is no chance that the number scored is less than 1 or greater than 6).
Hence the actual probability is zero.

Problem 2: Two unbiased dice are thrown. Let X be a random variable that represents the sum of the numbers showing up. Prove that Chebyshev's inequality gives P(|X − 7| ≥ 2) ≤ 35/24, and compare this result with the actual probability.
𝑆𝑜𝑙: 𝐿𝑒𝑡 𝑋 𝑏𝑒 𝑎 𝑟𝑎𝑛𝑑𝑜𝑚 𝑣𝑎𝑟𝑖𝑎𝑏𝑙𝑒 𝑡ℎ𝑎𝑡 𝑟𝑒𝑝𝑟𝑒𝑠𝑒𝑛𝑡𝑠 𝑡ℎ𝑒 𝑠𝑢𝑚 𝑜𝑓 𝑛𝑢𝑚𝑏𝑒𝑟𝑠 𝑠ℎ𝑜𝑤𝑖𝑛𝑔 𝑢𝑝.
𝑇ℎ𝑒 𝑝𝑜𝑠𝑠𝑖𝑏𝑙𝑒 𝑛𝑢𝑚𝑏𝑒𝑟𝑠 𝑎𝑟𝑒 2,3,4,5,6,7,8,9,10,11,12.
𝑇ℎ𝑒 𝑡𝑜𝑡𝑎𝑙 𝑝𝑜𝑠𝑠𝑖𝑏𝑙𝑒 𝑜𝑢𝑡𝑐𝑜𝑚𝑒𝑠 𝑎𝑟𝑒 36.
𝐹𝑜𝑟 𝑡ℎ𝑒 𝑛𝑜 2, 𝑠𝑎𝑚𝑝𝑙𝑒 𝑝𝑜𝑖𝑛𝑡𝑠 𝑎𝑟𝑒 (1,1) 𝑖. 𝑒, 𝑚 = 1
𝐹𝑜𝑟 𝑡ℎ𝑒 𝑛𝑜 3, 𝑠𝑎𝑚𝑝𝑙𝑒 𝑝𝑜𝑖𝑛𝑡𝑠 𝑎𝑟𝑒 (1,2), (2,1) 𝑖. 𝑒, 𝑚 = 2
𝐹𝑜𝑟 𝑡ℎ𝑒 𝑛𝑜 4, 𝑠𝑎𝑚𝑝𝑙𝑒 𝑝𝑜𝑖𝑛𝑡𝑠 𝑎𝑟𝑒 (1,3), (2,2), (3,1) 𝑖. 𝑒, 𝑚 = 3
𝐹𝑜𝑟 𝑡ℎ𝑒 𝑛𝑜 5, 𝑠𝑎𝑚𝑝𝑙𝑒 𝑝𝑜𝑖𝑛𝑡𝑠 𝑎𝑟𝑒 (1,4), (4,1), (2,3), (3,2) 𝑖. 𝑒, 𝑚 = 4
𝐹𝑜𝑟 𝑡ℎ𝑒 𝑛𝑜 6, 𝑠𝑎𝑚𝑝𝑙𝑒 𝑝𝑜𝑖𝑛𝑡𝑠 𝑎𝑟𝑒 (1,5), (5,1), (2,4), (4,2), (3,3) 𝑖. 𝑒, 𝑚 = 5
𝐹𝑜𝑟 𝑡ℎ𝑒 𝑛𝑜 7, 𝑠𝑎𝑚𝑝𝑙𝑒 𝑝𝑜𝑖𝑛𝑡𝑠 𝑎𝑟𝑒 (1,6), (6,1), (2,5), (5,2), (3,4), (4,3) 𝑖. 𝑒, 𝑚 = 6
𝐹𝑜𝑟 𝑡ℎ𝑒 𝑛𝑜 8, 𝑠𝑎𝑚𝑝𝑙𝑒 𝑝𝑜𝑖𝑛𝑡𝑠 𝑎𝑟𝑒 (2,6), (6,2), (3,5), (5,3), (4,4) 𝑖. 𝑒, 𝑚 = 5
𝐹𝑜𝑟 𝑡ℎ𝑒 𝑛𝑜 9, 𝑠𝑎𝑚𝑝𝑙𝑒 𝑝𝑜𝑖𝑛𝑡𝑠 𝑎𝑟𝑒 (3,6), (6,3), (4,5), (5,4) 𝑖. 𝑒, 𝑚 = 4
𝐹𝑜𝑟 𝑡ℎ𝑒 𝑛𝑜 10, 𝑠𝑎𝑚𝑝𝑙𝑒 𝑝𝑜𝑖𝑛𝑡𝑠 𝑎𝑟𝑒 (5,5), (4,6), (6,4) 𝑖. 𝑒, 𝑚 = 3
𝐹𝑜𝑟 𝑡ℎ𝑒 𝑛𝑜 11, 𝑠𝑎𝑚𝑝𝑙𝑒 𝑝𝑜𝑖𝑛𝑡𝑠 𝑎𝑟𝑒 (5,6), (6,5) 𝑖. 𝑒, 𝑚 = 2
𝐹𝑜𝑟 𝑡ℎ𝑒 𝑛𝑜 12, 𝑠𝑎𝑚𝑝𝑙𝑒 𝑝𝑜𝑖𝑛𝑡𝑠 𝑎𝑟𝑒 (6,6) 𝑖. 𝑒, 𝑚 = 1
The respective probabilities are given by m/n = m/36.
The probability distribution is given by

x:     2     3     4     5     6     7     8     9    10    11    12
P(x): 1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36

Hence mean = E(X) = Σ x p(x)
= (2 × 1/36) + (3 × 2/36) + (4 × 3/36) + (5 × 4/36) + (6 × 5/36) + (7 × 6/36) + (8 × 5/36) + (9 × 4/36) + (10 × 3/36) + (11 × 2/36) + (12 × 1/36)
= (1/36)(2 + 6 + 12 + 20 + 30 + 42 + 40 + 36 + 30 + 22 + 12) = 252/36 = 7

E(X²) = Σ x² p(x)
= (4 × 1/36) + (9 × 2/36) + (16 × 3/36) + (25 × 4/36) + (36 × 5/36) + (49 × 6/36) + (64 × 5/36) + (81 × 4/36) + (100 × 3/36) + (121 × 2/36) + (144 × 1/36)
= (1/36)(4 + 18 + 48 + 100 + 180 + 294 + 320 + 324 + 300 + 242 + 144)
= 1974/36 = 329/6

Since V(X) = E(X²) − (E(X))²,
V(X) = 329/6 − 49 = 35/6

By comparing P(|X − 7| ≥ 2) ≤ 35/24 with Chebyshev's inequality in the form P(|X − m| ≥ ε) ≤ V(X)/ε², we have ε = 2 and
V(X)/ε² = (35/6)/4 = 35/24.

The actual probability is given by
P(|X − 7| ≥ 2) = 1 − P(|X − 7| < 2)
= 1 − P(7 − 2 < X < 7 + 2)
= 1 − P(5 < X < 9)
= 1 − [P(X = 6) + P(X = 7) + P(X = 8)]
= 1 − [5/36 + 6/36 + 5/36]
= 1 − 16/36 = 20/36 = 5/9
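The whole of Problem 2 can be reproduced by enumerating the 36 equally likely outcomes; the sketch below uses exact fractions, so the values 7, 35/6, 35/24 and 5/9 come out exactly:

```python
# Minimal sketch: reproduce Problem 2 exactly by enumerating the 36 outcomes.
from fractions import Fraction
from itertools import product

sums = [a + b for a, b in product(range(1, 7), repeat=2)]
n = Fraction(len(sums))                                         # 36

mean = sum(Fraction(s) for s in sums) / n                       # 7
var  = sum(Fraction(s)**2 for s in sums) / n - mean**2          # 35/6

bound  = var / Fraction(2)**2                                   # Chebyshev: V(X)/eps^2 = 35/24
actual = Fraction(sum(1 for s in sums if abs(s - 7) >= 2)) / n  # 5/9

print(mean, var, bound, actual)   # 7 35/6 35/24 5/9
```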

PROBABILITY DISTRIBUTIONS
Discrete: Binomial, Poisson
𝑩𝒆𝒓𝒏𝒐𝒖𝒍𝒍𝒊′𝒔 𝒅𝒊𝒔𝒕𝒓𝒊𝒃𝒖𝒕𝒊𝒐𝒏:
In this distribution p denotes the probability of success and q the probability of failure, with x denoting the number of successes, and only a single trial is considered.
Definition:
A random variable X is said to follow the Bernoulli distribution if its probability mass function is given by P(x) = p^x · q^(1−x), x = 0, 1.
If x = 0, P(0) = p^0 · q^(1−0) = q
If x = 1, P(1) = p^1 · q^(1−1) = p
∴ P(0) = q and P(1) = p


Binomial distribution:
There are n trials, with p and q the probabilities of success and failure respectively, and n is finite.

Definition:
A random variable X is said to follow the binomial distribution if its probability mass function is given by P(x) = nCx · p^x · q^(n−x) for x = 0, 1, 2, ..., n
where x = number of successes
p = probability of success
q = probability of failure
n = number of trials
p + q = 1

Verification of P(x) as the probability mass function:
1) Since x, p, q, n are all non-negative, P(x) ≥ 0.
2) Consider the binomial expansion
(q + p)^n = nC0 q^n p^0 + nC1 q^(n−1) p^1 + nC2 q^(n−2) p^2 + ... + nCn q^0 p^n ......(1)
Consider Σ_{x=0}^{n} P(x) = Σ_{x=0}^{n} nCx · p^x · q^(n−x)
= nC0 p^0 q^n + nC1 p^1 q^(n−1) + nC2 p^2 q^(n−2) + ... + nCn p^n q^0 ......(2)
From (1) and (2) we get
(q + p)^n = Σ_{x=0}^{n} P(x)
Since q + p = 1, Σ_{x=0}^{n} P(x) = 1.
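A short numeric check that this P(x) behaves as a probability mass function, and that its mean and variance agree with the np and npq results derived below (n and p here are illustrative choices):

```python
# Minimal sketch: verify numerically that the binomial P(x) sums to 1 and that
# its mean and variance come out as np and npq (n and p are illustrative).
from math import comb

n, p = 10, 0.3
q = 1 - p

pmf = [comb(n, x) * p**x * q**(n - x) for x in range(n + 1)]

total = sum(pmf)                                             # ~1.0
mean  = sum(x * pmf[x] for x in range(n + 1))                # np  = 3.0
var   = sum(x**2 * pmf[x] for x in range(n + 1)) - mean**2   # npq = 2.1

print(total, mean, var)
```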

𝑨𝒔𝒔𝒖𝒎𝒑𝒕𝒊𝒐𝒏𝒔 𝒖𝒏𝒅𝒆𝒓 𝒘𝒉𝒊𝒄𝒉 𝒃𝒊𝒏𝒐𝒎𝒊𝒂𝒍 𝒅𝒊𝒔𝒕𝒓𝒊𝒃𝒖𝒕𝒊𝒐𝒏 𝒘𝒐𝒓𝒌𝒔:


1) 𝑇ℎ𝑒𝑟𝑒 𝑎𝑟𝑒 𝑛 − 𝑡𝑟𝑖𝑎𝑙𝑠 𝑤ℎ𝑖𝑐ℎ 𝑎𝑟𝑒 𝑟𝑒𝑝𝑒𝑎𝑡𝑒𝑑 𝑢𝑛𝑑𝑒𝑟 𝑖𝑑𝑒𝑛𝑡𝑖𝑐𝑎𝑙 𝑐𝑜𝑛𝑑𝑖𝑡𝑖𝑜𝑛𝑠.
2) 𝐸𝑎𝑐ℎ 𝑡𝑟𝑖𝑎𝑙 ℎ𝑎𝑠 𝑡𝑤𝑜 𝑝𝑜𝑠𝑠𝑖𝑏𝑙𝑒 𝑜𝑢𝑡𝑐𝑜𝑚𝑒𝑠 𝑖. 𝑒. , 𝑠𝑢𝑐𝑐𝑒𝑠𝑠 𝑜𝑟 𝑓𝑎𝑖𝑙𝑢𝑟𝑒.
3) 𝑇ℎ𝑒 𝑝𝑟𝑜𝑏𝑎𝑏𝑖𝑙𝑖𝑡𝑦 𝑜𝑓 𝑠𝑢𝑐𝑐𝑒𝑠𝑠 𝑜𝑟 𝑓𝑎𝑖𝑙𝑢𝑟𝑒 𝑟𝑒𝑚𝑎𝑖𝑛𝑠 𝑐𝑜𝑛𝑠𝑡𝑎𝑛𝑡 𝑓𝑜𝑟 𝑎𝑙𝑙 𝑡ℎ𝑒 𝑡𝑟𝑖𝑎𝑙𝑠.

4) The trials are independent, i.e. the probability of success or failure in one trial does not affect any other trial.
Mean:
Mean = E[X] = Σ x·P(x)
= Σ_{x=0}^{n} x · nCx · p^x · q^(n−x)
= 0·nC0 p^0 q^n + 1·nC1 p^1 q^(n−1) + 2·nC2 p^2 q^(n−2) + ... + n·nCn p^n q^(n−n)
= n p q^(n−1) + 2·[n(n−1)/2!] p² q^(n−2) + ... + n p^n
= np [ (n−1)C0 q^(n−1) + (n−1)C1 p^1 q^(n−2) + ... + (n−1)C(n−1) p^(n−1) ]
= np (q + p)^(n−1)
∴ Mean = np   (since q + p = 1)

Variance:
Variance = E[X²] − (E[X])²
= Σ_{x=0}^{n} x² P(x) − (np)²
= Σ_{x=0}^{n} (x² + x − x) P(x) − n²p²
= Σ_{x=0}^{n} [x(x−1) + x] P(x) − n²p²
= Σ_{x=0}^{n} x(x−1) P(x) + Σ_{x=0}^{n} x P(x) − n²p²
= Σ_{x=2}^{n} x(x−1) nCx p^x q^(n−x) + np − n²p²
= [2·1·nC2 p² q^(n−2) + 3·2·nC3 p³ q^(n−3) + ... + n(n−1)·nCn p^n q^(n−n)] + np − n²p²
= [2·(n(n−1)/(1·2)) p² q^(n−2) + 3·2·(n(n−1)(n−2)/(1·2·3)) p³ q^(n−3) + ... + n(n−1) p^n] + np − n²p²
= n(n−1)p² [q^(n−2) + (n−2)C1 p^1 q^(n−3) + ... + p^(n−2)] + np − n²p²
= n(n−1)p² (q + p)^(n−2) + np − n²p²
= (n²p² − np²)(q + p)^(n−2) + np − n²p²
= n²p² − np² + np − n²p²
= np − np²
= np(1 − p) = npq
∴ Variance = npq

Moment Generating Function:
M_X(t) = E[e^(tX)]
= Σ_x e^(tx) P(x)
= Σ_x e^(tx) nCx p^x q^(n−x)
= Σ_x nCx (pe^t)^x q^(n−x)
∴ M_X(t) = (q + pe^t)^n
Problems:
1) The incidence of a disease in an industry is such that the workers have a 20% chance of suffering from the disease. What is the probability that out of 6 workers chosen at random, 4 or more will suffer from the disease?
Sol: We have p = 20/100 = 0.2
We have p + q = 1 ⇒ q = 1 − p = 1 − 0.2 = 0.8
n = 6, x = 4, 5, 6
∴ P(X ≥ 4) = P(X = 4) + P(X = 5) + P(X = 6)
Since P(x) = nCx p^x q^(n−x),
= 6C4 (0.2)^4 (0.8)^2 + 6C5 (0.2)^5 (0.8)^1 + 6C6 (0.2)^6
= [(6 × 5)/(1 × 2)] (0.2)^4 (0.8)^2 + 6 (0.2)^5 (0.8) + (0.2)^6
= 0.01696
2) For a binomial variate the mean is 4 and the variance is 3. Determine the parameters of the distribution.
Sol: X ~ b(n, p)
Given that np = 4 ......(1) and npq = 3 ......(2)
(2)/(1) ⇒ npq/np = 3/4 ⇒ q = 3/4
We have p + q = 1 ⇒ p = 1 − q = 1 − 3/4 = 1/4
Substituting in (1): n(1/4) = 4 ⇒ n = 16
∴ X ~ b(16, 1/4)

3) A fair (unbiased) coin is tossed 6 times. Find the probability of getting at most 4 heads.
Sol: Given that n = 6, p = q = 1/2; required P[X ≤ 4].
P(X ≤ 4) = P(X = 0) + P(X = 1) + P(X = 2) + P(X = 3) + P(X = 4)
(or) P(X ≤ 4) = 1 − P[X > 4]
= 1 − [P(X = 5) + P(X = 6)]
= 1 − [6C5 (1/2)^5 (1/2)^1 + 6C6 (1/2)^6]
= 1 − [6 (1/2)^6 + (1/2)^6]
= 1 − 7 (1/2)^6
∴ P(X ≤ 4) = 0.8906

4) It has been claimed that 60% of all solar installations lead to a reduction in the utility bill by one-third of its value. Accordingly, what is the probability that the utility bill is reduced by one-third of its value in 4 or more out of 5 installations?
Sol: We have p = 60/100 = 0.6
We have p + q = 1 ⇒ q = 1 − p = 1 − 0.6 = 0.4
We know that P(x) = nCx p^x q^(n−x), with n = 5 and x = 4 or 5.
∴ P(X = 4) + P(X = 5) = 5C4 (0.6)^4 (0.4)^1 + 5C5 (0.6)^5 (0.4)^0
= 5(0.05184) + 0.07776
∴ P(X = 4) + P(X = 5) = 0.33696
5) If the probability of a defective bolt is 0.2, find the mean, variance and the S.D. for the distribution in which 40 bolts are chosen as a sample.
Sol: Given p = 0.2
We have p + q = 1 ⇒ q = 1 − p = 1 − 0.2 = 0.8
n = 40
Mean = np = 40 × 0.2 = 8
Variance = npq = 40 × 0.2 × 0.8 = 6.4
∴ S.D. = √(npq) = √6.4 = 2.5298

Recurrence relation between probabilities:
P(x) = nCx p^x q^(n−x)
P(x+1) = nC(x+1) p^(x+1) q^(n−(x+1))
P(x+1)/P(x) = [nC(x+1) p^(x+1) q^(n−x−1)] / [nCx p^x q^(n−x)]
= [nC(x+1)/nCx] · (p/q)
= { n! / [(x+1)! (n−x−1)!] } · { x! (n−x)! / n! } · (p/q)
= [(n−x)/(x+1)] · (p/q)
∴ P(x+1) = [(n−x)/(x+1)] · (p/q) · P(x), where x = 0, 1, 2, ...
At x = 0: P(1) = [(n−0)/(0+1)] · (p/q) · P(0)
1) For a binomial distribution with p = 0.4 and n = 5, evaluate all the probabilities using the recurrence relationship.
Sol: Given p = 0.4, q = 1 − p = 1 − 0.4 = 0.6, n = 5
P(x) = nCx p^x q^(n−x)
P(0) = 5C0 (0.4)^0 (0.6)^5 = (0.6)^5 = 0.0778
P(x+1) = [(n−x)/(x+1)] · (p/q) · P(x)
At x = 0: P(1) = (5/1)(0.4/0.6)(0.0778) = 0.2593
At x = 1: P(2) = (4/2)(0.4/0.6)(0.2593) = 0.3457
At x = 2: P(3) = (3/3)(0.4/0.6)(0.3457) = 0.2305
At x = 3: P(4) = (2/4)(0.4/0.6)(0.2305) = 0.0768
At x = 4: P(5) = (1/5)(0.4/0.6)(0.0768) = 0.0102
2) For a binomial variate the mean is 4 and the variance is 4/3. Find P(X ≥ 1).
Sol: Given that np = 4 and npq = 4/3
npq/np = (4/3)/4 = 4/12 = 1/3 ⇒ q = 1/3
We have p + q = 1 ⇒ p = 1 − q = 1 − 1/3 = 2/3
Since np = 4: n(2/3) = 4 ⇒ n = (3 × 4)/2 = 6
∴ X ~ b(6, 2/3)
P(X ≥ 1) = 1 − P[X < 1] = 1 − P[X = 0]
= 1 − 6C0 (2/3)^0 (1/3)^6
= 1 − (1/3)^6 = 1 − 0.0014
∴ P(X ≥ 1) = 0.9986
3) 𝐶𝑜𝑛𝑠𝑖𝑑𝑒𝑟 𝑎 𝑟𝑎𝑛𝑑𝑜𝑚 𝑒𝑥𝑝𝑒𝑟𝑖𝑚𝑒𝑛𝑡 𝑜𝑓 𝑡ℎ𝑟𝑜𝑤𝑖𝑛𝑔 2 𝑑𝑖𝑐𝑒 𝑎𝑡 𝑎 𝑡𝑖𝑚𝑒 𝑓𝑜𝑟 120 𝑡𝑖𝑚𝑒𝑠.

𝐹𝑖𝑛𝑑 𝑡ℎ𝑒 𝑎𝑣𝑒𝑟𝑎𝑔𝑒 𝑛𝑜. 𝑜𝑓 𝑡𝑖𝑚𝑒𝑠, 𝑡ℎ𝑒 𝑛𝑜. 𝑜𝑛 𝑡ℎ𝑒 𝑓𝑖𝑟𝑠𝑡 𝑑𝑖𝑒 𝑒𝑥𝑐𝑒𝑒𝑑𝑠 𝑡ℎ𝑒 𝑛𝑜. 𝑜𝑓 𝑡ℎ𝑒 2𝑛𝑑 𝑑𝑖𝑒
𝑖𝑓 𝑖𝑡 𝑓𝑜𝑙𝑙𝑜𝑤𝑠 𝑡ℎ𝑒 𝑏𝑖𝑛𝑜𝑚𝑖𝑎𝑙 𝑑𝑖𝑠𝑡𝑟𝑖𝑏𝑢𝑡𝑖𝑜𝑛.

𝑆𝑜𝑙: 𝐺𝑖𝑣𝑒𝑛 𝑡ℎ𝑎𝑡 𝑛 = 120

{ (1,1) (1,2) (1,3) (1,4) (1,5) (1,6)


(2,1) (2,2) (2,3) (2,4) (2,5) (2,6)
(3,1) (3,2) (3,3) (3,4) (3,5) (3,6)
(4,1) (4,2) (4,3) (4,4) (4,5) (4,6)
(5,1) (5,2) (5,3) (5,4) (5,5) (5,6)
(6,1) (6,2) (6,3) (6,4) (6,5) (6,6) }
Out of the 36 outcomes listed, the number on the first die exceeds the number on the second in 15 cases, so p = 15/36 = 0.4167.
∴ Average = Mean = np = 120 × 0.4167 = 50
4) Out of 800 families with 5 children each, how many would you expect to have i) 3 boys ii) 5 girls iii) either 2 or 3 boys iv) at least 1 boy?
Sol: X → getting a male child (boy), so p = q = 1/2, n = 5.
i) 3 boys and 2 girls: x = 3, n = 5
P[X = 3] = 5C3 (1/2)^3 (1/2)^2 = 10 (1/2)^5 = 0.3125
∴ Number of families out of 800 with this condition = 800 × 0.3125 = 250
ii) 5 girls and 0 boys: x = 0, n = 5
P[X = 0] = 5C0 (1/2)^0 (1/2)^5 = (1/2)^5 = 0.03125
∴ Number of families = 800 × 0.03125 = 25
iii) either 2B & 3G or 3B & 2G, i.e. X = 2 or X = 3
P[X = 2] = 5C2 (1/2)^2 (1/2)^3 = 0.3125
P[X = 3] = 5C3 (1/2)^3 (1/2)^2 = 0.3125
P[X = 2] + P[X = 3] = 0.3125 + 0.3125 = 0.6250
∴ Number of families = 800 × 0.6250 = 500
iv) at least 1 boy: 1B&4G, 2B&3G, 3B&2G, 4B&1G, 5B&0G, or equivalently 1 − P(0B & 5G)
P[X = 0] = 5C0 (1/2)^0 (1/2)^5 = 0.03125
1 − P[X = 0] = 1 − 0.03125 = 0.9688
∴ Number of families = 800 × 0.9688 = 775
𝑭𝒊𝒕𝒕𝒊𝒏𝒈 𝒐𝒇 𝑩𝒊𝒏𝒐𝒎𝒊𝒂𝒍 𝑫𝒊𝒔𝒕𝒓𝒊𝒃𝒖𝒕𝒊𝒐𝒏:
1) 𝐹𝑖𝑡 𝑡ℎ𝑒 𝑏𝑖𝑛𝑜𝑚𝑖𝑎𝑙 𝑑𝑖𝑠𝑡𝑟𝑖𝑏𝑢𝑡𝑖𝑜𝑛 𝑡𝑜 𝑡ℎ𝑒 𝑓𝑜𝑙𝑙𝑜𝑤𝑖𝑛𝑔 𝑑𝑎𝑡𝑎

𝑋 0 1 2 3 4 5 6

𝑂𝑏𝑠𝑒𝑟𝑣𝑒𝑑 𝐹𝑟𝑒𝑞𝑢𝑒𝑛𝑐𝑦 13 25 52 58 32 16 4

Sol: Given n = 6
Mean = Σ f_i x_i / Σ f_i
= [(13 × 0) + (25 × 1) + (52 × 2) + (58 × 3) + (32 × 4) + (16 × 5) + (4 × 6)] / [13 + 25 + 52 + 58 + 32 + 16 + 4]
= (25 + 104 + 174 + 128 + 80 + 24) / 200
= 535/200 = 2.675
np = 2.675 ⇒ 6p = 2.675 ⇒ p = 2.675/6 = 0.4458
q = 1 − p = 1 − 0.4458 = 0.5542
N = Σ f_i = 200

Expected frequency: P(x) = nCx p^x q^(n−x)
1) P(X = 0) = 6C0 (0.4458)^0 (0.5542)^6 = 0.0290; Frequency = 200 × 0.0290 = 5.795 ≅ 6
2) P(X = 1) = 6C1 (0.4458)^1 (0.5542)^5 = 0.1398; Frequency = 200 × 0.1398 = 27.96 ≅ 28
3) P(X = 2) = 6C2 (0.4458)^2 (0.5542)^4 = 0.2812; Frequency = 200 × 0.2812 = 56.24 ≅ 56
4) P(X = 3) = 6C3 (0.4458)^3 (0.5542)^3 = 0.3016; Frequency = 200 × 0.3016 = 60.32 ≅ 60
5) P(X = 4) = 6C4 (0.4458)^4 (0.5542)^2 = 0.1820; Frequency = 200 × 0.1820 = 36.40 ≅ 36
6) P(X = 5) = 6C5 (0.4458)^5 (0.5542)^1 = 0.0585; Frequency = 200 × 0.0585 = 11.7 ≅ 12
7) P(X = 6) = 6C6 (0.4458)^6 (0.5542)^0 = 0.0078; Frequency = 200 × 0.0078 = 1.56 ≅ 2

𝑇ℎ𝑒 𝑏𝑖𝑛𝑜𝑚𝑖𝑎𝑙 𝑑𝑖𝑠𝑡𝑟𝑖𝑏𝑢𝑡𝑖𝑜𝑛 𝑖𝑠

𝑋 0 1 2 3 4 5 6

𝑂𝑏𝑠𝑒𝑟𝑣𝑒𝑑 𝐹𝑟𝑒𝑞𝑢𝑒𝑛𝑐𝑦 13 25 52 58 32 16 4

𝐸𝑥𝑝𝑒𝑐𝑡𝑒𝑑 𝐹𝑟𝑒𝑞𝑢𝑒𝑛𝑐𝑦 6 28 56 60 36 12 2
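The fitting procedure above (estimate p from np = sample mean, then take expected frequency = N·P(x)) can be written as a short sketch; it reproduces the expected frequencies of Problem 1:

```python
# Minimal sketch: the fitting procedure of Problem 1 -- estimate p from the
# sample mean via np = mean, then compute expected frequencies N * P(x).
from math import comb

xs       = [0, 1, 2, 3, 4, 5, 6]
observed = [13, 25, 52, 58, 32, 16, 4]

N = sum(observed)                                     # 200
n = 6
mean = sum(x * f for x, f in zip(xs, observed)) / N   # 2.675
p = mean / n                                          # ~0.4458
q = 1 - p

expected = [round(N * comb(n, x) * p**x * q**(n - x)) for x in xs]
print(expected)   # [6, 28, 56, 60, 36, 12, 2], matching the table above
```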

2) 𝐹𝑖𝑡 𝑎 𝑏𝑖𝑛𝑜𝑚𝑖𝑎𝑙 𝑑𝑖𝑠𝑡𝑟𝑖𝑏𝑢𝑡𝑖𝑜𝑛 𝑡𝑜 𝑡ℎ𝑒 𝑓𝑜𝑙𝑙𝑜𝑤𝑖𝑛𝑔 𝑑𝑎𝑡𝑎

𝑋 0 1 2 3 4 5 6 7

𝐹𝑟𝑒𝑞𝑢𝑒𝑛𝑐𝑦 7 6 19 35 30 23 7 1

𝐶𝑎𝑙𝑐𝑢𝑙𝑎𝑡𝑒 𝑡ℎ𝑒 𝑝𝑟𝑜𝑏𝑎𝑏𝑖𝑙𝑖𝑡𝑖𝑒𝑠 𝑢𝑠𝑖𝑛𝑔 𝑟𝑒𝑐𝑢𝑟𝑟𝑒𝑛𝑐𝑒 𝑟𝑒𝑙𝑎𝑡𝑖𝑜𝑛𝑠ℎ𝑖𝑝.

Sol: Here n = 7
Mean = Σ f_i x_i / Σ f_i
= [(0 × 7) + (1 × 6) + (2 × 19) + (3 × 35) + (4 × 30) + (5 × 23) + (6 × 7) + (7 × 1)] / [7 + 6 + 19 + 35 + 30 + 23 + 7 + 1]
= 433/128 = 3.3828
We have np = 3.3828 ⇒ 7p = 3.3828 ⇒ p = 3.3828/7 = 0.4833
q = 1 − p = 1 − 0.4833 = 0.5167
P(x) = nCx p^x q^(n−x) ......(1)
P(x+1) = [(n−x)/(x+1)] · (p/q) · P(x) ......(2)
Expected frequencies (N = 128):
1) Put x = 0 in (1): P(0) = 7C0 (0.4833)^0 (0.5167)^7 = 0.0098
Frequency = 128 × 0.0098 = 1.25 ≅ 1
2) Put x = 0 in (2): P(1) = (7/1)(0.4833/0.5167)(0.0098) = 0.0641
Frequency = 128 × 0.0641 = 8.21 ≅ 8
3) Put x = 1 in (2): P(2) = (6/2)(0.4833/0.5167)(0.0641) = 0.1802
Frequency = 128 × 0.1802 = 23.07 ≅ 23
4) Put x = 2 in (2): P(3) = (5/3)(0.4833/0.5167)(0.1802) = 0.2810
Frequency = 128 × 0.2810 = 35.97 ≅ 36
5) Put x = 3 in (2): P(4) = (4/4)(0.4833/0.5167)(0.2810) = 0.2628
Frequency = 128 × 0.2628 = 33.64 ≅ 34
6) Put x = 4 in (2): P(5) = (3/5)(0.4833/0.5167)(0.2628) = 0.1475
Frequency = 128 × 0.1475 = 18.88 ≅ 19
7) Put x = 5 in (2): P(6) = (2/6)(0.4833/0.5167)(0.1475) = 0.0456
Frequency = 128 × 0.0456 = 5.84 ≅ 6
8) Put x = 6 in (2): P(7) = (1/7)(0.4833/0.5167)(0.0456) = 0.0061
Frequency = 128 × 0.0061 = 0.78 ≅ 1

𝑋 0 1 2 3 4 5 6 7

𝑂𝑏𝑠𝑒𝑟𝑣𝑒𝑑 𝐹𝑟𝑒𝑞𝑢𝑒𝑛𝑐𝑦 7 6 19 35 30 23 7 1

𝐸𝑥𝑝𝑒𝑐𝑡𝑒𝑑 𝐹𝑟𝑒𝑞𝑢𝑒𝑛𝑐𝑦 1 8 23 36 34 19 6 1

3) Consider a random experiment of tossing 7 unbiased coins at a time. Fit the binomial distribution if the number of heads follows the frequencies below.

𝑋 0 1 2 3 4 5 6 7

𝐹𝑟𝑒𝑞𝑢𝑒𝑛𝑐𝑦 8 7 19 36 32 28 8 2

Sol: n = 7
Σ f_i = 8 + 7 + 19 + 36 + 32 + 28 + 8 + 2, so N = 140
p = q = 1/2
P(x) = nCx p^x q^(n−x)
1) P(0) = 7C0 (1/2)^0 (1/2)^7 = 0.0078; Frequency = 140 × 0.0078 = 1.09 ≅ 1
2) P(1) = 7C1 (1/2)^1 (1/2)^6 = 7(1/2)^7 = 0.0547; Frequency = 140 × 0.0547 = 7.66 ≅ 8
3) P(2) = 7C2 (1/2)^2 (1/2)^5 = [(7 × 6)/(2 × 1)](1/2)^7 = 0.1641; Frequency = 140 × 0.1641 = 22.97 ≅ 23
4) P(3) = 7C3 (1/2)^3 (1/2)^4 = 0.2734; Frequency = 140 × 0.2734 = 38.28 ≅ 38
5) P(4) = 7C4 (1/2)^4 (1/2)^3 = 0.2734; Frequency = 140 × 0.2734 = 38.28 ≅ 38
6) P(5) = 7C5 (1/2)^5 (1/2)^2 = 0.1641; Frequency = 140 × 0.1641 = 22.97 ≅ 23
7) P(6) = 7C6 (1/2)^6 (1/2)^1 = 0.0547; Frequency = 140 × 0.0547 = 7.66 ≅ 8
8) P(7) = 7C7 (1/2)^7 (1/2)^0 = 0.0078; Frequency = 140 × 0.0078 = 1.09 ≅ 1

∴ 𝑇ℎ𝑒 𝑏𝑖𝑛𝑜𝑚𝑖𝑎𝑙 𝑑𝑖𝑠𝑡𝑟𝑖𝑏𝑢𝑡𝑖𝑜𝑛 𝑖𝑠

𝑋 0 1 2 3 4 5 6 7

𝑂𝑏𝑠𝑒𝑟𝑣𝑒𝑑 𝐹𝑟𝑒𝑞𝑢𝑒𝑛𝑐𝑦 8 7 19 36 32 28 8 2

𝐸𝑥𝑝𝑒𝑐𝑡𝑒𝑑 𝐹𝑟𝑒𝑞𝑢𝑒𝑛𝑐𝑦 1 8 23 38 38 23 8 1

4) In a binomial distribution consisting of 5 independent trials, the probabilities of 1 and 2 successes are 0.4096 and 0.2048 respectively. Find the parameters of the distribution.
Sol: P(x) = nCx p^x q^(n−x)
P(1) = 5C1 p^1 q^4 = 5pq^4 = 0.4096 ......(1)
P(2) = 5C2 p^2 q^3 = 10p²q^3 = 0.2048 ......(2)
(1)/(2) ⇒ 5pq^4 / (10p²q^3) = 0.4096/0.2048
⇒ q/(2p) = 2 ⇒ q = 4p
⇒ 1 − p = 4p
⇒ 5p = 1 ⇒ p = 1/5
⇒ q = 1 − p = 1 − 1/5 = 4/5
∴ X ~ b(5, 1/5)

Poisson Distribution:
Definition: A random variable X is said to follow the Poisson distribution if its probability mass function is given by
P(x) = e^(−λ) λ^x / x!   for x = 0, 1, 2, ...
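A minimal sketch of this probability mass function, with a numeric check that the probabilities sum to (essentially) 1; λ = 2.5 is an illustrative value:

```python
# Minimal sketch: the Poisson probability mass function, with a check that the
# probabilities add up to (essentially) 1. lam = 2.5 is an illustrative value.
from math import exp, factorial

def poisson_pmf(x, lam):
    return exp(-lam) * lam**x / factorial(x)

lam = 2.5
print(poisson_pmf(3, lam))                          # P(X = 3)
print(sum(poisson_pmf(x, lam) for x in range(50)))  # ~1.0
```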
𝑨𝒔𝒔𝒖𝒎𝒑𝒕𝒊𝒐𝒏𝒔 𝒖𝒏𝒅𝒆𝒓 𝒘𝒉𝒊𝒄𝒉 𝒕𝒉𝒆 𝒃𝒊𝒏𝒐𝒎𝒊𝒂𝒍 𝒅𝒊𝒔𝒕𝒓𝒊𝒃𝒖𝒕𝒊𝒐𝒏 𝒕𝒆𝒏𝒅𝒔 𝒕𝒐 𝒑𝒐𝒊𝒔𝒔𝒐𝒏
1) 𝑇ℎ𝑒 𝑛𝑢𝑚𝑏𝑒𝑟 𝑜𝑓 𝑡𝑟𝑖𝑎𝑙𝑠 𝑎𝑟𝑒 𝑣𝑒𝑟𝑦 ℎ𝑖𝑔ℎ 𝑖. 𝑒 𝑛 → ∞
2) 𝑇ℎ𝑒 𝑝𝑟𝑜𝑏𝑎𝑏𝑖𝑙𝑖𝑡𝑦 𝑜𝑓 𝑠𝑢𝑐𝑐𝑒𝑠𝑠 𝑝 𝑖𝑠 𝑣𝑒𝑟𝑦 𝑠𝑚𝑎𝑙𝑙 𝑖. 𝑒 𝑝 → 0
3) 𝑛𝑝 = 𝜆 𝑖𝑠 𝑎𝑙𝑤𝑎𝑦𝑠 𝑓𝑖𝑛𝑖𝑡𝑒.
Mean:
Recall that e^x = 1 + x/1! + x²/2! + ...
Mean = E[X] = Σ_x x P(x)
= Σ_{x=0}^{∞} x · e^(−λ) λ^x / x!
= Σ_{x=1}^{∞} x · e^(−λ) λ^x / [x(x−1)!]
= Σ_{x=1}^{∞} e^(−λ) λ^x / (x−1)!
Let x − 1 = y, so x = y + 1; when x = 1, y = 0 and when x = ∞, y = ∞.
Mean = Σ_{y=0}^{∞} e^(−λ) λ^(y+1) / y!
= e^(−λ) λ Σ_{y=0}^{∞} λ^y / y!
= e^(−λ) λ [1 + λ/1! + λ²/2! + ...]
= e^(−λ) · λ · e^λ
∴ Mean = λ

Variance:
σ² = E[X²] − (E[X])²
= Σ_{x=0}^{∞} x² P(x) − λ²
= Σ_x x² · e^(−λ) λ^x / x! − λ²
= Σ_x x · e^(−λ) λ^x / (x−1)! − λ²
= Σ_x [(x−1) + 1] · e^(−λ) λ^x / (x−1)! − λ²
= Σ_x (x−1) · e^(−λ) λ^x / [(x−1)(x−2)!] + Σ_x e^(−λ) λ^x / (x−1)! − λ²
= Σ_{x=2}^{∞} e^(−λ) λ^x / (x−2)! + Σ_{x=1}^{∞} e^(−λ) λ^x / (x−1)! − λ²
= Σ_{x=2}^{∞} e^(−λ) λ^x / (x−2)! + λ − λ²
Put x − 2 = y, so x = y + 2; when x = 2, y = 0 and when x = ∞, y = ∞.
Variance = Σ_{y=0}^{∞} e^(−λ) λ^(y+2) / y! + λ − λ²
= e^(−λ) λ² Σ_{y=0}^{∞} λ^y / y! + λ − λ²
= e^(−λ) λ² e^λ + λ − λ²
= λ² + λ − λ²
∴ Variance = λ
Thus Mean = Variance = λ
Moment Generating Function:
M_X(t) = E[e^(tX)]
(for a discrete variable, E[e^(tX)] = Σ_x e^(tx) P(x))
= Σ_x e^(tx) · e^(−λ) λ^x / x!
M_X(t) = Σ_{x=0}^{∞} (λe^t)^x e^(−λ) / x!
= e^(−λ) Σ_{x=0}^{∞} (λe^t)^x / x!
= e^(−λ) [1 + λe^t/1! + (λe^t)²/2! + (λe^t)³/3! + ...]
= e^(−λ) e^(λe^t) = e^(−λ + λe^t)
= e^(−λ(1 − e^t))
∴ M_X(t) = e^(−λ(1 − e^t))
Problems:
1) Consider the example of phone calls received at a specific point in a given time span. The average number of phone calls per minute coming into a switchboard between 2 PM and 4 PM is 2.5. Determine the probability that in that particular time the number of phone calls is 1) 4 or fewer 2) more than 6.
Sol: Given λ = 2.5 (average = mean = λ)
1) P[X ≤ 4] = P[X = 0] + P[X = 1] + P[X = 2] + P[X = 3] + P[X = 4]
= e^(−2.5)(2.5)^0/0! + e^(−2.5)(2.5)^1/1! + e^(−2.5)(2.5)^2/2! + e^(−2.5)(2.5)^3/3! + e^(−2.5)(2.5)^4/4!
= e^(−2.5)[1 + 2.5 + (2.5)²/2! + (2.5)³/3! + (2.5)^4/4!]
= 0.8912
∴ P[X ≤ 4] = 0.8912
2) P[X > 6] = 1 − P[X ≤ 6]
= 1 − [P[X = 0] + P[X = 1] + P[X = 2] + P[X = 3] + P[X = 4] + P[X = 5] + P[X = 6]]
= 1 − [0.8912 + e^(−2.5)(2.5)^5/5! + e^(−2.5)(2.5)^6/6!]
= 0.1088 − e^(−2.5)[0.8138 + 0.3391]
= 0.1088 − e^(−2.5)(1.1529)
= 0.1088 − 0.0946
= 0.0142
∴ P[X > 6] = 0.0142
Recurrence Relationship between Probabilities:
P(x) = e^(−λ) λ^x / x! ......(1)
P(x+1) = e^(−λ) λ^(x+1) / (x+1)! ......(2)
(2)/(1) ⇒ P(x+1)/P(x) = [e^(−λ) λ^(x+1) / (x+1)!] · [x! / (e^(−λ) λ^x)]
= λ/(x+1)
∴ P(x+1) = [λ/(x+1)] P(x), x = 0, 1, 2, 3, ...
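A minimal sketch of this recurrence, starting from P(0) = e^(−λ); λ = 1 is an illustrative choice here (it matches the λ found in Problem 4 further below):

```python
# Minimal sketch: generate Poisson probabilities from P(0) = e^(-lam) with the
# recurrence P(x+1) = (lam/(x+1)) * P(x).
from math import exp

lam = 1.0
probs = [exp(-lam)]          # P(0)
for x in range(5):
    probs.append(probs[-1] * lam / (x + 1))

print([round(p, 4) for p in probs])
# [0.3679, 0.3679, 0.1839, 0.0613, 0.0153, 0.0031]
```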
1) The probability that an individual suffers a bad reaction from an injection is 0.001. Determine the probability that out of 2000 individuals i) exactly 3 ii) more than 2 iii) none iv) more than 1 individual suffers a bad reaction from the injection.
Sol: Since p = 0.001 is very small and n = 2000 is very large, by assumption
Mean λ = np = 0.001 × 2000 = 2
P(x) = e^(−λ) λ^x / x!
1) P[X = 3] = e^(−2) 2^3/3! = 0.1804
2) P[X > 2] = 1 − P[X ≤ 2]
= 1 − [P[X = 0] + P[X = 1] + P[X = 2]]
= 1 − [e^(−2) 2^0/0! + e^(−2) 2^1/1! + e^(−2) 2^2/2!]
= 0.3233
3) P[X = 0] = e^(−2) 2^0/0! = 0.1353
4) P[X > 1] = 1 − P[X ≤ 1]
= 1 − [P[X = 0] + P[X = 1]]
= 1 − [e^(−2) 2^0/0! + e^(−2) 2^1/1!]
= 0.5940
2) A car hire firm has 2 cars which it hires out every day. The number of demands on each day is Poisson distributed with mean 1.5. Calculate the proportion of days in a year on which i) there is no demand ii) a demand is rejected.
Sol: Let X represent the number of demands on a day. Given λ = 1.5.
i) P[X = 0] = e^(−1.5)(1.5)^0/0! = 0.2231
Proportion of days in a year on which there is no demand = 0.2231 × 365 = 81.44 ≅ 81 days
ii) A demand is rejected when more than 2 demands arrive, i.e. X > 2.
P[X > 2] = 1 − P[X ≤ 2]
= 1 − [P[X = 0] + P[X = 1] + P[X = 2]]
= 1 − [0.2231 + e^(−1.5)(1.5)^1/1! + e^(−1.5)(1.5)^2/2!]
= 0.1912
Proportion of days on which a demand is rejected = 0.1912 × 365 = 69.78 ≅ 70 days
3) The average number of accidents on a road is 1.8. Determine the probability that the number of accidents is i) at least 1 ii) at most 1.
Sol: Given that λ = 1.8
i) P[X ≥ 1] = 1 − P[X < 1] = 1 − P[X = 0]
= 1 − e^(−1.8)(1.8)^0/0!
∴ P[X ≥ 1] = 0.8347
ii) P[X ≤ 1] = P[X = 0] + P[X = 1]
= e^(−1.8)(1.8)^0/0! + e^(−1.8)(1.8)^1/1!
∴ P[X ≤ 1] = 0.4628
4) If X is a Poisson variate such that P[X = 0] = P[X = 1], then i) find the parameter of the distribution ii) find P[X = 0] iii) use the recurrence relationship to find the probabilities for x = 0, 1, 2, 3, 4.
Sol: i) Given that P[X = 0] = P[X = 1]
P[X = 0] = e^(−λ) λ^0/0! .....(1), P[X = 1] = e^(−λ) λ^1/1! .....(2)
(1) = (2) ⇒ e^(−λ) = e^(−λ) λ ⇒ λ = 1
ii) P[X = 0] = e^(−1) 1^0/0! = 0.3679
iii) Using the recurrence relation P(x+1) = [λ/(x+1)] P(x):
at x = 0, P(1) = (1/1) P(0) = 0.3679
at x = 1, P(2) = (1/2) P(1) = (1/2)(0.3679) = 0.1840
at x = 2, P(3) = (1/3) P(2) = (1/3)(0.1840) = 0.0613
at x = 3, P(4) = (1/4) P(3) = (1/4)(0.0613) = 0.0153
at x = 4, P(5) = (1/5) P(4) = (1/5)(0.0153) = 0.0031
5) If X is a Poisson variate such that (3/2)P[X = 1] = P[X = 3], find i) λ ii) P[X ≥ 1] iii) P[X ≤ 3] iv) P[2 ≤ X ≤ 5].
Sol: i) Given that (3/2)P[X = 1] = P[X = 3]
(3/2) · e^(−λ) λ^1/1! = e^(−λ) λ^3/3!
⇒ λ² = 9 ⇒ λ = ±3
∴ λ = 3
ii) P[X ≥ 1] = 1 − P[X < 1] = 1 − P[X = 0]
= 1 − e^(−3) 3^0/0!
∴ P[X ≥ 1] = 0.9502
iii) P[X ≤ 3] = P[X = 0] + P[X = 1] + P[X = 2] + P[X = 3]
= e^(−3) 3^0/0! + e^(−3) 3^1/1! + e^(−3) 3^2/2! + e^(−3) 3^3/3!
= e^(−3)[1 + 3 + 9/2 + 27/6]
= 0.6472
iv) P[2 ≤ X ≤ 5] = P[X = 2] + P[X = 3] + P[X = 4] + P[X = 5]
= e^(−3) 3^2/2! + e^(−3) 3^3/3! + e^(−3) 3^4/4! + e^(−3) 3^5/5!
= 0.7169
6) Two cards are drawn from a well-shuffled pack of 52 cards. Using the Poisson distribution, find the probability of getting 2 diamonds at least 3 times in 51 consecutive trials.
Sol: The number of trials n = 51
p = 13C2/52C2 = [(13 × 12)/(1 × 2)] / [(52 × 51)/(1 × 2)] = 78/1326 = 0.0588
λ = np = 51 × 0.0588 = 3
∴ λ = 3
At least 3: P[X ≥ 3] = 1 − P[X < 3]
= 1 − [P[X = 0] + P[X = 1] + P[X = 2]]
= 1 − [e^(−3) 3^0/0! + e^(−3) 3^1/1! + e^(−3) 3^2/2!]
= 1 − e^(−3)[1 + 3 + 9/2]
= 0.5768
∴ P[X ≥ 3] = 0.5768
𝑭𝒊𝒕𝒕𝒊𝒏𝒈 𝒐𝒇 𝒑𝒐𝒊𝒔𝒔𝒐𝒏 𝒅𝒊𝒔𝒕𝒓𝒊𝒃𝒖𝒕𝒊𝒐𝒏:
1) 𝐹𝑖𝑡 𝑡ℎ𝑒 𝑝𝑜𝑖𝑠𝑠𝑜𝑛 𝑑𝑖𝑠𝑡𝑟𝑖𝑏𝑢𝑡𝑖𝑜𝑛 𝑡𝑜 𝑡ℎ𝑒 𝑓𝑜𝑙𝑙𝑜𝑤𝑖𝑛𝑔 𝑑𝑎𝑡𝑎

𝑋 0 1 2 3 4 5

𝐹𝑟𝑒𝑞𝑢𝑒𝑛𝑐𝑦 142 156 69 27 5 1

Sol: Σ f_i = 400
Mean (λ) = Σ f_i x_i / Σ f_i = (0 × 142 + 1 × 156 + 2 × 69 + 3 × 27 + 4 × 5 + 5 × 1)/400 = 1
∴ λ = 1
P(x) = e^(−λ) λ^x / x!
Expected frequencies:
1) P[X = 0] = e^(−1) 1^0/0! = 0.3679; Expected frequency = 0.3679 × 400 = 147.15 ≅ 147
2) P[X = 1] = e^(−1) 1^1/1! = 0.3679; Expected frequency = 0.3679 × 400 = 147.15 ≅ 147
3) P[X = 2] = e^(−1) 1^2/2! = 0.1839; Expected frequency = 0.1839 × 400 = 73.57 ≅ 74
4) P[X = 3] = e^(−1) 1^3/3! = 0.0613; Expected frequency = 0.0613 × 400 = 24.52 ≅ 25
5) P[X = 4] = e^(−1) 1^4/4! = 0.0153; Expected frequency = 0.0153 × 400 = 6.13 ≅ 6
6) P[X = 5] = e^(−1) 1^5/5! = 0.0031; Expected frequency = 0.0031 × 400 = 1.24 ≅ 1

𝑋 0 1 2 3 4 5
𝑂𝑏𝑠𝑒𝑟𝑣𝑒𝑑 𝐹𝑟𝑒𝑞𝑢𝑒𝑛𝑐𝑦 142 156 69 27 5 1

𝐸𝑥𝑝𝑒𝑐𝑡𝑒𝑑 𝐹𝑟𝑒𝑞𝑢𝑒𝑛𝑐𝑦 147 147 74 25 6 1
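The same fitting procedure as a short sketch (take λ equal to the sample mean, then expected frequency = N·P(x)); it reproduces the expected frequencies above:

```python
# Minimal sketch: Poisson fitting as in the problem above -- lam = sample mean,
# then expected frequency = N * P(x).
from math import exp, factorial

xs       = [0, 1, 2, 3, 4, 5]
observed = [142, 156, 69, 27, 5, 1]

N   = sum(observed)                                  # 400
lam = sum(x * f for x, f in zip(xs, observed)) / N   # 1.0

expected = [round(N * exp(-lam) * lam**x / factorial(x)) for x in xs]
print(expected)   # [147, 147, 74, 25, 6, 1]
```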

2) 𝐹𝑖𝑡 𝑡ℎ𝑒 𝑝𝑜𝑖𝑠𝑠𝑜𝑛 𝑑𝑖𝑠𝑡𝑟𝑖𝑏𝑢𝑡𝑖𝑜𝑛 𝑡𝑜 𝑡ℎ𝑒 𝑓𝑜𝑙𝑙𝑜𝑤𝑖𝑛𝑔 𝑡𝑎𝑏𝑙𝑒 𝑏𝑦 𝑐𝑎𝑙𝑐𝑢𝑙𝑎𝑡𝑖𝑛𝑔 𝑡ℎ𝑒 𝑝𝑟𝑜𝑏𝑎𝑏𝑖𝑙𝑖𝑡𝑖𝑒𝑠
𝑢𝑠𝑖𝑛𝑔 𝑟𝑒𝑐𝑢𝑟𝑟𝑒𝑛𝑐𝑒 𝑟𝑒𝑙𝑎𝑡𝑖𝑜𝑛.

𝑋 0 1 2 3 4

𝐹𝑟𝑒𝑞𝑢𝑒𝑛𝑐𝑦 109 65 22 3 1

Sol: Σ f_i = 200
Mean (λ) = Σ f_i x_i / Σ f_i = (0 × 109 + 1 × 65 + 2 × 22 + 3 × 3 + 4 × 1)/200 = 0.61
∴ λ = 0.61
P(x) = e^(−λ) λ^x / x!
Expected frequencies:
1) P[X = 0] = e^(−0.61)(0.61)^0/0! = 0.5434; Expected frequency = 0.5434 × 200 = 108.68 ≅ 109
We have the recurrence relation P(x+1) = [λ/(x+1)] P(x).
2) Put x = 0: P[X = 1] = (0.61/1)(0.5434) = 0.3314; Expected frequency = 0.3314 × 200 = 66.29 ≅ 66
3) Put x = 1: P[X = 2] = (0.61/2)(0.3314) = 0.1010; Expected frequency = 0.1010 × 200 = 20.2 ≅ 20
4) Put x = 2: P[X = 3] = (0.61/3)(0.1010) = 0.0205; Expected frequency = 0.0205 × 200 = 4.1 ≅ 4
5) Put x = 3: P[X = 4] = (0.61/4)(0.0205) = 0.0031; Expected frequency = 0.0031 × 200 = 0.63 ≅ 1

𝑋 0 1 2 3 4

𝐹𝑟𝑒𝑞𝑢𝑒𝑛𝑐𝑦 109 65 22 3 1

𝐸𝑥𝑝𝑒𝑐𝑡𝑒𝑑 𝑓𝑟𝑒𝑞𝑢𝑒𝑛𝑐𝑦 109 65 20 4 1

Poisson Distribution as a limiting case of the Binomial Distribution:

Let np = λ, so that n = λ/p and p = λ/n ......(1)
Consider P(x) = nCx p^x q^(n−x)
= [n! / (x!(n−x)!)] p^x (1 − p)^(n−x)
= [1·2·3···(n−x)(n−x+1)···n / (x! · 1·2·3···(n−x))] p^x (1 − p)^(n−x)
= [n(n−1)···(n−x+1) / x!] p^x (1 − p)^(n−x)
Writing n = λ/p and (1 − p)^(n−x) = (1 − λ/n)^n / (1 − p)^x,
= [(λ/p)(λ/p − 1)···(λ/p − (x−1)) p^x (1 − λ/n)^n] / [x! (1 − p)^x]
= [(λ/p)((λ − p)/p)···((λ − (x−1)p)/p) p^x (1 − λ/n)^n] / [x! (1 − p)^x]
= [λ(λ − p)···(λ − (x−1)p) (1 − λ/n)^n] / [x! (1 − p)^x]

Now let p → 0 (equivalently n → ∞, since np = λ is fixed):
lim P(x) = lim_{p→0} [λ(λ − p)···(λ − (x−1)p) (1 − λ/n)^n] / [x! (1 − p)^x]
= (λ^x / x!) lim_{n→∞} (1 − λ/n)^n
= (λ^x / x!) lim_{n→∞} [(1 − λ/n)^(−n/λ)]^(−λ)
= (λ^x / x!) e^(−λ)
∴ lim P(x) = e^(−λ) λ^x / x!, which is the Poisson probability mass function.
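A numeric illustration of this limit: with np = λ held fixed, the binomial probability of x successes approaches the Poisson probability as n grows (λ and x here are illustrative choices):

```python
# Minimal sketch: binomial probabilities with np = lam fixed approach the
# Poisson probabilities as n grows.
from math import comb, exp, factorial

lam, x = 2.0, 3
poisson = exp(-lam) * lam**x / factorial(x)

for n in (10, 100, 1000, 10000):
    p = lam / n
    binom = comb(n, x) * p**x * (1 - p)**(n - x)
    print(n, round(binom, 6), round(poisson, 6))   # binomial value converges to the Poisson value
```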
