par-est
February 2025
Plan
Sampling distribution of sample mean
Central limit theorem
Sampling distribution of sample variance
Introduction
Estimation
Estimation
Point estimates
Maximum likelihood estimators
Maximum likelihood estimators
Let θ ∈ Ω denote the parameter vector.
The likelihood function of θ:
$$L(\theta \mid \mathbf{x}) = f(x_1, \ldots, x_n; \theta) = \prod_{i=1}^{n} f(x_i; \theta)$$
The maximum likelihood estimator is
$$\hat{\theta} = \arg\max_{\theta \in \Omega} L(\theta \mid \mathbf{x})$$
Exponential distribution
$$L(\theta \mid \mathbf{x}) = \prod_{i=1}^{n} \frac{1}{\theta}\, e^{-x_i/\theta} = \frac{1}{\theta^{n}} \exp\!\left(-\sum_{i=1}^{n} x_i/\theta\right)$$
Exponential distribution
So we can write
$$\left. \frac{d}{d\theta} \log L(\theta \mid \mathbf{x}) \right|_{\theta = \hat{\theta}} = 0$$
$$-\frac{n}{\hat{\theta}} + \frac{\sum_{i=1}^{n} x_i}{\hat{\theta}^{2}} = 0 \;\Rightarrow\; \hat{\theta} = \frac{1}{n}\sum_{i=1}^{n} x_i = \bar{x}$$
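As a quick numerical check (not part of the original slides), the sketch below minimizes the negative exponential log-likelihood with scipy and compares the maximizer with the sample mean; the sample size and the true value θ = 2 are arbitrary illustration choices.

```python
# Minimal sketch (assumed setup): verify numerically that the exponential MLE
# equals the sample mean.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=500)      # simulated data, true theta = 2 (illustrative)

def neg_log_lik(theta):
    # -log L(theta | x) = n log(theta) + sum(x_i) / theta
    return len(x) * np.log(theta) + x.sum() / theta

res = minimize_scalar(neg_log_lik, bounds=(1e-6, 100.0), method="bounded")
print(res.x, x.mean())                        # the two values agree up to optimizer tolerance
```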
Exponential distribution
Bernoulli distribution
$$\log L(p \mid \mathbf{x}) = \sum_{i=1}^{n} x_i \log p + \left(n - \sum_{i=1}^{n} x_i\right) \log(1 - p)$$
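A small sketch (not from the slides) that evaluates this log-likelihood on a grid and confirms that the maximizer is the sample proportion; the data are simulated with an arbitrary p = 0.3.

```python
# Minimal sketch (assumed setup): the Bernoulli log-likelihood is maximized at
# p-hat = sample mean of the 0/1 observations.
import numpy as np

rng = np.random.default_rng(1)
x = rng.binomial(1, 0.3, size=200)            # simulated Bernoulli(0.3) data (illustrative)

def log_lik(p):
    s = x.sum()
    return s * np.log(p) + (len(x) - s) * np.log(1.0 - p)

grid = np.linspace(0.001, 0.999, 999)
p_hat = grid[np.argmax(log_lik(grid))]
print(p_hat, x.mean())                        # grid maximizer matches the sample proportion
```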
Bernoulli distribution
Poisson distribution
$$L(\lambda \mid \mathbf{x}) = \prod_{i=1}^{n} \frac{e^{-\lambda}\, \lambda^{x_i}}{x_i!} = \frac{e^{-n\lambda}\, \lambda^{\sum_i x_i}}{\prod_{i=1}^{n} x_i!}$$
$$\log L(\lambda \mid \mathbf{x}) = -n\lambda + \sum_{i=1}^{n} x_i \log \lambda - \log \prod_{i=1}^{n} x_i!$$
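A short sketch (not from the slides) that maximizes this log-likelihood on a grid and compares the result with the sample mean; the simulated rate λ = 4 is an arbitrary illustration value.

```python
# Minimal sketch (assumed setup): the Poisson log-likelihood is maximized at
# lambda-hat = sample mean.
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(2)
x = rng.poisson(lam=4.0, size=300)            # simulated Poisson(4) data (illustrative)

def log_lik(lam):
    # -n*lam + sum(x_i) log(lam) - sum(log(x_i!)), with log(x!) = gammaln(x + 1)
    return -len(x) * lam + x.sum() * np.log(lam) - gammaln(x + 1).sum()

grid = np.linspace(0.1, 10.0, 2000)
lam_hat = grid[np.argmax([log_lik(l) for l in grid])]
print(lam_hat, x.mean())                      # grid maximizer is (close to) the sample mean
```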
The normal distribution
$$L(\mu, \sigma^{2} \mid \mathbf{x}) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma^{2}}}\, e^{-\frac{1}{2\sigma^{2}}(x_i - \mu)^{2}} = (2\pi\sigma^{2})^{-n/2}\, e^{-\frac{1}{2\sigma^{2}} \sum_{i=1}^{n} (x_i - \mu)^{2}}$$
$$\log L(\mu, \sigma^{2} \mid \mathbf{x}) = -\frac{n}{2}\left(\log 2\pi + \log \sigma^{2}\right) - \frac{1}{2\sigma^{2}} \sum_{i=1}^{n} (x_i - \mu)^{2}$$
The normal distribution
$$\frac{d \log L(\mu, \sigma^{2} \mid \mathbf{x})}{d\sigma^{2}} = -\frac{n}{2\sigma^{2}} + \frac{1}{2(\sigma^{2})^{2}} \sum_{i=1}^{n} (x_i - \mu)^{2}$$
The normal distribution
$$\left. \frac{d \log L(\mu, \sigma^{2} \mid \mathbf{x})}{d\mu} \right|_{\mu=\hat{\mu},\, \sigma=\hat{\sigma}} = \frac{1}{\hat{\sigma}^{2}} \sum_{i=1}^{n} (x_i - \hat{\mu}) = 0 \;\Rightarrow\; \hat{\mu} = \bar{x}$$
$$\left. \frac{d \log L(\mu, \sigma^{2} \mid \mathbf{x})}{d\sigma^{2}} \right|_{\mu=\hat{\mu},\, \sigma=\hat{\sigma}} = -\frac{n}{2\hat{\sigma}^{2}} + \frac{1}{2(\hat{\sigma}^{2})^{2}} \sum_{i=1}^{n} (x_i - \hat{\mu})^{2} = 0 \;\Rightarrow\; \hat{\sigma}^{2} = \frac{1}{n} \sum_{i=1}^{n} (x_i - \hat{\mu})^{2}$$
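A brief sketch (not from the slides) of these two estimators on simulated data; the values μ = 10 and σ = 2 are arbitrary illustration choices.

```python
# Minimal sketch (assumed setup): the normal MLEs are the sample mean and the
# *biased* variance (divide by n, not n - 1).
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(loc=10.0, scale=2.0, size=400)  # simulated N(10, 4) data (illustrative)

mu_hat = x.mean()                              # MLE of mu
sigma2_hat = np.mean((x - mu_hat) ** 2)        # MLE of sigma^2
print(mu_hat, sigma2_hat, x.var(ddof=0))       # x.var(ddof=0) equals sigma2_hat
```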
Interval estimates
Interval estimates
Interval estimate of population mean
The sample mean $\bar{X}$ is a point estimate of $\mu$.
Sampling distribution: $\bar{X} \sim N(\mu, \sigma^{2}/n)$, and
$$Z = \frac{\bar{X} - \mu}{\sigma/\sqrt{n}}, \qquad P(-z_{\alpha/2} < Z < z_{\alpha/2}) = 1 - \alpha$$
Interval estimate of population mean
$$P\left(-z_{\alpha/2} < \frac{\bar{X} - \mu}{\sigma/\sqrt{n}} < z_{\alpha/2}\right) = 1 - \alpha$$
$$P\left(\bar{X} - z_{\alpha/2}\,\sigma/\sqrt{n} < \mu < \bar{X} + z_{\alpha/2}\,\sigma/\sqrt{n}\right) = 1 - \alpha$$
$$\bar{X} \pm z_{\alpha/2}\,\sigma/\sqrt{n}$$
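A minimal sketch (not from the slides) of this interval when σ is known; the helper name `z_interval` is hypothetical.

```python
# Minimal sketch (assumed helper name): two-sided (1 - alpha) confidence
# interval for mu when sigma is known.
import numpy as np
from scipy import stats

def z_interval(x, sigma, alpha=0.05):
    x = np.asarray(x, dtype=float)
    z = stats.norm.ppf(1.0 - alpha / 2.0)      # z_{alpha/2}
    half = z * sigma / np.sqrt(len(x))
    return x.mean() - half, x.mean() + half
```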
Interval estimate of population mean
$$\bar{X} \pm 1.96\,\sigma/\sqrt{n}$$
Interval estimate of population mean
$$\left(-\infty,\; \bar{X} + z_{\alpha}\, s/\sqrt{n}\right)$$
$$\left(\bar{X} - z_{\alpha}\, s/\sqrt{n},\; \infty\right)$$
Example 7.3a
Example 7.3a
$$\bar{x} = 9$$
$$Z = \frac{\bar{X} - \mu}{\sigma/\sqrt{n}} \;\Rightarrow\; t = \frac{\bar{X} - \mu}{s/\sqrt{n}}$$
Example 7.3a
Data: 5, 8.5, 12, 15, 7, 9, 7.5, 6.5, 10.5, with $s^{2} = 9.5$.
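A sketch (not from the slides) of the interval computation for this data set, assuming the variance is unknown and estimated by $s^2$, so the $t$ interval with $n-1$ degrees of freedom applies.

```python
# Sketch of the t interval for the example data (unknown variance).
import numpy as np
from scipy import stats

x = np.array([5, 8.5, 12, 15, 7, 9, 7.5, 6.5, 10.5])
n, xbar, s2 = len(x), x.mean(), x.var(ddof=1)  # xbar = 9, s^2 = 9.5
t = stats.t.ppf(0.975, df=n - 1)               # t_{0.025, 8}
half = t * np.sqrt(s2 / n)
print(xbar - half, xbar + half)                # 95% confidence interval for mu
```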
Estimating the difference in means of two normal populations
Estimating the difference in means of two normal populations
$$\hat{\mu}_1 = \bar{X} = \frac{1}{n}\sum_i X_i, \qquad \hat{\mu}_2 = \bar{Y} = \frac{1}{m}\sum_i Y_i, \qquad \hat{\mu}_1 - \hat{\mu}_2 = \bar{X} - \bar{Y}$$
Sampling distributions:
$$\bar{X} \sim N(\mu_1, \sigma_1^{2}/n)$$
$$\bar{Y} \sim N(\mu_2, \sigma_2^{2}/m)$$
$$\bar{X} - \bar{Y} \sim N\!\left(\mu_1 - \mu_2,\; \frac{\sigma_1^{2}}{n} + \frac{\sigma_2^{2}}{m}\right)$$
Estimating the difference in means of two normal populations
Since
$$\bar{X} - \bar{Y} \sim N\!\left(\mu_1 - \mu_2,\; \sigma_1^{2}/n + \sigma_2^{2}/m\right),$$
$$Z = \frac{\bar{X} - \bar{Y} - (\mu_1 - \mu_2)}{\sqrt{\dfrac{\sigma_1^{2}}{n} + \dfrac{\sigma_2^{2}}{m}}} \sim N(0, 1).$$
It can be shown that
$$P\left\{ -z_{\alpha/2} < \frac{\bar{X} - \bar{Y} - (\mu_1 - \mu_2)}{\sqrt{\dfrac{\sigma_1^{2}}{n} + \dfrac{\sigma_2^{2}}{m}}} < z_{\alpha/2} \right\} = 1 - \alpha$$
Estimating the difference in means of two normal populations
$$(\bar{x} - \bar{y}) \pm z_{\alpha/2} \sqrt{\frac{\sigma_1^{2}}{n} + \frac{\sigma_2^{2}}{m}}$$
When $\sigma_1^{2} = \sigma_2^{2} = \sigma^{2}$, this becomes
$$(\bar{x} - \bar{y}) \pm z_{\alpha/2} \sqrt{\frac{\sigma_1^{2}}{n} + \frac{\sigma_2^{2}}{m}} = (\bar{x} - \bar{y}) \pm z_{\alpha/2}\, \sigma \sqrt{\frac{1}{n} + \frac{1}{m}}$$
Estimating the difference in means of two normal populations
When the common variance $\sigma^{2}$ is unknown, estimate it by pooling the data as
$$S_p^{2} = \frac{(n-1) S_1^{2} + (m-1) S_2^{2}}{n + m - 2}$$
$$(\bar{x} - \bar{y}) \pm t_{\alpha/2,\, n+m-2}\, S_p \sqrt{\frac{1}{n} + \frac{1}{m}}$$
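A minimal sketch (not from the slides) of this pooled interval; the helper name `pooled_t_interval` is hypothetical.

```python
# Minimal sketch (assumed helper name): (1 - alpha) interval for mu1 - mu2 when
# the two populations share an unknown variance (pooled estimate S_p^2).
import numpy as np
from scipy import stats

def pooled_t_interval(x, y, alpha=0.05):
    x, y = np.asarray(x, float), np.asarray(y, float)
    n, m = len(x), len(y)
    sp2 = ((n - 1) * x.var(ddof=1) + (m - 1) * y.var(ddof=1)) / (n + m - 2)
    t = stats.t.ppf(1.0 - alpha / 2.0, df=n + m - 2)
    half = t * np.sqrt(sp2) * np.sqrt(1.0 / n + 1.0 / m)
    diff = x.mean() - y.mean()
    return diff - half, diff + half
```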
Approximate confidence interval for the mean of a Bernoulli random variable
Approximate confidence interval for p
We can show that
$$P\left(-z_{\alpha/2} < \frac{\hat{p} - p}{\sqrt{p(1-p)/n}} < z_{\alpha/2}\right) = 1 - \alpha$$
$$P\left(\hat{p} - z_{\alpha/2} \sqrt{\frac{\hat{p}(1-\hat{p})}{n}} < p < \hat{p} + z_{\alpha/2} \sqrt{\frac{\hat{p}(1-\hat{p})}{n}}\right) = 1 - \alpha$$
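A minimal sketch (not from the slides) of this approximate interval computed from 0/1 data; the helper name `proportion_interval` is hypothetical.

```python
# Minimal sketch (assumed helper name): approximate (1 - alpha) interval for p,
# using the normal approximation with p replaced by p-hat in the standard error.
import numpy as np
from scipy import stats

def proportion_interval(x, alpha=0.05):
    x = np.asarray(x, dtype=float)
    n, p_hat = len(x), x.mean()
    z = stats.norm.ppf(1.0 - alpha / 2.0)
    half = z * np.sqrt(p_hat * (1.0 - p_hat) / n)
    return p_hat - half, p_hat + half
```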
Problems