Asymptotic Variance
Suppose X1, …, Xn are iid from some distribution Fθo with density fθo. We observe data x1, …, xn.
The likelihood is
    L(θ) = ∏_{i=1}^n fθ(xi)
and the log-likelihood is
    l(θ) = ∑_{i=1}^n log fθ(xi)
The maximum likelihood estimator θ̂ maximizes l(·), and is an estimator for θo. Under some regularity conditions on fθ, for n large
    θ̂ ≈ N(θo, 1/(nI(θo)))
where I(θ) is called the information, and is defined as
    I(θ) = E[(∂ log fθ(X)/∂θ)²]
Notice that X is capitalized above: it denotes that the expectation is being taken with respect to X and its distribution. Also note that the derivative is with respect to θ.
It is often the case that the information can also be expressed as
    I(θ) = −E[∂² log fθ(X)/∂θ²]
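A sketch of why the two expressions agree, assuming ∫ fθ(x) dx = 1 may be differentiated twice under the integral sign: since
    ∂² log fθ/∂θ² = (∂²fθ/∂θ²)/fθ − (∂ log fθ/∂θ)²
and E[(∂²fθ(X)/∂θ²)/fθ(X)] = ∫ ∂²fθ/∂θ² dx = ∂²/∂θ² ∫ fθ dx = 0, taking expectations gives −E[∂² log fθ(X)/∂θ²] = E[(∂ log fθ(X)/∂θ)²].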
For θ̃ any unbiased estimator of θo, we have a lower bound (the Cramér–Rao bound) on the variance of θ̃:
    Var(θ̃) ≥ 1/(nI(θo))
Comparing with the asymptotic distribution above, the MLE asymptotically attains this bound.
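As a quick numerical illustration of the normal approximation (the model here is an assumption of this sketch, not from the notes): if fθ is the N(θ, 1) density, the MLE is the sample mean and I(θ) = 1, so the spread of θ̂ across replications should match 1/√(nI(θo)).

import numpy as np

rng = np.random.default_rng(0)
theta0, n, reps = 3.0, 200, 4000   # true mean, sample size, replications

# For N(theta, 1) the MLE of theta is the sample mean, and I(theta) = 1
theta_hats = rng.normal(theta0, 1.0, size=(reps, n)).mean(axis=1)

print("empirical sd of theta-hat:", theta_hats.std())
print("theory 1/sqrt(n I(theta)):", 1 / np.sqrt(n))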
Example: X1, …, Xn iid with density fσ(x) = (1/(2σ)) exp(−|x|/σ).
Method of Moments
1. E(X) = 0 by symmetry, so match the second moment:
   E(X²) = ∫_{−∞}^{∞} x² fσ(x) dx = 2σ² (the integral is evaluated below)
2. σ = √(E(X²)/2)
3. σ̂ = √((1/(2n)) ∑_{i=1}^n xi²)
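For reference, the second-moment integral evaluates with the substitution u = x/σ (equivalently, via Γ(3) = 2):
    E(X²) = 2 ∫_0^∞ x² (1/(2σ)) e^{−x/σ} dx = σ² ∫_0^∞ u² e^{−u} du = 2σ².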
Maximum Likelihood
1. Likelihood function: L(σ) = ∏_{i=1}^n (1/(2σ)) exp(−|xi|/σ) = (2σ)^{−n} exp(−(1/σ) ∑_{i=1}^n |xi|)
   l(σ) = −n log(2σ) − (1/σ) ∑_{i=1}^n |xi|
   ∂l(σ)/∂σ = −n/σ + (1/σ²) ∑_{i=1}^n |xi|
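Setting this score to zero and solving for σ gives the MLE:
    −n/σ + (1/σ²) ∑ |xi| = 0  ⇒  σ̂ = (1/n) ∑_{i=1}^n |xi|.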
Asymptotic Variance of σ̃
log fσ(x) = −log(2σ) − |x|/σ
∂ log fσ(x)/∂σ = −1/σ + |x|/σ²
∂² log fσ(x)/∂σ² = 1/σ² − 2|x|/σ³
I(σ) = −E[∂² log fσ(X)/∂σ²] = −1/σ² + 2E|X|/σ³ = 1/σ², using E|X| = σ.
So the asymptotic variance is 1/(nI(σ)) = σ²/n.
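A simulation sketch (an addition, not part of the notes) checking that σ̂ = (1/n) ∑ |xi| has variance close to σ²/n; numpy's laplace sampler uses the same scale parameterization as fσ above.

import numpy as np

rng = np.random.default_rng(0)
sigma, n, reps = 2.0, 500, 5000   # true scale, sample size, replications

# MLE for each replication: sigma-hat = mean of |x_i|
sigma_hats = np.abs(rng.laplace(0.0, sigma, size=(reps, n))).mean(axis=1)

print("empirical variance of sigma-hat:", sigma_hats.var())
print("theory sigma^2 / n             :", sigma**2 / n)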
Example: X1, …, Xn iid with density fθ(x) = (θ+1)xθ for 0 < x < 1.
Method of Moments
1. E(X) = ∫_0^1 (θ+1)x^{θ+1} dx = (θ+1)/(θ+2)
2. θ = (2E(X) − 1)/(1 − E(X)) (see the algebra after this list)
3. θ̂ = (2x̄ − 1)/(1 − x̄)
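The algebra behind step 2: writing m = E(X) = (θ+1)/(θ+2), cross-multiplying gives m(θ+2) = θ+1, so θ(m−1) = 1−2m and θ = (2m−1)/(1−m); step 3 replaces m with the sample mean x̄.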
Maximum Likelihood
1. Likelihood function: L(θ) = ∏_{i=1}^n (θ+1)xiθ = (θ+1)^n (∏_{i=1}^n xi)^θ
   l(θ) = n log(θ+1) + θ ∑_{i=1}^n log xi
   ∂l(θ)/∂θ = n/(θ+1) + ∑_{i=1}^n log xi
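Setting the score to zero: n/(θ+1) + ∑ log xi = 0 ⇒ θ̂ = −n/∑_{i=1}^n log xi − 1. Since 0 < xi < 1, ∑ log xi < 0, so θ̂ > −1, as required for fθ to be a density.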
Asymptotic Variance of θ̃
log fθ(x) = log(θ+1) + θ log x
∂ log fθ(x)/∂θ = 1/(θ+1) + log x
∂² log fθ(x)/∂θ² = −1/(θ+1)²
I(θ) = −E[∂² log fθ(X)/∂θ²] = 1/(θ+1)²
So the asymptotic variance is 1/(nI(θ)) = (θ+1)²/n.
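A matching simulation sketch (again an addition, not from the notes) checking Var(θ̂) ≈ (θ+1)²/n. Since Fθ(x) = x^{θ+1} on (0, 1), inverse-CDF sampling gives X = U^{1/(θ+1)} for U uniform on (0, 1).

import numpy as np

rng = np.random.default_rng(0)
theta0, n, reps = 1.5, 500, 5000   # true theta, sample size, replications

# Inverse-CDF sampling: F(x) = x^(theta+1)  =>  X = U^(1/(theta+1))
x = rng.uniform(size=(reps, n)) ** (1.0 / (theta0 + 1.0))

# MLE for each replication: theta-hat = -n / sum(log x_i) - 1
theta_hats = -1.0 - n / np.log(x).sum(axis=1)

print("empirical variance of theta-hat:", theta_hats.var())
print("theory (theta+1)^2 / n         :", (theta0 + 1) ** 2 / n)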