Chapter 6
Point Estimation

6.1 General Concepts of Point Estimation

A point estimator of a parameter θ is a single number that can be regarded as a sensible value for θ. A point estimator can be obtained by selecting a suitable statistic and computing its value from the given sample data.

Unbiased Estimator

A point estimator θ̂ is said to be an unbiased estimator of θ if E(θ̂) = θ for every possible value of θ. If θ̂ is biased, the difference E(θ̂) − θ is called the bias of θ̂.
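The definitions above can be checked numerically. The following minimal sketch (not from the source; the Uniform(0, θ) setup, the sample size, and the true θ = 5 are assumptions chosen for illustration) uses the sample maximum as a point estimator of the upper endpoint of a Uniform(0, θ) distribution and estimates its bias by simulation:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 5.0            # true parameter (assumed for illustration)
n, reps = 10, 100_000  # sample size and number of simulated samples

# Hypothetical estimator: theta_hat = max(X1, ..., Xn) for a Uniform(0, theta) sample.
# Theory gives E(theta_hat) = n/(n+1) * theta, so the bias is -theta/(n+1).
samples = rng.uniform(0.0, theta, size=(reps, n))
theta_hat = samples.max(axis=1)

bias_est = theta_hat.mean() - theta   # Monte Carlo estimate of E(theta_hat) - theta
bias_theory = -theta / (n + 1)        # exact bias for this estimator
print(bias_est, bias_theory)
```

The simulated bias should agree closely with the theoretical value, illustrating that E(θ̂) − θ need not be zero: this estimator systematically underestimates θ.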
Unbiased Estimation

Let X1, X2, …, Xn be a random sample from a distribution with mean µ and variance σ². Then the estimator

    σ̂² = S² = Σ(Xi − X̄)² / (n − 1)

is an unbiased estimator of σ².

Unbiased Estimator

If X1, X2, …, Xn is a random sample from a distribution with mean µ, then X̄ is an unbiased estimator of µ. If in addition the distribution is continuous and symmetric, then the sample median and any trimmed mean are also unbiased estimators of µ.

Principle of Minimum Variance Unbiased Estimation

Among all estimators of θ that are unbiased, choose the one that has minimum variance. The resulting θ̂ is called the minimum variance unbiased estimator (MVUE) of θ.

[Figure: graphs of the pdf's of two different unbiased estimators, pdf of θ̂1 and pdf of θ̂2, plotted against θ.]
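The unbiasedness of S² can be illustrated by simulation. This sketch (not from the source; the normal population with σ² = 4 and the sample size n = 5 are assumptions for illustration) compares the divide-by-(n − 1) estimator with the divide-by-n version:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma2 = 0.0, 4.0     # assumed true mean and variance
n, reps = 5, 200_000      # small samples, many replications

x = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))

s2_unbiased = x.var(axis=1, ddof=1)  # divides by n - 1: E(S^2) = sigma^2
s2_biased = x.var(axis=1, ddof=0)    # divides by n: E = (n-1)/n * sigma^2

print(s2_unbiased.mean())  # should be close to 4.0
print(s2_biased.mean())    # should be close to (4/5) * 4.0 = 3.2
```

Averaging over many samples, the ddof=1 estimator centers on σ², while the ddof=0 version is pulled down by the factor (n − 1)/n, which is exactly the bias the n − 1 divisor corrects.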
MVUE for a Normal Distribution

Let X1, X2, …, Xn be a random sample from a normal distribution with parameters µ and σ. Then the estimator µ̂ = X̄ is the MVUE for µ.

A biased estimator that is preferable to the MVUE

[Figure: pdf of θ̂1 (biased) and pdf of θ̂2 (the MVUE).]

Standard Error

The standard error of an estimator θ̂ is its standard deviation σθ̂ = √V(θ̂). If the standard error itself involves unknown parameters whose values can be estimated, substituting those estimates into σθ̂ yields the estimated standard error of the estimator, denoted σ̂θ̂ or sθ̂.

6.2 Methods of Point Estimation
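As a concrete case of the standard-error definition, consider X̄: its standard error is σ/√n, which involves the unknown σ. This sketch (not from the source; the sample's population parameters are assumed for illustration) substitutes the sample standard deviation s to get the estimated standard error:

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical sample; to the analyst, mu and sigma are unknown.
x = rng.normal(10.0, 3.0, size=50)

xbar = x.mean()
# True standard error of X-bar is sigma / sqrt(n). Since sigma is
# unknown, substitute the sample standard deviation s for it:
s = x.std(ddof=1)
se_hat = s / np.sqrt(len(x))  # estimated standard error of X-bar
print(xbar, se_hat)
```

Here se_hat plays the role of σ̂θ̂ (equivalently sθ̂) for the estimator θ̂ = X̄.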
Moments

Let X1, X2, …, Xn be a random sample from a pmf or pdf f(x). For k = 1, 2, …, the kth population moment, or kth moment of the distribution f(x), is E(X^k). The kth sample moment is (1/n) Σ Xi^k, where the sum runs over i = 1, …, n.

Moment Estimators

Let X1, X2, …, Xn be a random sample from a distribution with pmf or pdf f(x; θ1, …, θm), where θ1, …, θm are parameters whose values are unknown. Then the moment estimators θ̂1, …, θ̂m are obtained by equating the first m sample moments to the corresponding first m population moments and solving for θ1, …, θm.

Likelihood Function

Let X1, X2, …, Xn have joint pmf or pdf f(x1, …, xn; θ1, …, θm), where the parameters θ1, …, θm have unknown values. When x1, …, xn are the observed sample values and f is regarded as a function of θ1, …, θm, it is called the likelihood function.

Maximum Likelihood Estimators

The maximum likelihood estimates (mle's) θ̂1, …, θ̂m are those values of the θi's that maximize the likelihood function, so that

    f(x1, …, xn; θ̂1, …, θ̂m) ≥ f(x1, …, xn; θ1, …, θm)  for all θ1, …, θm.

When the Xi's are substituted in place of the xi's, the maximum likelihood estimators result.
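Both methods can be sketched for a single example. For an exponential distribution with rate λ, E(X) = 1/λ, so equating the first sample moment X̄ to 1/λ gives the moment estimator λ̂ = 1/X̄; for this family the MLE happens to coincide with it. The code below (not from the source; the true λ = 2 and sample size are assumptions for illustration) maximizes the log-likelihood over a grid and compares against the closed form:

```python
import numpy as np

rng = np.random.default_rng(3)
lam_true = 2.0  # assumed true rate parameter
x = rng.exponential(1.0 / lam_true, size=1_000)

# Log-likelihood of an exponential sample: n*log(lam) - lam * sum(x).
def log_likelihood(lam, x):
    return len(x) * np.log(lam) - lam * x.sum()

# Maximize numerically over a grid of candidate rates; the closed-form
# MLE is 1 / x-bar, which for this family equals the moment estimator.
grid = np.linspace(0.1, 5.0, 2000)
lam_grid = grid[np.argmax([log_likelihood(l, x) for l in grid])]
lam_closed = 1.0 / x.mean()
print(lam_grid, lam_closed)
```

The grid maximizer agrees with 1/X̄ up to the grid spacing, illustrating that the mle is the parameter value under which the observed sample is most likely.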