
Stat 513

HW 5 Solutions
February 26, 2004

• 8.3.3
Let X1 , ..., Xn be iid N (µ, σ²).

(a) Find a UMVUE for µ if σ² is known, and show that this estimator is CAN.
Since σ² is known, Σ xi is a complete sufficient statistic for µ (as seen in class
examples, previous homework, or by writing out the likelihood in exponential family form).
Thus, taking X̄ as our estimator of µ, we have an estimator that is unbiased and based on a
complete sufficient statistic, and thus, by Theorem 8.3.8 (Lehmann-Scheffé), is UMVUE for µ.
Now to show it is CAN (also done before), we can appeal to the Lindeberg-Lévy CLT
and know that:

√n (X̄ − µ) →d N (0, σ²)

(b) Find a UMVUE for σ² if µ is known, and show that this estimator is CAN.
When µ is known, we can write the likelihood as follows:

f (x | σ²) = exp{ −(n/2) ln(2πσ²) − (1/(2σ²)) Σ (xi − µ)² }

Thus, Σ (xi − µ)² is complete and sufficient for σ², since we also have the additional
requirement that Θ = (0, ∞) contains a 1-dim rectangle.

Now noting that E( Σ (xi − µ)² ) = nσ², we know by the Lehmann-Scheffé Theorem again
that T (X) = (1/n) Σ (xi − µ)² is UMVUE for σ² (when µ is known).

Now for the CAN part, again by the Lindeberg-Lévy CLT (and previous homework) we
know that

√n (σ̂² − σ²) →d N (0, 2σ⁴)
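The two limiting results above can be sanity-checked with a quick simulation. This is illustration only, not part of the proof; the values of µ, σ, n, and the replication count below are arbitrary choices.

```python
import random
import statistics

# Simulate sqrt(n)*(Xbar - mu) and sqrt(n)*(sigma2hat - sigma^2) many times and
# compare their empirical variances to the CLT limits sigma^2 and 2*sigma^4.
random.seed(0)
mu, sigma = 2.0, 3.0      # arbitrary true parameters for the sketch
n, reps = 400, 2000

z_mean, z_var = [], []
for _ in range(reps):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    s2hat = sum((x - mu) ** 2 for x in xs) / n   # mu known, as in part (b)
    z_mean.append(n ** 0.5 * (xbar - mu))
    z_var.append(n ** 0.5 * (s2hat - sigma ** 2))

print(statistics.variance(z_mean))   # near sigma^2 = 9
print(statistics.variance(z_var))    # near 2*sigma^4 = 162
```

Both empirical variances should land near the stated limits; they tighten further as n and the number of replications grow.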

• 8.3.6
Let X1 , ..., Xn be iid Γ(α − 1, β, 0).

(a) Find a BUE for αβ.

A good place to start is finding a complete sufficient statistic, which is usually easy to find
via exponential family form. In the context of this problem:

f (x | α, β) = exp{ −n ln(β^α Γ(α)) + (α − 1) ln Π xi − (1/β) Σ xi }

From here we can see that ( Π xi , Σ xi ) is jointly complete and sufficient for (α, β), since we
have the proper exponential family form and Θ = (0, ∞) × (0, ∞) contains a two-dimensional
rectangle. Conveniently, E(Xi ) = αβ (using the book's parameterization), which means
that X̄ is unbiased for αβ, and it is a function of our jointly complete sufficient statistic
(specifically 0 · Π xi + (1/n) · Σ xi ). Now appealing to the Lehmann-Scheffé Theorem again,
we have that T (X) = X̄ is UMVUE (hence BUE) for αβ.

(b) Show the estimator from part (a) is CAN.

The Lindeberg-Lévy CLT applies here too:

√n (X̄ − αβ) →d N (0, αβ²)
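A quick Monte Carlo check of the unbiasedness claim (illustration only; the α and β below are arbitrary, and Python's `random.gammavariate` uses the shape–scale parameterization, so its mean is αβ):

```python
import random

# Check that the sample mean lands near alpha*beta for a gamma sample.
random.seed(1)
alpha, beta = 2.5, 1.5          # arbitrary shape and scale for the sketch
n = 200_000
xbar = sum(random.gammavariate(alpha, beta) for _ in range(n)) / n
print(xbar)                      # close to alpha*beta = 3.75
```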

• 8.3.7
Let X1 , ..., Xn be iid E(θ), with density θe^{−θx} 1[0,∞) (x). Show that (n − 1)/ Σ Xi is the unique MVUE.

f_X (x) = Π_{i=1}^n θe^{−θxi} 1[0,∞) (xi )
        = e^{−(θ Σ xi − n log θ)} 1[0,∞) (min(xi ))

This is exponential family form, with h(x) = 1[0,∞) (min(xi )), T (X) = Σ xi , τ (θ) = θ and
A(θ) = n log θ. Θ = (the set of all possible θs) = (0, ∞). This contains a 1-dim open rectangle (e.g.
(0, 1)). Therefore we can apply our theorem about exponential families and complete sufficient
statistics: Σ xi is the complete sufficient statistic for θ. Now (n − 1)/ Σ Xi is a function of this
complete sufficient statistic, so if it has expectation θ, it must be the UMVUE. Recall that the
sum of n Exponential(θ) random variables is Gamma(n, θ). Let Y = Σ Xi ∼ Gamma(n, θ).

E( (n − 1)/Y ) = ∫_0^∞ (n − 1)/y · y^{n−1} θ^n / Γ(n) · e^{−θy} dy
              = θ ∫_0^∞ y^{n−2} θ^{n−1} / Γ(n − 1) · e^{−θy} dy
              = θ

since the integrand in the second line is the density of a Gamma(n − 1, θ). (Here I used the rate
parameterization of the Gamma.)
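The integral above can also be checked numerically; the sketch below uses a simple midpoint rule with arbitrary choices n = 5 and θ = 2.

```python
import math

n, theta = 5, 2.0               # arbitrary values for the check

def integrand(y):
    # (n-1)/y times the Gamma(n, rate=theta) density
    dens = theta ** n * y ** (n - 1) * math.exp(-theta * y) / math.gamma(n)
    return (n - 1) / y * dens

# Midpoint rule on (0, 30]; the integrand is negligible beyond that here.
m, upper = 200_000, 30.0
h = upper / m
approx = sum(integrand((i + 0.5) * h) for i in range(m)) * h
print(approx)                    # equals theta up to quadrature error
```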

• 8.3.9
In problem 8.2.15, find a BUE of µ/(1 − e^{−µ}) and of 1 − e^{−µ}.

From 8.2.15: Let X1 , ..., Xn be iid truncated Poisson, where p(x|µ) = e^{−µ} µ^x /(x!(1 − e^{−µ})) 1{1,2,...} (x).
Also from 8.2.15, we know that Σ xi is complete and sufficient for µ under this distribution.
Now noting that E(X) = µ/(1 − e^{−µ}), it follows immediately from Lehmann-Scheffé that X̄ is
UMVUE for µ/(1 − e^{−µ}).
Now for 1 − e^{−µ}, first find an unbiased estimator based on only one observation:

E(t(X1 )) = 1 − e^{−µ}

Σ_{k=1}^∞ t(k) µ^k e^{−µ} / (k! (1 − e^{−µ})) = 1 − e^{−µ}

Σ_{k=1}^∞ t(k) µ^k / k! = e^µ − 2 + e^{−µ}

Σ_{k=1}^∞ t(k) µ^k / k! = Σ_{k=1}^∞ µ^k / k! + Σ_{k=1}^∞ (−µ)^k / k!

Σ_{k=1}^∞ t(k) µ^k / k! = Σ_{k=1}^∞ (1 + (−1)^k ) µ^k / k!

Matching coefficients of µ^k gives t(X1 ) = 1 + (−1)^{X1}.

Use the refinement theorem to obtain the BUE of 1 − e^{−µ}: T (X) = E( t(X1 ) | Σ Xi ). The sum
of the Xi does not have a particularly nice conditional distribution here, so we leave it at this.
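The coefficient-matching result can be verified numerically by summing the series for a few values of µ (an illustration; the series is truncated at k = 60, which is far past where the terms matter):

```python
import math

def truncated_pmf(k, mu):
    # pmf of the truncated Poisson from 8.2.15, supported on {1, 2, ...}
    return math.exp(-mu) * mu ** k / (math.factorial(k) * (1 - math.exp(-mu)))

for mu in (0.5, 1.0, 3.0):
    expectation = sum((1 + (-1) ** k) * truncated_pmf(k, mu) for k in range(1, 60))
    print(expectation, 1 - math.exp(-mu))   # the two columns agree
```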


• 8.3.11
Let X be a single observation from a Poisson distribution with unknown parameter µ.

(a) Find a BUE for µ². [Hint: Note that E(X(X − 1)) = µ²; work it out using Var(X) =
E(X²) − (E(X))² if you don't see it immediately.]
First note that when there is just a single observation, the complete sufficient statistic for µ
in a Poisson distribution is just x itself. Showing this via exponential family form:

f (x|µ) = exp{ x ln(µ) − µ − ln(x!) }

and we note Θ = (0, ∞) contains the necessary 1-dim rectangle. Now, in accordance with
the hint, we know T (X) = x(x − 1) is UMVUE (via Lehmann-Scheffé) for µ².

(b) If we have a random sample of size n, find the BUE for µ².

For a sample of size n, our complete sufficient statistic for µ is Sn = Σ xi , and we know
that this statistic follows P(nµ). Now, in similar form to part (a), we can
see E(Sn (Sn − 1)) = n²µ², so if we take T (X) = (1/n²) Sn (Sn − 1) then we have an unbiased
estimator of µ² based on the complete sufficient statistic for µ, and it is thus UMVUE
(BUE) by Lehmann-Scheffé.

(c) Based on a sample of size 1, find the BUE for µ^r , r > 1. [Hint: Compute E(X(X − 1)(X −
2) · · · (X − r + 1)).]
Back to the situation as in part (a), with just x being complete and sufficient. Now, as
suggested by the hint, if we calculate the expected value there (very similar to homework 3 in
512) we see that the expression X(X − 1) · · · (X − r + 1) is unbiased for µ^r , and we have the
BUE since it is a function of our complete sufficient statistic.
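The factorial-moment identity behind parts (a)–(c) can be checked by direct summation (illustration only; µ = 2 is an arbitrary choice and the series is truncated at k = 80):

```python
import math

def falling(x, r):
    # falling factorial x(x-1)...(x-r+1)
    out = 1
    for j in range(r):
        out *= x - j
    return out

mu = 2.0
for r in (2, 3, 4):
    moment = sum(falling(k, r) * math.exp(-mu) * mu ** k / math.factorial(k)
                 for k in range(80))
    print(r, moment)             # moment equals mu**r: 4, 8, 16
```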

• 9.2.1
An urn has 10 balls with θ of them blue (the rest being whatever color you want, other than
blue). We’re in the following hypothesis testing situation:
H0 : θ = 3 vs. H1 : θ = 4
Suppose a sample consists of 3 balls, and the decision rule is to reject H0 if all 3 balls in the
sample are blue. Compute the errors α (Type I) and β (Type II) when:

(a) Sampling is done without replacement.

α = P (all three balls are blue | θ = 3)
  = θ/10 · (θ − 1)/9 · (θ − 2)/8 = θ(θ − 1)(θ − 2)/720 = 6/720 = 1/120

β = 1 − P (all three balls are blue | θ = 4)
  = 1 − 24/720 = 1 − 1/30 = 29/30

(b) Sampling is done with replacement.

α = P (all three balls are blue | θ = 3)
  = (θ/10)³ = 27/1000

β = 1 − P (all three balls are blue | θ = 4)
  = 1 − 64/1000 = 936/1000
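Both parts can be reproduced with exact rational arithmetic:

```python
from fractions import Fraction as F

def p_all_blue_without(theta):
    # P(all 3 draws blue) without replacement from 10 balls, theta of them blue
    return F(theta, 10) * F(theta - 1, 9) * F(theta - 2, 8)

def p_all_blue_with(theta):
    # P(all 3 draws blue) with replacement
    return F(theta, 10) ** 3

alpha_a = p_all_blue_without(3)        # 1/120
beta_a = 1 - p_all_blue_without(4)     # 29/30
alpha_b = p_all_blue_with(3)           # 27/1000
beta_b = 1 - p_all_blue_with(4)        # 936/1000 = 117/125
print(alpha_a, beta_a, alpha_b, beta_b)
```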

• 9.2.5
Suppose we have a random sample of size 5 from a Poisson distribution with mean λ ∈ {2, 3}.
We want to test the following:
H0 : λ = 3 vs. H1 : λ = 2
with rejection when X̄ < c. Find the critical region to use for this test if α is set at 0.05 (or as
close to 0.05 as possible using a Poisson table).
The appropriate critical region will be such that

P r(X̄ < c | λ = 3) = P r( Σ_{i=1}^5 xi < 5c | λ = 3)

Now, since the sum of n independent Poisson r.v.'s is itself Poisson with mean nλ, our test
statistic Σ xi is Poisson(15) under H0 . With respect to the Poisson table on page 770, we can
see that rejecting when Σ xi < 9 keeps the test at level 0.05 (the exact level is about 0.037).
This implies the value of c we're looking for is 9/5 = 1.8.
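The table lookup can be replaced by computing the Poisson(15) cdf directly; this sketch finds the largest rejection cutoff whose level stays at or below 0.05:

```python
import math

def poisson_cdf(k, lam):
    # P(N <= k) for N ~ Poisson(lam)
    return sum(math.exp(-lam) * lam ** j / math.factorial(j) for j in range(k + 1))

lam = 5 * 3                          # n = 5 observations, lambda = 3 under H0
k = 0
while poisson_cdf(k, lam) <= 0.05:   # P(sum <= k) = P(sum < k + 1)
    k += 1
# The loop stops at the smallest k with cdf > 0.05, so we reject when sum < k.
print(k, k / 5)                      # cutoff 9, i.e. c = 1.8
```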
