Chapter 5 Limit Theorems
Example: Let X(n) denote the nth order statistic of a random sample X1, X2, ..., Xn from a distribution having pdf

f(x) = 1/θ, 0 < x < θ, zero elsewhere.

Show that X(n) converges in probability to θ. Recall from last lecture, we found that for 0 < x < θ,
Fn(x) = P[X(n) ≤ x] = ∏_{i=1}^n P[Xi ≤ x] = ∏_{i=1}^n ∫_0^x (1/θ) du = (x/θ)^n
Then, for 0 < ε < θ,

P[|X(n) − θ| ≥ ε] = P[X(n) ≤ θ − ε] = ((θ − ε)/θ)^n → 0 as n → ∞,

so X(n) converges in probability to θ. The same idea shows that √X(n) converges in probability to √θ: for ε > 0,

P[√θ − √X(n) > ε] = P[(√θ − √X(n))(√θ + √X(n)) > ε(√θ + √X(n))]
≤ P[θ − X(n) > ε√θ]

which we know converges to 0 by the same argument as above.
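As a numerical sanity check (this simulation is my addition, not part of the notes; the value θ = 3 and the Monte Carlo sizes are arbitrary choices), we can estimate P[θ − X(n) > ε] by simulating uniform samples:

```python
import random

random.seed(1)
theta = 3.0  # hypothetical value of theta, for illustration only
eps = 0.1

def max_of_sample(n):
    """X(n): the largest of n independent Uniform(0, theta) draws."""
    return max(random.uniform(0, theta) for _ in range(n))

# Monte Carlo estimate of P[theta - X(n) > eps]; theory gives ((theta - eps)/theta)^n
for n in (10, 100, 1000):
    est = sum(theta - max_of_sample(n) > eps for _ in range(2000)) / 2000
    print(n, est)
```

The estimates fall toward 0 as n grows, matching ((θ − ε)/θ)^n → 0.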
5.3 Convergence in Distribution and the Central Limit Theorem
We have seen examples of random variables that are created by applying a function
to the observations of a random sample. The distribution of such a statistic often
depends on n, the size of the sample.
For example, let X̄ denote the sample mean of a random sample X1, X2, ..., Xn from the distribution N (µ, σ 2). The distribution of X̄ is N (µ, σ 2/n), which depends on the sample size n.
Let X(n) = max{X1, X2, ..., Xn}. X(n) is the nth order statistic in a sample of size
n. We have seen that the distribution function for X(n) is
Fn(x) = [F (x)]n
(In general) Let Xn denote a random variable whose distribution function Fn(x)
depends on n for n = 1, 2, 3, .... In many cases, we are interested in knowing if Fn(x)
converges to some fixed distribution function F (x).
An infinite sequence of real numbers a1, a2, a3, ... has limit a (converges to a), if for any number ε > 0 there is an integer n0 such that for any n > n0,

|a − an| < ε.

In this case we write

lim_{n→∞} an = a.
For example, consider a sequence where an = 1 + 1/n. Show that the limit of the sequence is the number 1. Let ε > 0 and select n0 to be the smallest integer greater than 1/ε. Then for any n > n0,

|1 − an| = 1/n < ε.
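As a mechanical check of this argument (the code and its names are my addition, not the notes'), a few lines of Python confirm the choice of n0:

```python
import math

def n0_for(eps):
    # smallest integer strictly greater than 1/eps, as in the proof above
    return math.floor(1 / eps) + 1

eps = 0.01
n0 = n0_for(eps)
# |1 - a_n| = 1/n < eps for every n > n0
print(all(abs(1 - (1 + 1 / n)) < eps for n in range(n0 + 1, n0 + 2000)))  # True
```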
Using this notion, we say that the function F (y) is continuous at y if for any sequence y1, y2, ... such that

lim_{n→∞} yn = y,

we also have

lim_{n→∞} F (yn) = F (y).
Clearly Fn(x) = 0 for x < 0 and Fn(x) = 1 for θ ≤ x < ∞, and for 0 ≤ x < θ,
Fn(x) = P [X(n) ≤ x] = ∏_{i=1}^n P [Xi ≤ x] = ∏_{i=1}^n ∫_0^x (1/θ) du = (x/θ)^n
We see that lim_{n→∞} Fn(x) = 0 for x < θ and lim_{n→∞} Fn(x) = 1 for x ≥ θ, so the limiting distribution places all of its mass at the single point θ.
A distribution that places all of its mass on a single point is called a degenerate
distribution.
Zn = n(θ − X(n)).
Because X(n) < θ with probability equal to 1, we can say that for z < 0, Fn(z) = 0
for all n, where Fn is the distribution function of Zn.
Also, because X(n) > 0 with probability 1, we can say that Fn(z) = 1 for z ≥ nθ.
Now let z take a value in [0, nθ). Then

Fn(z) = P [n(θ − X(n)) ≤ z] = P [X(n) > θ − z/n] = 1 − P [X(n) ≤ θ − z/n]
= 1 − ((θ − z/n)/θ)^n = 1 − (1 − z/(nθ))^n
To investigate the limit of Fn(z), notice that for any z, z < nθ when n is taken
sufficiently large. Also, we use the fact that for a real number x,
lim_{n→∞} (1 + x/n)^n = e^x.
Thus, for z ≥ 0,

lim_{n→∞} Fn(z) = 1 − e^{−z/θ},

which is the distribution function of an exponential distribution with mean θ. That is, Zn = n(θ − X(n)) converges in distribution to an exponential random variable.

Example: Let Xn be a random variable that equals 2 + 1/n with probability 1, so that its pmf is fn(x) = 1 for x = 2 + 1/n and fn(x) = 0 elsewhere. For every fixed x,

lim_{n→∞} fn(x) = 0.

We see that fn converges to the constant function f (x) = 0, which is not a pdf. However, Fn(x) converges to the function

F (x) = 0 for x < 2,    F (x) = 1 for x ≥ 2

at all the continuity points of F (x) (all points other than x = 2). F (x) is the cdf of a random variable that equals 2 with probability 1.
Example 4: Let X1, X2, ..., Xn be a random sample from the distribution N (µ, σ 2).
Show that

Zn = ∑_{i=1}^n Xi

does not have a limiting distribution. Since Zn has the N (nµ, nσ 2) distribution,

Fn(z) = P [Zn ≤ z] = P [(Zn − nµ)/(σ√n) ≤ (z − nµ)/(σ√n)] = Φ((z − nµ)/(σ√n))

If µ > 0, then (z − nµ)/(σ√n) → −∞ and Fn(z) → 0 for every z; if µ = 0, then Fn(z) → Φ(0) = 1/2; and if µ < 0, then Fn(z) → 1. In all three cases, Fn converges to a constant value, and does not converge to a distribution function.
Theorem: Let {Xn} and {Yn} be sequences of random variables such that Xn converges in distribution to a random variable X, and Yn converges in probability to a constant c. Then, it follows that
a. Xn + Yn converges in distribution to X + c
b. XnYn converges in distribution to cX
c. Xn/Yn converges in distribution to X/c, provided c ≠ 0
We may use the notation: limn→∞M (t; n) = M (t) for Mn(t) → M (t).
We will also use the following limit result: if b and c do not depend on n and

lim_{n→∞} ψ(n) = 0,

then

lim_{n→∞} (1 + b/n + ψ(n)/n)^{cn} = lim_{n→∞} (1 + b/n)^{cn} = e^{bc}.
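To see this limit numerically (the particular b, c, and ψ below are illustrative choices of mine, not from the notes), compare the finite-n expression with e^{bc}:

```python
import math

def power_expr(b, c, psi, n):
    """(1 + b/n + psi(n)/n)^(cn), which should approach e^(bc)."""
    return (1 + b / n + psi(n) / n) ** (c * n)

b, c = -2.0, 0.5
psi = lambda n: 1 / math.sqrt(n)  # any psi with psi(n) -> 0 works
for n in (10, 1000, 100000):
    print(n, power_expr(b, c, psi, n), math.exp(b * c))
```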
Example 5: Let Yn have a distribution that is b(n, pn) where pn = µ/n for some
µ > 0. The mgf of Yn is
M (t; n) = [(1 − µ/n) + (µ/n)e^t]^n = [1 + µ(e^t − 1)/n]^n

for all t, so

lim_{n→∞} M (t; n) = e^{µ(e^t − 1)}.

This is the moment generating function of a Poisson distribution with mean µ.
This example indicates that we may use a Poisson distribution to approximate prob-
abilities of events concerning binomial random variables when p is small and n is
very large.
For example, suppose that Y has distribution b(50, .04). Then

P [Y ≤ 1] = (24/25)^50 + 50(1/25)(24/25)^49 ≈ 0.400.
However, we can say that Y has approximately the same distribution as a Poisson random variable W that has mean µ = np = 50(.04) = 2. For example, comparing the two cdfs in R:
cbind(y,Fy,Fw)
y Fy Fw
[1,] 0 0.1298858 0.1353353
[2,] 1 0.4004812 0.4060058
[3,] 2 0.6767140 0.6766764
[4,] 3 0.8608692 0.8571235
[5,] 4 0.9510285 0.9473470
[6,] 5 0.9855896 0.9834364
[7,] 6 0.9963899 0.9954662
[8,] 7 0.9992186 0.9989033
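The same comparison can be reproduced in pure Python (the function names here are my additions); binom_cdf gives the Fy column and pois_cdf gives the Fw column:

```python
from math import comb, exp, factorial

n, p = 50, 0.04
mu = n * p  # 2.0

def binom_cdf(y):
    """P[Y <= y] for Y ~ b(50, .04)."""
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(y + 1))

def pois_cdf(y):
    """P[W <= y] for W ~ Poisson(2)."""
    return sum(exp(-mu) * mu**k / factorial(k) for k in range(y + 1))

for y in range(8):
    print(y, round(binom_cdf(y), 7), round(pois_cdf(y), 7))
```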
Example 6: Let Zn be χ2(n). Then the mgf of Zn is (1 − 2t)−n/2 for t < 1/2. The
mean and variance of Zn are n and 2n, respectively.
The mgf of Yn = (Zn − n)/√(2n) is

M (t; n) = E[exp(t(Zn − n)/√(2n))]
= e^{−tn/√(2n)} E[e^{tZn/√(2n)}]
= e^{−tn/√(2n)} (1 − 2t/√(2n))^{−n/2}

This holds for t < √(n/2). If we write e^{−tn/√(2n)} as

e^{−tn/√(2n)} = (e^{t√(2/n)})^{−n/2}

we see that

M (t; n) = (e^{t√(2/n)} − t√(2/n) e^{t√(2/n)})^{−n/2}

Expanding e^{t√(2/n)} by Taylor's theorem with remainder, we see that

M (t; n) = (1 − t²/n + ψ(n)/n)^{−n/2}

where

ψ(n) = √2 t³ e^{λ(n)}/(3√n) − √2 t³/√n − 2t⁴ e^{λ(n)}/(3n)

and λ(n) lies between 0 and t√(2/n). Since λ(n) → 0 as n → ∞, we have lim_{n→∞} ψ(n) = 0 for every fixed t. Applying the limit result above with b = −t² and c = −1/2 gives

lim_{n→∞} M (t; n) = e^{t²/2},

which is the mgf

M (t) = exp[µt + σ²t²/2]

of a normal distribution with µ = 0 and σ² = 1. Thus Yn converges in distribution to the standard normal distribution.
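We can watch this convergence numerically by evaluating the exact mgf of Yn at a fixed t (a small Python sketch of my own, with hypothetical function names):

```python
import math

def mgf_std_chisq(t, n):
    """Exact mgf of Yn = (Zn - n)/sqrt(2n), Zn ~ chi-square(n); requires t < sqrt(n/2)."""
    s = math.sqrt(2 * n)
    return math.exp(-t * n / s - (n / 2) * math.log(1 - 2 * t / s))

t = 1.0
target = math.exp(t * t / 2)  # mgf of N(0, 1) at t
for n in (10, 100, 10000):
    print(n, mgf_std_chisq(t, n), target)
```

As n grows, the printed values approach e^{t²/2}.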
The preceding example is really a special case of the central limit theorem. In
the next lecture, we will state and prove the central limit theorem. However, we’ll
see an empirical illustration of it here.
Example 7: Consider a random sample X1, X2, ..., Xn from a distribution with
pdf
f (x) = (3/2)x² for −1 < x < 1.
From this we find
F (x) = ∫_{−1}^x (3/2)t² dt = 1/2 + x³/2

for −1 < x < 1.
E[X] = ∫_{−1}^1 (3/2)x³ dx = 0

V ar(X) = ∫_{−1}^1 (3/2)x⁴ dx = 3/5
To simulate observations from this distribution, we can set

u = F (x) = 1/2 + x³/2

and solve for x = (2u − 1)^{1/3}, where u is an observation from Uniform(0, 1).
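The histograms below were produced by exactly this kind of simulation. Here is a Python sketch of my own reconstruction (I assume Zn denotes the standardized sample mean √n(X̄ − 0)/√(3/5); the seed and replication count are arbitrary choices):

```python
import math
import random

random.seed(42)

def draw_x():
    """Inverse-cdf sampling: solve u = 1/2 + x^3/2, i.e. x = (2u - 1)^(1/3)."""
    v = 2 * random.random() - 1
    return math.copysign(abs(v) ** (1 / 3), v)

def z(n):
    """Standardized sample mean: sqrt(n) * (xbar - 0) / sqrt(3/5)."""
    xbar = sum(draw_x() for _ in range(n)) / n
    return math.sqrt(n) * xbar / math.sqrt(3 / 5)

zs = [z(25) for _ in range(5000)]
mean = sum(zs) / len(zs)
var = sum((v - mean) ** 2 for v in zs) / len(zs)
print(mean, var)  # close to 0 and 1, as the CLT suggests
```

Plotting a histogram of zs gives pictures like Figures 2 and 3 below.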
Figure 2: Histogram of Zn for n = 5
Figure 3: Histogram of Zn for n = 25