Lecture 2
Katerina Petrova
Lag operators
Definition
The lag operator is defined as $Ly_t = y_{t-1}$, so that for any $K \in \mathbb{N}$,
$$L^K y_t = L^{K-1} y_{t-1} = L^{K-2} y_{t-2} = \dots = y_{t-K}.$$
Definition
The difference operator is defined as $\Delta y_t = (1 - L) y_t = y_t - y_{t-1}$, and
$$\Delta^k y_t = (1 - L)^k y_t, \qquad \Delta_k y_t = (1 - L^k) y_t = y_t - y_{t-k}.$$
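As a quick numerical illustration of these definitions, the lag and difference operators can be implemented directly on an array. This is a sketch, not part of the lecture; the helper names `lag` and `diff` are purely illustrative.

```python
import numpy as np

def lag(y, k=1):
    """Apply the lag operator k times: (L^k y)_t = y_{t-k}.
    The first k entries have no predecessor, so they are set to NaN."""
    out = np.full(len(y), np.nan)
    out[k:] = y[:-k]
    return out

def diff(y, k=1):
    """k-period difference: (1 - L^k) y_t = y_t - y_{t-k}."""
    return y - lag(y, k)

y = np.array([1.0, 3.0, 6.0, 10.0, 15.0])
print(lag(y))    # L y_t: nan followed by 1, 3, 6, 10
print(diff(y))   # (1 - L) y_t: nan followed by the first differences 2, 3, 4, 5
```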
Lag polynomials
Definition
We define a lag polynomial of order $p$ as
$$\alpha(L) = \alpha_0 + \alpha_1 L + \dots + \alpha_p L^p.$$
Definition
An Autoregressive Moving Average model of order $(p, q)$ (denoted by $X_t \sim ARMA(p, q)$) is defined recursively as
$$X_t = \mu + \sum_{i=1}^{p} \phi_i X_{t-i} + \sum_{j=0}^{q} \theta_j \varepsilon_{t-j} = \mu + \phi_1 X_{t-1} + \dots + \phi_p X_{t-p} + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \dots + \theta_q \varepsilon_{t-q},$$
or compactly, with $\theta_0 = 1$,
$$\phi(L) X_t = \mu + \theta(L) \varepsilon_t.$$
Definition
A Moving Average model of order $q$ (denoted by $X_t \sim MA(q)$) is just the special case $ARMA(0, q)$:
$$X_t = \mu + \sum_{j=0}^{q} \theta_j \varepsilon_{t-j} = \mu + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \dots + \theta_q \varepsilon_{t-q}, \qquad X_t = \mu + \theta(L) \varepsilon_t.$$
Definition
An Autoregressive model of order $p$ (denoted by $X_t \sim AR(p)$) is another special case, $ARMA(p, 0)$:
$$X_t = \mu + \sum_{i=1}^{p} \phi_i X_{t-i} + \varepsilon_t = \mu + \phi_1 X_{t-1} + \dots + \phi_p X_{t-p} + \varepsilon_t, \qquad \phi(L) X_t = \mu + \varepsilon_t.$$
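The recursive definition translates directly into a simulation loop. The sketch below (an illustration, not from the lecture; the parameter values are arbitrary) simulates an ARMA(1,1) path and checks its sample mean against the implied unconditional mean $\mu / (1 - \phi_1)$.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_arma(mu, phi, theta, sigma, T, burn=500):
    """Simulate X_t = mu + sum_i phi_i X_{t-i} + eps_t + sum_j theta_j eps_{t-j}
    from the recursion (theta_0 = 1 is implicit)."""
    p, q = len(phi), len(theta)
    eps = rng.normal(0.0, sigma, T + burn)
    X = np.zeros(T + burn)
    for t in range(max(p, q), T + burn):
        ar = sum(phi[i] * X[t - 1 - i] for i in range(p))
        ma = eps[t] + sum(theta[j] * eps[t - 1 - j] for j in range(q))
        X[t] = mu + ar + ma
    return X[burn:]  # drop the burn-in so the start-up values wash out

x = simulate_arma(mu=1.0, phi=[0.5], theta=[0.3], sigma=1.0, T=20000)
print(x.mean())  # should be close to mu / (1 - phi_1) = 2
```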
Example (MA(1))
For the MA(1) process $X_t = \mu + \varepsilon_t + \theta \varepsilon_{t-1}$ with $\varepsilon_t \sim WN(0, \sigma^2)$:
$$\gamma_{0t} = E[X_t - E(X_t)]^2 = E[\varepsilon_t + \theta \varepsilon_{t-1}]^2 = (1 + \theta^2) \sigma^2,$$
$$\gamma_{1t} = E[X_t - E(X_t)][X_{t-1} - E(X_{t-1})] = E[\varepsilon_t + \theta \varepsilon_{t-1}][\varepsilon_{t-1} + \theta \varepsilon_{t-2}] = \theta \sigma^2,$$
and for $j > 1$,
$$\gamma_{jt} = E[X_t - E(X_t)][X_{t-j} - E(X_{t-j})] = E[\varepsilon_t + \theta \varepsilon_{t-1}][\varepsilon_{t-j} + \theta \varepsilon_{t-j-1}] = 0.$$
More generally, for an MA(q) with $\theta_0 = 1$,
$$\gamma_j = \sigma^2 \sum_{k=0}^{q-j} \theta_k \theta_{k+j}.$$
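These autocovariance formulas are straightforward to verify by simulation. A minimal sketch with arbitrary parameter values (the helper `sample_gamma` is purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
theta, sigma, T = 0.6, 2.0, 200_000
eps = rng.normal(0.0, sigma, T + 1)
X = eps[1:] + theta * eps[:-1]          # MA(1) with mu = 0

def sample_gamma(x, j):
    """Sample autocovariance of x at lag j."""
    xc = x - x.mean()
    return np.mean(xc[j:] * xc[:len(xc) - j])

print(sample_gamma(X, 0), (1 + theta**2) * sigma**2)  # theory: (1 + 0.36) * 4 = 5.44
print(sample_gamma(X, 1), theta * sigma**2)           # theory: 0.6 * 4 = 2.4
print(sample_gamma(X, 2))                             # theory: 0 for j > 1
```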
Definition
An MA($\infty$) process $X_t$ is defined as the ($L^2$ or a.s.) limit as $n \to \infty$ of $X_{n,t} = \mu + \sum_{j=0}^{n} \theta_j \varepsilon_{t-j}$ with $\varepsilon_t \sim WN(0, \sigma^2)$:
$$X_t = \mu + \sum_{j=0}^{\infty} \theta_j \varepsilon_{t-j}.$$
Definition
$X_t$ is a linear process if it can be written as $X_t = \mu + \sum_{j=0}^{\infty} c_j \varepsilon_{t-j}$ with $\varepsilon_t \sim WN(0, \sigma^2)$ and square-summable coefficients $(c_j)_{j \geq 0}$.
Proof.
$$\|X_{M,t} - X_{N,t}\|_{L^2}^2 = \sigma^2 \sum_{j=N+1}^{M} c_j^2 = \sigma^2 \left( \sum_{j=0}^{M} c_j^2 - \sum_{j=0}^{N} c_j^2 \right).$$
Since the sequence $\left\{ C_N := \sum_{j=0}^{N} c_j^2 : N \in \mathbb{N} \right\}$ converges, it is Cauchy in $\mathbb{R}$, so that $\sum_{j=0}^{M} c_j^2 - \sum_{j=0}^{N} c_j^2$ can be made arbitrarily small for all $M > N$ and a suitable choice of $N$.
We conclude that $\exists N \in \mathbb{N}$ s.t. $\forall M > N$, $\|X_{M,t} - X_{N,t}\|_{L^2}^2$ can be made smaller than $\varepsilon$. Therefore the MA($\infty$) process $X_{n,t}$ converges in mean square (by completeness of $L^2$) as $n \to \infty$.
Proof.
It remains to show that the limit $X_t$ is in $L^2$ and to check that it is covariance stationary.
To show that square summability of the coefficients implies covariance stationarity of the MA($\infty$) process (and that $X_t$ is in $L^2$), we can compute the mean and covariances from the MA($q$) process, letting $q \to \infty$:
$$E[X_t] = \mu, \qquad \gamma_j = \sigma^2 \sum_{k=0}^{\infty} c_k c_{k+j}.$$
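For a concrete square-summable choice of coefficients, say $c_j = \rho^j$ (an assumption for illustration only), the covariance formula above can be evaluated numerically from a long truncation and compared with its closed form $\sigma^2 \rho^j / (1 - \rho^2)$:

```python
import numpy as np

# Truncated evaluation of gamma_j = sigma^2 * sum_k c_k c_{k+j}
# for the illustrative square-summable choice c_j = rho^j.
rho, sigma2 = 0.8, 1.0
K = 2000                              # truncation point; tail is O(rho^(2K))
c = rho ** np.arange(K)

def gamma(j):
    return sigma2 * np.sum(c[:K - j] * c[j:])

for j in range(4):
    print(gamma(j), sigma2 * rho**j / (1 - rho**2))  # matches the closed form
```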
Definition
For any sequence $(c_j)_{j \geq 0}$, we say that $(c_j)_{j \geq 0}$ is square summable if $\sum_{j=0}^{\infty} c_j^2 < \infty$.
Theorem
For any sequence $(c_j)_{j \geq 0}$, absolute summability $\sum_{j=0}^{\infty} |c_j| < \infty$ implies square summability $\sum_{j=0}^{\infty} c_j^2 < \infty$, but the converse does not hold.
Proof.
First we show that $\sum_{j=0}^{\infty} |c_j| < \infty$ implies that $|c_j| \to 0$ as $j \to \infty$. To see this, if $|c_j| \not\to 0$, there exists $\varepsilon > 0$ such that $|c_j| \geq \varepsilon$ for infinitely many $j$, say for all $j \in J$, where $J$ is an infinite set: $|J| = \infty$. Then
$$\sum_{j=0}^{\infty} |c_j| \geq \sum_{j \in J} |c_j| \geq \varepsilon \sum_{j \in J} 1 = \varepsilon |J| = \infty,$$
a contradiction.
Square and Absolute Summability
Proof.
Now $\sum_{j=0}^{\infty} |c_j| < \infty \implies |c_j| \to 0$ as $j \to \infty \implies |c_j| < 1$ for all but finitely many $j \implies c_j^2 < |c_j|$ for all but finitely many $j$.
Denote by $J_F = \left\{ j : c_j^2 \geq |c_j| \right\}$ and by $J_F^c$ its complement. Note that $|c_j| \to 0$ implies that $J_F$ is a finite set and $J_F^c$ is infinite. Then
$$\sum_{j=0}^{\infty} c_j^2 = \sum_{j \in J_F} c_j^2 + \sum_{j \in J_F^c} c_j^2 \leq \sum_{j \in J_F} c_j^2 + \sum_{j \in J_F^c} |c_j| \leq \sum_{j \in J_F} c_j^2 + \sum_{j=0}^{\infty} |c_j| < \infty,$$
since $\sum_{j=0}^{\infty} |c_j| < \infty$ and $\sum_{j \in J_F} c_j^2$ is a sum of finitely many terms ($|J_F|$ many).
Proof.
We show that the converse does not hold with a counterexample. Let $c_j = j^{-1}$. Then $\sum_{j=1}^{\infty} c_j^2 = \pi^2 / 6 < \infty$, whereas
$$\sum_{j=1}^{N} |c_j| = \sum_{j=1}^{N} \frac{1}{j} \geq \int_{1}^{N} \frac{1}{t} \, dt = \log N \to \infty \quad (N \to \infty),$$
so $\sum_{j=1}^{\infty} |c_j| = \infty$.
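The counterexample can also be checked numerically: the partial sums of $c_j^2$ approach $\pi^2/6$, while the partial sums of $|c_j|$ keep growing like $\log N$. A quick sketch:

```python
import numpy as np

N = 1_000_000
j = np.arange(1, N + 1, dtype=float)
c = 1.0 / j                        # the counterexample c_j = 1/j

print(np.sum(c**2), np.pi**2 / 6)  # partial sum of squares: close to pi^2/6
print(np.sum(c), np.log(N))        # absolute partial sum: already exceeds log N
```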
Example (AR(1))
Let $X_t$ be an AR(1) process with $|\rho| < 1$: $X_t = c + \rho X_{t-1} + \varepsilon_t$. Substituting recursively,
$$X_t = c + \varepsilon_t + \rho (c + \varepsilon_{t-1}) + \rho^2 X_{t-2} = c + \varepsilon_t + \rho (c + \varepsilon_{t-1}) + \rho^2 (c + \varepsilon_{t-2}) + \rho^3 X_{t-3} = \dots = c \sum_{j=0}^{\infty} \rho^j + \sum_{j=0}^{\infty} \rho^j \varepsilon_{t-j}. \quad (2)$$
Hence
$$E[X_t] = \frac{c}{1 - \rho} + E[\varepsilon_t] + \rho E[\varepsilon_{t-1}] + \rho^2 E[\varepsilon_{t-2}] + \dots = \frac{c}{1 - \rho},$$
$$\gamma_0 = E[X_t - E(X_t)]^2 = E[\varepsilon_t + \rho \varepsilon_{t-1} + \rho^2 \varepsilon_{t-2} + \dots]^2 = \sigma^2 (1 + \rho^2 + \rho^4 + \rho^6 + \dots) = \sigma^2 / (1 - \rho^2).$$
Example (AR(1))
For $j \geq 1$,
$$\gamma_j = E[X_t - E(X_t)][X_{t-j} - E(X_{t-j})]$$
$$= E[\varepsilon_t + \rho \varepsilon_{t-1} + \rho^2 \varepsilon_{t-2} + \dots + \rho^j \varepsilon_{t-j} + \rho^{j+1} \varepsilon_{t-j-1} + \rho^{j+2} \varepsilon_{t-j-2} + \dots][\varepsilon_{t-j} + \rho \varepsilon_{t-j-1} + \rho^2 \varepsilon_{t-j-2} + \dots]$$
$$= \sigma^2 (\rho^j + \rho^{j+2} + \rho^{j+4} + \dots) = \sigma^2 \rho^j / (1 - \rho^2).$$
The autocorrelation is
$$r_j = \gamma_j / \gamma_0 = \rho^j.$$
What would the autocorrelation look like as a function of the lag $j$?
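One way to see the answer is to compare the sample autocorrelations of a simulated AR(1) path with the geometric decay $\rho^j$. A sketch with arbitrary parameter values:

```python
import numpy as np

rng = np.random.default_rng(2)
rho, T = 0.7, 200_000
eps = rng.normal(size=T)
X = np.zeros(T)
for t in range(1, T):
    X[t] = rho * X[t - 1] + eps[t]     # AR(1) with c = 0

def acf(x, j):
    """Sample autocorrelation of x at lag j."""
    xc = x - x.mean()
    return np.mean(xc[j:] * xc[:len(xc) - j]) / np.mean(xc**2)

for j in range(5):
    print(j, acf(X, j), rho**j)        # the sample ACF tracks the geometric decay rho^j
```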
AR(1) with lag operator
Write the AR(1) process (taking $c = 0$ for simplicity) as $(1 - \rho L) X_t = \varepsilon_t$, so that
$$X_t = (1 - \rho L)^{-1} \varepsilon_t.$$
Recall that $(1 - r)^{-1}$, as long as $|r| < 1$, can be written as the geometric series $1 + r + r^2 + \dots = \sum_{j=0}^{\infty} r^j$. Therefore
$$X_t = (1 - \rho L)^{-1} \varepsilon_t = \sum_{j=0}^{\infty} \rho^j L^j \varepsilon_t = \sum_{j=0}^{\infty} \rho^j \varepsilon_{t-j},$$
which recovers the MA($\infty$) representation in (2) with $c = 0$.
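The geometric-series inversion can be checked numerically: a truncated sum $\sum_{j < J} \rho^j \varepsilon_{t-j}$ reproduces the value generated by the AR(1) recursion up to an $O(\rho^J)$ truncation error. A sketch (the truncation point `J` is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
rho, T, J = 0.5, 300, 60
eps = rng.normal(size=T + J)

# AR(1) recursion: (1 - rho L) X_t = eps_t, started at X_0 = 0
X = np.zeros(T + J)
for t in range(1, T + J):
    X[t] = rho * X[t - 1] + eps[t]

# Truncated MA(infinity) representation: X_t ~ sum_{j < J} rho^j eps_{t-j}
t = T + J - 1
approx = sum(rho**j * eps[t - j] for j in range(J))
print(X[t], approx)   # agree up to the O(rho^J) truncation error
```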