CH 09
Walter R. Paczkowski
Rutgers University
Chapter 9: Regression with Time Series Data: Stationary Variables
Principles of Econometrics, 4th Edition
Chapter Contents
9.1 Introduction
9.2 Finite Distributed Lags
9.3 Serial Correlation
9.4 Other Tests for Serially Correlated Errors
9.5 Estimation with Serially Correlated Errors
9.6 Autoregressive Distributed Lag Models
9.7 Forecasting
9.8 Multiplier Analysis
9.1.1 Dynamic Nature of Relationships

Ways to model the dynamic relationship:
1. Specify that a dependent variable y is a function of current and past values of an explanatory variable x

Eq. 9.1  y_t = f(x_t, x_{t-1}, x_{t-2}, …)
Ways to model the dynamic relationship (Continued):
2. Capturing the dynamic characteristics of time-series
by specifying a model with a lagged dependent
variable as one of the explanatory variables
Eq. 9.2  y_t = f(y_{t-1}, x_t)
• Or have:
Eq. 9.3  y_t = f(y_{t-1}, x_t, x_{t-1}, x_{t-2})
– Such models are called autoregressive
distributed lag (ARDL) models, with
‘‘autoregressive’’ meaning a regression of yt on
its own lag or lags
Ways to model the dynamic relationship (Continued):
3. Model the continuing impact of change over
several periods via the error term
9.1.2 Least Squares Assumptions

The primary assumption is Assumption MR4: cov(e_i, e_j) = 0 for i ≠ j — the errors are uncorrelated with one another.
9.1.2a Stationarity
9.1.3 Alternative Paths Through the Chapter
Eq. 9.6  y_{T+1} = α + β_0 x_{T+1} + β_1 x_T + β_2 x_{T-1} + … + β_q x_{T-q+1} + e_{T+1}

– Policy analysis
• What is the effect of a change in x on y?

Eq. 9.7  β_s = ∂E(y_t)/∂x_{t-s} = ∂E(y_{t+s})/∂x_t
9.2.1 Assumptions

TSMR1. y_t = α + β_0 x_t + β_1 x_{t-1} + β_2 x_{t-2} + … + β_q x_{t-q} + e_t,  t = q+1, …, T
TSMR2. y and x are stationary random variables, and e_t is independent of current, past and future values of x
TSMR3. E(e_t) = 0
TSMR4. var(e_t) = σ²
TSMR5. cov(e_t, e_s) = 0 for t ≠ s
TSMR6. e_t ~ N(0, σ²)
9.2.2 An Example: Okun's Law

Consider Okun's Law
– In this model the change in the unemployment rate from one period to the next depends on the rate of growth of output in the economy:

Eq. 9.8  U_t − U_{t-1} = −γ(G_t − G_N)
Eq. 9.11  G_t = 100 × (GDP_t − GDP_{t-1}) / GDP_{t-1}
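Eq. 9.11 is straightforward to compute directly. A minimal sketch — the GDP figures below are made-up illustrative values, not the book's data:

```python
# G_t = 100 * (GDP_t - GDP_{t-1}) / GDP_{t-1}  (Eq. 9.11)
def growth_rate(gdp):
    """Percentage growth rates for a GDP level series; returns G_2, ..., G_T."""
    return [100 * (gdp[t] - gdp[t - 1]) / gdp[t - 1] for t in range(1, len(gdp))]

gdp = [1000.0, 1010.0, 1017.0, 1030.0]   # hypothetical quarterly GDP levels
g = growth_rate(gdp)
print([round(x, 3) for x in g])          # -> [1.0, 0.693, 1.278]
```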
9.3.1 Serial Correlation in Output Growth
9.3.1a Computing Autocorrelation

For the Okun's Law problem, we have:

Eq. 9.12  ρ_1 = cov(G_t, G_{t-1}) / [√var(G_t) √var(G_{t-1})] = cov(G_t, G_{t-1}) / var(G_t)
Replacing the covariance and variance with their sample estimates gives:

Eq. 9.13  r_1 = Σ_{t=2}^{T} (G_t − Ḡ)(G_{t-1} − Ḡ) / Σ_{t=1}^{T} (G_t − Ḡ)²

For our problem, we have:
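Eq. 9.13 can be sketched in a few lines; the series below is invented for illustration, not the book's growth data:

```python
# r_k in the pattern of Eq. 9.13: cross-products of deviations from the
# full-sample mean, divided by the full-sample sum of squared deviations.
def sample_autocorr(x, k=1):
    T = len(x)
    xbar = sum(x) / T
    num = sum((x[t] - xbar) * (x[t - k] - xbar) for t in range(k, T))
    den = sum((xi - xbar) ** 2 for xi in x)
    return num / den

G = [0.5, 1.2, 0.8, 1.5, 1.1, 0.9, 1.4, 0.7]   # hypothetical growth rates
r1 = sample_autocorr(G, 1)
```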
9.3.1b The Correlogram
9.3.2 Serially Correlated Errors
9.3.2a A Phillips Curve

The k-th order autocorrelation for the residuals can be written as:

Eq. 9.21  r_k = Σ_{t=k+1}^{T} ê_t ê_{t-k} / Σ_{t=1}^{T} ê_t²

– The least squares equation is:

Eq. 9.22  INF^ = 0.7776 − 0.5279 DU
          se      (0.0658)  (0.2294)
9.4.1 A Lagrange Multiplier Test
To derive the relevant auxiliary regression for the autocorrelation LM test, we write the test equation as:

Eq. 9.25  y_t = β_1 + β_2 x_t + ρê_{t-1} + v_t

Substituting the least squares fit y_t = b_1 + b_2 x_t + ê_t gives:

b_1 + b_2 x_t + ê_t = β_1 + β_2 x_t + ρê_{t-1} + v_t
Rearranging, we get:

Eq. 9.26  ê_t = (β_1 − b_1) + (β_2 − b_2) x_t + ρê_{t-1} + v_t
              = γ_1 + γ_2 x_t + ρê_{t-1} + v_t

– If H_0: ρ = 0 is true, then LM = T × R² has an approximate χ²(1) distribution
• T and R² are the sample size and goodness-of-fit statistic, respectively, from least squares estimation of Eq. 9.26
Considering the two alternative ways to handle ê_0:

(iii) LM = (T − 1) × R² = 89 × 0.3102 = 27.61
(iv)  LM = T × R² = 90 × 0.3066 = 27.59

– These values are much larger than 3.84, which is the 5% critical value from a χ²(1)-distribution
• We reject the null hypothesis of no autocorrelation
– Alternatively, we can reject H_0 by examining the p-value for LM = 27.61, which is 0.000
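The LM test can be sketched with the auxiliary regression of Eq. 9.26 on simulated data — everything below is simulated, not the book's Phillips curve data. With ρ = 0.6 in the data-generating process, the test should reject H_0:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 200
x = rng.normal(size=T)
v = rng.normal(size=T)
e = np.zeros(T)
for t in range(1, T):                      # AR(1) errors with rho = 0.6
    e[t] = 0.6 * e[t - 1] + v[t]
y = 1.0 + 0.5 * x + e

def resid(y, X):
    """OLS residuals."""
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    return y - X @ b

X = np.column_stack([np.ones(T), x])
ehat = resid(y, X)

# Auxiliary regression (Eq. 9.26): ehat_t on (1, x_t, ehat_{t-1}), setting ehat_0 = 0
Z = np.column_stack([np.ones(T), x, np.concatenate([[0.0], ehat[:-1]])])
u = resid(ehat, Z)
R2 = 1 - (u @ u) / ((ehat - ehat.mean()) @ (ehat - ehat.mean()))
LM = T * R2        # approximately chi-square(1) under H0; 5% critical value 3.84
```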
9.4.1a Testing Correlation at Longer Lags

9.4.2 The Durbin-Watson Test
9.5 Estimation with Serially Correlated Errors

9.5.1 Least Squares Estimation
Suppose we proceed with least squares estimation
without recognizing the existence of serially
correlated errors. What are the consequences?
1. The least squares estimator is still a linear
unbiased estimator, but it is no longer best
2. The formulas for the standard errors usually
computed for the least squares estimator are
no longer correct
• Confidence intervals and hypothesis tests
that use these standard errors may be
misleading
Consider the model y_t = β_1 + β_2 x_t + e_t
– The variance of b_2 depends on the weights

w_t = (x_t − x̄) / Σ_t (x_t − x̄)²

and on the variances and covariances of the errors.
– When the errors are not correlated, cov(e_t, e_s) = 0, and the variance of b_2 reduces to the usual least squares formula var(b_2) = σ² Σ_t w_t²
Eq. 9.28  var̂_HAC(b_2) = var̂_HC(b_2) × ĝ

Eq. 9.29  INF^ = 0.7776 − 0.5279 DU
               (0.0658)  (0.2294)  incorrect se
               (0.1030)  (0.3127)  HAC se
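HAC (Newey-West) standard errors can be computed by hand. A sketch on simulated data — the Bartlett-weight formula is standard, but the data-generating process below is an invented example:

```python
import numpy as np

def newey_west(y, X, L):
    """OLS coefficients with HAC (Newey-West) standard errors,
    using Bartlett weights w_l = 1 - l/(L+1) up to lag L."""
    T, k = X.shape
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    e = y - X @ b
    Xe = X * e[:, None]                  # rows are x_t * e_t
    S = Xe.T @ Xe                        # lag-0 (White) term
    for l in range(1, L + 1):
        w = 1 - l / (L + 1)
        G = Xe[l:].T @ Xe[:-l]
        S += w * (G + G.T)
    XtX_inv = np.linalg.inv(X.T @ X)
    V = XtX_inv @ S @ XtX_inv
    return b, np.sqrt(np.diag(V))

rng = np.random.default_rng(1)
T = 300
x = rng.normal(size=T)
v = rng.normal(size=T)
e = np.zeros(T)
for t in range(1, T):                    # AR(1) errors, so HAC is appropriate
    e[t] = 0.5 * e[t - 1] + v[t]
y = 0.8 - 0.5 * x + e
b, se_hac = newey_west(y, np.column_stack([np.ones(T), x]), L=4)
```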
9.5.2 Estimating an AR(1) Error Model

Eq. 9.30, e_t = ρe_{t-1} + v_t, describes a first-order autoregressive model or a first-order autoregressive process for e_t
– The term AR(1) model is used as an abbreviation for first-order autoregressive model
– It is called an autoregressive model because it can be viewed as a regression model
– It is called first-order because the right-hand-side variable is e_t lagged one period
9.5.2a Properties of an AR(1) Error

We assume that:

Eq. 9.32  −1 < ρ < 1

The mean and variance of e_t are:

Eq. 9.33  E(e_t) = 0,  var(e_t) = σ_e² = σ_v² / (1 − ρ²)

The covariance term is:

Eq. 9.34  cov(e_t, e_{t-k}) = ρ^k σ_v² / (1 − ρ²),  k > 0
The correlation implied by the covariance is:

Eq. 9.35  ρ_k = corr(e_t, e_{t-k})
             = cov(e_t, e_{t-k}) / [√var(e_t) √var(e_{t-k})]
             = cov(e_t, e_{t-k}) / var(e_t)
             = [ρ^k σ_v² / (1 − ρ²)] / [σ_v² / (1 − ρ²)]
             = ρ^k
Setting k = 1 gives ρ_1 = ρ: the first autocorrelation of the errors equals ρ itself.

Each e_t depends on all past values of the errors v_t.

With ρ̂ = r_1 = 0.549, the remaining autocorrelations follow from ρ_k = ρ^k:

ρ̂_2 = ρ̂² = 0.549² = 0.301
ρ̂_3 = ρ̂³ = 0.549³ = 0.165
ρ̂_4 = ρ̂⁴ = 0.549⁴ = 0.091
ρ̂_5 = ρ̂⁵ = 0.549⁵ = 0.050
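The geometric decay ρ_k = ρ^k can be checked by simulating a long AR(1) series; the error variance and seed below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(42)
rho, T = 0.549, 200_000
v = rng.normal(size=T)
e = np.zeros(T)
for t in range(1, T):                     # e_t = rho * e_{t-1} + v_t
    e[t] = rho * e[t - 1] + v[t]

def sample_autocorr(x, k):
    xd = x - x.mean()
    return (xd[k:] @ xd[:-k]) / (xd @ xd)

for k in range(1, 6):
    # sample autocorrelation should be close to rho**k (0.301, 0.165, ... for k >= 2)
    print(k, round(rho ** k, 3), round(sample_autocorr(e, k), 3))
```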
9.5.2b Nonlinear Least Squares Estimation

With the appropriate substitutions, we get:

Eq. 9.41  e_{t-1} = y_{t-1} − β_1 − β_2 x_{t-1}

– Multiplying by ρ:

ρe_{t-1} = ρy_{t-1} − ρβ_1 − ρβ_2 x_{t-1}

Substituting, we get:

Eq. 9.43  y_t = β_1(1 − ρ) + β_2 x_t + ρy_{t-1} − ρβ_2 x_{t-1} + v_t
Our Phillips Curve model assuming AR(1) errors is:

Eq. 9.44  INF_t = β_1(1 − ρ) + β_2 DU_t + ρINF_{t-1} − ρβ_2 DU_{t-1} + v_t

Applying nonlinear least squares gives:

Eq. 9.45  INF^ = 0.7609 − 0.6944 DU,  e_t = 0.557 e_{t-1} + v_t
          se     (0.1245)  (0.2479)         (0.090)
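Eq. 9.43 is nonlinear in the parameters, but for a fixed ρ it is linear, so the nonlinear least squares solution can be sketched as a grid search over ρ (a Hildreth-Lu-style approach that minimizes the same sum of squares). The data and parameter values below are simulated, loosely echoing Eq. 9.45:

```python
import numpy as np

rng = np.random.default_rng(7)
T, beta1, beta2, rho_true = 300, 0.76, -0.69, 0.55
x = rng.normal(size=T)
v = rng.normal(scale=0.5, size=T)
e = np.zeros(T)
for t in range(1, T):
    e[t] = rho_true * e[t - 1] + v[t]
y = beta1 + beta2 * x + e

# For each rho, Eq. 9.43 is OLS of (y_t - rho*y_{t-1}) on the columns
# (1 - rho) and (x_t - rho*x_{t-1}); keep the rho with the smallest SSE.
best = None
for rho in np.arange(-0.99, 1.00, 0.01):
    ys = y[1:] - rho * y[:-1]
    xs = x[1:] - rho * x[:-1]
    Z = np.column_stack([np.full(T - 1, 1 - rho), xs])
    b = np.linalg.lstsq(Z, ys, rcond=None)[0]
    sse = np.sum((ys - Z @ b) ** 2)
    if best is None or sse < best[0]:
        best = (sse, rho, b)
_, rho_hat, (b1_hat, b2_hat) = best
```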
9.5.2c Generalized Least Squares Estimation
9.5.3 Estimating a More General Model

We have the model:

Eq. 9.47  y_t = δ + θ_1 y_{t-1} + δ_0 x_t + δ_1 x_{t-1} + v_t
9.5.4 Summary of Section 9.5 and Looking Ahead

We have described three ways of overcoming the effect of serially correlated errors:
1. Estimate the model using least squares with HAC standard errors
2. Use nonlinear least squares to estimate the model with a lagged x, a lagged y, and the restriction implied by an AR(1) error specification
3. Use least squares to estimate the model with a lagged x and a lagged y, but without the restriction implied by an AR(1) error specification
9.6 Autoregressive Distributed Lag Models

Eq. 9.52  y_t = δ + δ_0 x_t + δ_1 x_{t-1} + … + δ_q x_{t-q} + θ_1 y_{t-1} + … + θ_p y_{t-p} + v_t

– For example, an ARDL(1,0) version of the Phillips curve:

Eq. 9.53  INF^_t = 0.3548 + 0.5282 INF_{t-1} − 0.4909 DU_t

– An ARDL model can also be written in infinite distributed lag form:

y_t = α + Σ_{s=0}^{∞} β_s x_{t-s} + e_t
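Once the lagged variables are built, an ARDL model is estimable by ordinary least squares. A sketch on simulated data — the parameter values are invented, loosely echoing Eq. 9.53:

```python
import numpy as np

rng = np.random.default_rng(3)
T = 400
x = rng.normal(size=T)
v = rng.normal(scale=0.3, size=T)
y = np.zeros(T)
for t in range(1, T):     # ARDL(1,0): y_t = 0.35 + 0.53*y_{t-1} - 0.49*x_t + v_t
    y[t] = 0.35 + 0.53 * y[t - 1] - 0.49 * x[t] + v[t]

# OLS of y_t on (1, y_{t-1}, x_t), dropping the first observation
Z = np.column_stack([np.ones(T - 1), y[:-1], x[1:]])
delta_hat, theta1_hat, delta0_hat = np.linalg.lstsq(Z, y[1:], rcond=None)[0]
```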
9.6.1 The Phillips Curve

Eq. 9.57  INF^_t = … + 0.2819 INF_{t-4} − 0.7902 DU_t
          se  (0.0983) (0.1016) (0.1038) (0.1050) (0.1014) (0.1885),  obs = 87
9.6.2 Okun's Law
9.6.3 Autoregressive Models

Eq. 9.60  y_t = δ + θ_1 y_{t-1} + θ_2 y_{t-2} + … + θ_p y_{t-p} + v_t
9.7.1 Forecasting with an AR Model

G_{T+1} = δ + θ_1 G_T + θ_2 G_{T-1} + v_{T+1}

For two quarters ahead, the forecast for G_{2010Q1} is:
The 95% forecast interval is:

Ĝ_{T+j} ± t_{(0.975, df)} σ̂_j

where σ̂_j is the standard error of the forecast error and df is the number of degrees of freedom in the estimation of the AR model
The one-step forecast error is:

u_1 = G_{T+1} − Ĝ_{T+1} = (δ − δ̂) + (θ_1 − θ̂_1) G_T + (θ_2 − θ̂_2) G_{T-1} + v_{T+1}

Ignoring the error from estimating the coefficients, we get:

Eq. 9.66  u_1 = v_{T+1}

Eq. 9.67  u_2 = θ_1 (G_{T+1} − Ĝ_{T+1}) + v_{T+2} = θ_1 u_1 + v_{T+2} = θ_1 v_{T+1} + v_{T+2}

Eq. 9.68  u_3 = θ_1 u_2 + θ_2 u_1 + v_{T+3} = (θ_1² + θ_2) v_{T+1} + θ_1 v_{T+2} + v_{T+3}
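The pattern in Eqs. 9.66-9.68 generalizes: u_j is a moving average of the future v's with coefficients ψ_0 = 1, ψ_1 = θ_1, and ψ_j = θ_1 ψ_{j-1} + θ_2 ψ_{j-2} thereafter. A sketch with invented parameter values:

```python
# Forecast-error standard deviations for an AR(2) model, following Eqs. 9.66-9.68.
def forecast_se(theta1, theta2, sigma_v, horizon):
    psi = [1.0, theta1]                       # psi_0, psi_1
    for j in range(2, horizon):
        psi.append(theta1 * psi[j - 1] + theta2 * psi[j - 2])
    ses, var = [], 0.0
    for j in range(horizon):
        var += psi[j] ** 2                    # var(u_{j+1}) = sigma_v^2 * sum of psi^2
        ses.append(sigma_v * var ** 0.5)
    return ses

ses = forecast_se(theta1=0.3, theta2=0.2, sigma_v=0.5, horizon=3)
# psi_2 = theta1**2 + theta2, matching the (theta1^2 + theta2) term in Eq. 9.68
```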
9.7.2 Forecasting with an ARDL Model
U_{T+1} − U_T = δ + θ_1 (U_T − U_{T-1}) + δ_0 G_{T+1} + δ_1 G_T + v_{T+1}

– Rearranging:

U_{T+1} = U_T + δ + θ_1 (U_T − U_{T-1}) + δ_0 G_{T+1} + δ_1 G_T + v_{T+1}
9.7.3 Exponential Smoothing

One simple forecasting rule is the average of the last three observations:

ŷ_{T+1} = (y_T + y_{T-1} + y_{T-2}) / 3

– This forecasting rule is an example of a simple (equally-weighted) moving average model with k = 3
Exponential smoothing instead weights recent observations more heavily:

Eq. 9.72  ŷ_{T+1} = αy_T + α(1 − α) y_{T-1} + α(1 − α)² y_{T-2} + …
All terms after the first equal (1 − α) times the previous-period forecast:

Eq. 9.73  (1 − α) ŷ_T = α(1 − α) y_{T-1} + α(1 − α)² y_{T-2} + α(1 − α)³ y_{T-3} + …

so that ŷ_{T+1} = αy_T + (1 − α) ŷ_T
The value of α can reflect one's judgment about the relative weight of current information
– It can be estimated from historical information by obtaining within-sample forecasts and choosing the α that minimizes their sum of squared errors
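The recursion ŷ_{t+1} = αy_t + (1 − α)ŷ_t and a grid search for α can be sketched as follows; the series and the starting value ŷ_1 = y_1 are arbitrary illustrative choices:

```python
def smooth(y, alpha):
    """One-step-ahead forecasts; yhat[t] forecasts y[t], with yhat[0] = y[0]."""
    yhat = [y[0]]
    for t in range(1, len(y)):
        yhat.append(alpha * y[t - 1] + (1 - alpha) * yhat[t - 1])
    return yhat

def best_alpha(y):
    """Pick alpha minimizing the within-sample sum of squared forecast errors."""
    def sse(a):
        yhat = smooth(y, a)
        return sum((yt - yh) ** 2 for yt, yh in zip(y[1:], yhat[1:]))
    return min((i / 100 for i in range(1, 100)), key=sse)

y = [2.0, 2.5, 2.2, 2.8, 2.6, 3.0, 2.9, 3.2]   # hypothetical series
a = best_alpha(y)
forecast = a * y[-1] + (1 - a) * smooth(y, a)[-1]   # yhat_{T+1}
```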
9.8 Multiplier Analysis

Eq. 9.77  y_t = δ + θ_1 y_{t-1} + … + θ_p y_{t-p} + δ_0 x_t + δ_1 x_{t-1} + … + δ_q x_{t-q} + v_t

Eq. 9.78  y_t = α + β_0 x_t + β_1 x_{t-1} + β_2 x_{t-2} + β_3 x_{t-3} + … + e_t

Σ_{j=0}^{s} β_j = s-period interim multiplier
Σ_{j=0}^{∞} β_j = total multiplier
Writing Eq. 9.77 with the lag operator L (where L^k x_t = x_{t-k}):

Eq. 9.79  y_t − θ_1 L y_t − θ_2 L² y_t − … − θ_p L^p y_t = δ + δ_0 x_t + δ_1 L x_t + δ_2 L² x_t + … + δ_q L^q x_t + v_t

Rearranging terms:

Eq. 9.80  (1 − θ_1 L − θ_2 L² − … − θ_p L^p) y_t = δ + (δ_0 + δ_1 L + δ_2 L² + … + δ_q L^q) x_t + v_t

For an ARDL(1,1) model, matching this with the infinite distributed lag form gives:

Eq. 9.85  α = (1 − θ_1 L)^{-1} δ,   e_t = (1 − θ_1 L)^{-1} v_t

Eq. 9.87  δ_0 + δ_1 L = (1 − θ_1 L)(β_0 + β_1 L + β_2 L² + β_3 L³ + …)

Multiplying out the right-hand side:

Eq. 9.88  (1 − θ_1 L)(β_0 + β_1 L + β_2 L² + β_3 L³ + …)
            = β_0 + β_1 L + β_2 L² + β_3 L³ + … − β_0 θ_1 L − β_1 θ_1 L² − β_2 θ_1 L³ − …
            = β_0 + (β_1 − β_0 θ_1) L + (β_2 − β_1 θ_1) L² + (β_3 − β_2 θ_1) L³ + …

Equating coefficients of like powers of L:

δ_0 = β_0
δ_1 = β_1 − β_0 θ_1
0 = β_2 − β_1 θ_1
0 = β_3 − β_2 θ_1
and so on
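Equating coefficients gives a simple recursion for the multipliers of an ARDL(1,1) model: β_0 = δ_0, β_1 = δ_1 + θ_1 β_0, and β_j = θ_1 β_{j-1} for j ≥ 2. A sketch with invented parameter values:

```python
def ardl11_multipliers(delta0, delta1, theta1, n):
    """First n lag multipliers beta_0, ..., beta_{n-1} implied by Eq. 9.88."""
    betas = [delta0, delta1 + theta1 * delta0]
    for _ in range(2, n):
        betas.append(theta1 * betas[-1])   # geometric decay after lag 1
    return betas[:n]

b = ardl11_multipliers(delta0=0.5, delta1=0.2, theta1=0.4, n=6)
# total multiplier: sum of all beta_j = (delta0 + delta1) / (1 - theta1)
total = (0.5 + 0.2) / (1 - 0.4)
```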
Eq. 9.91  δ_0 + δ_1 L + δ_2 L² + … + δ_q L^q = (1 − θ_1 L − θ_2 L² − … − θ_p L^p)(β_0 + β_1 L + β_2 L² + β_3 L³ + …)

– Given the values p and q for your ARDL model, you need to multiply out the above expression, and then equate coefficients of like powers in the lag operator
– The resulting β_j can then be summed to give the interim multipliers Σ_{j=0}^{s} β_j and the total multiplier Σ_{j=0}^{∞} β_j
9A The Durbin-Watson Test

Eq. 9A.1  d = Σ_{t=2}^{T} (ê_t − ê_{t-1})² / Σ_{t=1}^{T} ê_t²

Expanding the numerator:

Eq. 9A.2  d = [Σ_{t=2}^{T} ê_t² + Σ_{t=2}^{T} ê_{t-1}² − 2 Σ_{t=2}^{T} ê_t ê_{t-1}] / Σ_{t=1}^{T} ê_t²
            ≈ 1 + 1 − 2r_1
Eq. 9A.3  d ≈ 2(1 − r_1)
– If the estimated value of ρ is r1 = 0, then the
Durbin-Watson statistic d ≈ 2
• This is taken as an indication that the model
errors are not autocorrelated
– If the estimate of ρ happened to be r_1 = 1, then d ≈ 0
• A low value for the Durbin-Watson statistic
implies that the model errors are correlated, and
ρ>0
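The statistic and its approximation (Eq. 9A.3) can be checked directly; the residuals below are invented for illustration:

```python
def durbin_watson(e):
    """d from Eq. 9A.1."""
    num = sum((e[t] - e[t - 1]) ** 2 for t in range(1, len(e)))
    return num / sum(et ** 2 for et in e)

def r1(e):
    """First-order residual autocorrelation, as in Eq. 9A.2."""
    return sum(e[t] * e[t - 1] for t in range(1, len(e))) / sum(et ** 2 for et in e)

resid = [0.3, -0.1, 0.4, -0.2, 0.1, -0.3, 0.2, -0.4, 0.1, 0.1]  # alternating signs: rho < 0
d = durbin_watson(resid)
approx = 2 * (1 - r1(resid))   # Eq. 9A.3; exact up to the end-point terms in Eq. 9A.2
```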
FIGURE 9A.1 Testing for positive autocorrelation
9A.1 The Durbin-Watson Bounds Test
Note that:

Eq. 9B.1  e_t = ρe_{t-1} + v_t
            = ρ(ρe_{t-2} + v_{t-1}) + v_t
            = ρ²e_{t-2} + ρv_{t-1} + v_t

Further substitution shows that:

Eq. 9B.2  e_t = ρ²(ρe_{t-3} + v_{t-2}) + ρv_{t-1} + v_t
            = ρ³e_{t-3} + ρ²v_{t-2} + ρv_{t-1} + v_t

Repeating the substitution indefinitely gives e_t = v_t + ρv_{t-1} + ρ²v_{t-2} + …, so that var(e_t) = σ_v²(1 + ρ² + ρ⁴ + …) = σ_v² / (1 − ρ²)
9B Properties of the AR(1) Error
The covariance between errors one period apart is:

cov(e_t, e_{t-1}) = E(e_t e_{t-1})
  = E[(v_t + ρv_{t-1} + ρ²v_{t-2} + ρ³v_{t-3} + …)(v_{t-1} + ρv_{t-2} + ρ²v_{t-3} + …)]
  = ρE(v_{t-1}²) + ρ³E(v_{t-2}²) + ρ⁵E(v_{t-3}²) + …
  = ρ(1 + ρ² + ρ⁴ + …) σ_v²
  = ρσ_v² / (1 − ρ²)
Generalizing:

cov(e_t, e_{t-k}) = ρ^k σ_v² / (1 − ρ²),  k > 0
9C Generalized Least Squares Estimation

Starting from the model with AR(1) errors:

y_t = β_1 + β_2 x_t + e_t,  e_t = ρe_{t-1} + v_t

Eq. 9C.4  y_t = β_1 + β_2 x_t + ρ(y_{t-1} − β_1 − β_2 x_{t-1}) + v_t

For the first observation, multiply through by √(1 − ρ²):

√(1 − ρ²) y_1 = β_1 √(1 − ρ²) + β_2 √(1 − ρ²) x_1 + √(1 − ρ²) e_1

Or:

y*_1 = β_1 x*_{11} + β_2 x*_{12} + e*_1

where

Eq. 9C.6  y*_1 = √(1 − ρ²) y_1,  x*_{11} = √(1 − ρ²),  x*_{12} = √(1 − ρ²) x_1,  e*_1 = √(1 − ρ²) e_1