FINANCIAL ECONOMETRICS
MODULE 4 LESSON 3
AUTOREGRESSIVE MODEL
Reading Time: 90 minutes
Prior Knowledge: Basic time series knowledge, white noise, random walk
Keywords: Autoregressive AR(p) process or model, autoregressive operator, mean, variance, autocovariance and autocorrelations for AR(1), invertible MA, method of moments, first order moment, second order moment, Yule-Walker estimation, maximum likelihood estimation (MLE)
In the last lesson, we discussed white noise, random walk, and moving average (MA) time
series models. In this lesson, we will continue to explore more time series models. Specifically,
we will talk about the autoregressive model (AR model). We will go over the definition and
properties of the AR model. We will also compare the autoregressive model with the moving
average model. We will end the lesson with one application.
Suppose we regress a time series $X_t$ on its own first lag:
$$X_t = C + b_1 X_{t-1} + \varepsilon_t$$
where $C$ is the intercept, $\varepsilon_t$ is normal white noise, and $|b_1| < 1$.
We can then apply OLS to estimate the coefficients and the residuals and to conduct model diagnostics. The above regression model for the time series $X_t$ is called an autoregressive (AR) model.
More generally, an AR($p$) model is defined as
$$X_t = C + \alpha_1 X_{t-1} + \alpha_2 X_{t-2} + \cdots + \alpha_p X_{t-p} + W_t$$
where $C$ is a constant and $W_t$ is normal white noise with mean $0$ and variance $\sigma^2$. Using the backshift operator $B$ (and setting $C = 0$ for simplicity), the model can be written with the autoregressive operator $\alpha(B)$ as
$$\alpha(B) X_t = (1 - \alpha_1 B - \alpha_2 B^2 - \cdots - \alpha_p B^p) X_t = W_t$$
There are restrictions on the values of the coefficients $\alpha$ to make sure $X_t$ stays stationary.
Now let's look at AR(1)'s mean, variance, autocovariance, ACF, and PACF.
The mean function is:
$$\mu = \frac{C}{1 - \alpha_1}$$
The autocovariance function is:
$$\gamma(h) = \frac{\sigma^2 \alpha^{|h|}}{1 - \alpha^2}, \quad \text{where } |h| = |s - t|$$
Setting $h = 0$ gives the variance, $\gamma(0) = \sigma^2 / (1 - \alpha^2)$. The ACF is:
$$\rho(h) = \frac{\gamma(h)}{\gamma(0)} = \alpha^{|h|}$$
The PACF is:
$$\phi_{hh} = \begin{cases} \alpha^{|h|} & \text{for } |h| \le 1 \\ 0 & \text{for } |h| > 1 \end{cases}$$
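As a quick numerical illustration (the parameter values here are hypothetical choices, not taken from the lesson's figures): for an AR(1) with $C = 1$, $\alpha = 0.5$, and $\sigma^2 = 1$,
$$\mu = \frac{1}{1 - 0.5} = 2, \qquad \gamma(0) = \frac{1}{1 - 0.25} \approx 1.33, \qquad \rho(1) = 0.5, \qquad \phi_{11} = 0.5, \quad \phi_{22} = 0.$$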
2.3 AR(1) and AR(2) Examples
Figure 1: AR(1) with α =0.4 and AR(1) with α =− 0.4
Figure 1 gives two AR(1) graph examples. Both move around the mean of 0 and stay within a horizontal band. The AR(1) with negative $\alpha$ is choppier than the AR(1) with positive $\alpha$. We know the correlation of two adjacent observations is the autocorrelation $\rho(1)$. For an AR(1) model, $\rho(1) = \alpha$. Hence, when $\alpha$ is positive, adjacent observations tend to move in the same direction, while when $\alpha$ is negative, adjacent observations tend to move in opposite directions. That is why the plot on the left looks smoother than the plot on the right.
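Plots like figure 1 can be reproduced by simulation. Below is a minimal sketch in Python using statsmodels' `ArmaProcess`; the seed and sample size are our own illustrative choices, not taken from the lesson:

```python
import numpy as np
import matplotlib.pyplot as plt
from statsmodels.tsa.arima_process import ArmaProcess

np.random.seed(42)
fig, axes = plt.subplots(1, 2, figsize=(10, 3))
for ax, alpha in zip(axes, (0.4, -0.4)):
    # AR(1): (1 - alpha*B) X_t = W_t, so the AR polynomial is [1, -alpha]
    x = ArmaProcess(ar=np.array([1, -alpha]), ma=np.array([1])).generate_sample(nsample=200)
    ax.plot(x)
    ax.set_title(f"AR(1) with alpha = {alpha}")
plt.tight_layout()
plt.show()
```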
Figure 2: ACF and PACF of AR(1)
Figure 2 shows the ACF and PACF plots for an AR(1) with $\alpha = 0.4$. In both plots, the ACF and PACF at lag 1 are significant, with values between 0.5 and 0.6. Since the example is simulated data from an AR(1) with $\alpha = 0.4$, the data are a random sample, so the sample ACF and PACF will usually not equal 0.4 exactly.
Now, let's simulate an AR(2) process and look at its ACF and PACF plots. Figure 3 below shows the graphs for the AR(2).
Figure 3: ACF and PACF of AR(2)
Let's focus on the ACF and PACF plots for the AR(2) in figure 3. In the ACF plot, the ACF decreases gradually. The PACF, on the other hand, drops to almost 0 after lag 2. This interesting PACF behavior for an AR(2) process will help us decide which time series model to use. We will discuss model selection in more detail later in this module.
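Here is a minimal sketch of how such a simulation and its ACF/PACF plots could be produced in Python; the AR(2) coefficients (0.5 and 0.3) are hypothetical choices for illustration, not necessarily those behind figure 3:

```python
import numpy as np
import matplotlib.pyplot as plt
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

np.random.seed(0)
# AR(2): (1 - 0.5B - 0.3B^2) X_t = W_t
x = ArmaProcess(ar=np.array([1, -0.5, -0.3]), ma=np.array([1])).generate_sample(nsample=500)

fig, axes = plt.subplots(1, 2, figsize=(10, 3))
plot_acf(x, lags=20, ax=axes[0])   # ACF tails off gradually
plot_pacf(x, lags=20, ax=axes[1])  # PACF cuts off after lag 2
plt.tight_layout()
plt.show()
```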
2.4 Invertible MA
Consider an MA(1) model:
$$X_t = W_t + \theta W_{t-1}$$
where $W_t$ is normal independent white noise with mean $0$ and variance $1$. If and only if $|\theta| < 1$, we can write $X_t$ in an infinite AR form as follows:
$$X_t = \theta X_{t-1} - \theta^2 X_{t-2} + \theta^3 X_{t-3} - \cdots + W_t$$
To see why, write the MA(1) model with the backshift operator as $X_t = (1 + \theta B) W_t$, so that, by rearranging the terms,
$$W_t = \frac{1}{1 + \theta B} X_t$$
We also know the formula for an infinite geometric series:
$$S_\infty = a + ar + ar^2 + ar^3 + \cdots = \frac{a}{1 - r}, \quad \text{when } |r| < 1$$
If $|\theta| < 1$, we can rewrite $\frac{1}{1 + \theta B} X_t$ by applying the geometric series with $a = X_t$ and $r = -\theta B$:
$$\frac{X_t}{1 + \theta B} = X_t + X_t(-\theta B) + X_t(-\theta B)^2 + X_t(-\theta B)^3 + \cdots$$
$$= X_t - \theta B X_t + \theta^2 B^2 X_t - \theta^3 B^3 X_t + \cdots$$
We can then plug this back into $W_t = \frac{1}{1 + \theta B} X_t$ and rearrange the terms to get:
$$X_t = \theta B X_t - \theta^2 B^2 X_t + \theta^3 B^3 X_t - \cdots + W_t$$
Then, applying the backshift operator to $X_t$ (so that $B^k X_t = X_{t-k}$), we get:
$$X_t = \theta X_{t-1} - \theta^2 X_{t-2} + \theta^3 X_{t-3} - \cdots + W_t$$
Now we know that $X_t$ from an MA(1) model can be written as an infinite AR model. The key requirement for an MA(1) to be representable as an infinite AR process is that its coefficient satisfies $|\theta| < 1$. We call an MA(1) process invertible if the absolute value of its coefficient is less than 1 and it can therefore be represented by an infinite AR process. Another benefit of invertibility is that the invertible MA representation is unique. Let's look at an example to demonstrate this concept.
MA A: $X_t = W_t + 5 W_{t-1}$ with $W_t \sim$ normal white noise $(0, 1)$
MA B: $X_t = W_t + \frac{1}{5} W_{t-1}$ with $W_t \sim$ normal white noise $(0, 25)$
Both MA A and MA B have the same autocovariance values, shown as follows:
$$\gamma(h) = \begin{cases} 26 & \text{for } h = 0 \\ 5 & \text{for } h = 1 \\ 0 & \text{for } h > 1 \end{cases}$$
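As a quick check, for an MA(1) process $X_t = W_t + \theta W_{t-1}$ with innovation variance $\sigma^2$, the standard formulas give $\gamma(0) = \sigma^2(1 + \theta^2)$ and $\gamma(1) = \sigma^2 \theta$. For MA A: $\gamma(0) = 1 \cdot (1 + 5^2) = 26$ and $\gamma(1) = 1 \cdot 5 = 5$. For MA B: $\gamma(0) = 25 \cdot \left(1 + \frac{1}{25}\right) = 26$ and $\gamma(1) = 25 \cdot \frac{1}{5} = 5$.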
In the real world, we only observe the autocovariance values of a time series, not the actual model, so we cannot tell whether MA A or MA B is the correct one. Both are model candidates for the underlying time series data. So which model should we choose? If we require the MA model to be invertible, then we limit our choice to MA B, whose coefficient has absolute value less than 1. We then call the MA B model unique.
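This equivalence can also be checked by simulation; the sketch below (our own illustration, with an arbitrary seed and sample size) estimates the lag-0 and lag-1 sample autocovariances of both processes:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

def sample_autocov(x, h):
    """Sample autocovariance of x at lag h."""
    x = x - x.mean()
    return np.mean(x[h:] * x[: len(x) - h])

w_a = rng.normal(0.0, 1.0, n + 1)      # white noise with variance 1
x_a = w_a[1:] + 5.0 * w_a[:-1]         # MA A: X_t = W_t + 5 W_{t-1}

w_b = rng.normal(0.0, 5.0, n + 1)      # white noise with variance 25 (std 5)
x_b = w_b[1:] + 0.2 * w_b[:-1]         # MA B: X_t = W_t + (1/5) W_{t-1}

for name, x in (("MA A", x_a), ("MA B", x_b)):
    print(name, round(sample_autocov(x, 0), 2), round(sample_autocov(x, 1), 2))
# both lines print values close to 26 and 5
```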
2.5 ACF and PACF Plots for AR and MA Processes
We briefly talked earlier about the special features that ACF and PACF plots exhibit for the AR process. In this section, we compare the ACF and PACF plots for the AR process and the MA process side by side, using an AR(2) process and an MA(2) process as examples.
Figure 4: ACF and PACF Plots for AR(2) and MA(2)
In figure 4 above, the top two plots are the ACF and PACF plots for an AR(2) process, and the bottom two are for an MA(2) process. In an ACF plot, the leftmost bar shows an autocorrelation of 1 at $h = 0$, which is the correlation of each observation with itself. A PACF plot starts at $h = 1$, so its leftmost bar is the PACF at lag 1.
For the AR(2) process, the PACF plot drops to almost 0 after lag 2. For the MA(2) process, the ACF plot drops to almost 0 after lag 2. This is a particularly important phenomenon: we can use it to decide the model specification for a time series. If the PACF plot of a time series shows a sudden drop to near 0 after lag $p$, we can assume the time series follows an AR($p$) process. If the ACF plot shows a sudden drop to near 0 after lag $q$, we can assume the time series follows an MA($q$) process. With this information, we can use ACF and PACF plots to decide which process, and with what lag, to use to model a time series. Figure 5 summarizes what we just discussed about ACF and PACF plots.
Figure 5: ACF and PACF Plot Features for AR(p) and MA(q)
3. Estimation Methods
The first estimation method for an AR model is the method of moments, known for AR models as Yule-Walker estimation: we equate the sample moments (the sample mean as the first order moment and the sample autocovariances as second order moments) to their theoretical counterparts. Recall the AR(1) autocovariance function:
$$\gamma(h) = \frac{\sigma^2 \alpha^{|h|}}{1 - \alpha^2}$$
where $h = s - t$.
We can use the above sample information and formulas to solve for the AR(1) parameter $\alpha$:
$$\hat{\alpha} = \frac{\hat{\gamma}(1)}{\hat{\gamma}(0)}$$
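Below is a minimal sketch of Yule-Walker estimation on simulated AR(1) data (the true $\alpha = 0.4$, seed, and sample size are our own illustrative choices); it computes $\hat{\alpha}$ by hand and compares it with statsmodels' built-in `yule_walker` routine:

```python
import numpy as np
from statsmodels.regression.linear_model import yule_walker
from statsmodels.tsa.arima_process import ArmaProcess

np.random.seed(7)
x = ArmaProcess(ar=np.array([1, -0.4]), ma=np.array([1])).generate_sample(nsample=2000)

# By hand: alpha_hat = gamma_hat(1) / gamma_hat(0)
xc = x - x.mean()
gamma0 = np.mean(xc * xc)
gamma1 = np.mean(xc[1:] * xc[:-1])
print("by hand:", gamma1 / gamma0)       # close to 0.4

# statsmodels' Yule-Walker routine for comparison
rho, sigma = yule_walker(x, order=1)
print("yule_walker:", rho[0], sigma)
```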
The second estimation method is maximum likelihood estimation (MLE). Assume the data follow an AR(1) model:
$$X_t = \alpha X_{t-1} + W_t$$
where $W_t$ is normal white noise with mean $0$ and variance $\sigma^2$, and $|\alpha| < 1$.
With the above model, we can write our likelihood function as follows:
$$L(\alpha, \sigma^2) = f(x_1, x_2, x_3, \cdots, x_n \mid \alpha, \sigma^2)$$
We then maximize $L(\alpha, \sigma^2)$ with respect to $\alpha$ and $\sigma^2$ to solve for $\alpha$. Standard statistical packages can solve this optimization for us.
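As an illustration, an AR(1) can be fit by maximum likelihood in Python with statsmodels (a sketch on simulated data; the true $\alpha = 0.4$ and the seed are again our own choices):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.arima_process import ArmaProcess

np.random.seed(7)
x = ArmaProcess(ar=np.array([1, -0.4]), ma=np.array([1])).generate_sample(nsample=2000)

# ARIMA(1, 0, 0) is an AR(1); fit() maximizes the likelihood numerically
res = ARIMA(x, order=(1, 0, 0)).fit()
print(res.params)   # constant, AR(1) coefficient (near 0.4), innovation variance
```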
4. Conclusion
In this lesson, we went through the autoregressive model. We first gave a brief description of what an AR model is. We then provided the formal definition of an AR model, gave some examples of AR processes, and presented the moments of an AR model. We then discussed how to use ACF and PACF plots to identify an MA model or an AR model. We also explained why and when we can use an infinite AR process to represent an MA model. Then, we talked about two estimation methods for an AR model: the Yule-Walker method and the maximum likelihood method. We ended the lesson with a simulated application to show how to fit an AR model and conduct post-model diagnostics. In the next lesson, we will introduce one comprehensive model and the steps to run it.
Copyright 2024 WorldQuant University. This content is licensed solely for personal use.
Redistribution or publication of this material is strictly prohibited.