In a multiple regression model, we forecast the variable of interest using a linear combination of predictors. In an autoregression model, we forecast the variable of interest using a linear combination of past values of the variable. The term autoregression indicates that it is a regression of the variable against itself.
Thus, an autoregressive model of order $p$ can be written as
$$y_t = c + \phi_1 y_{t-1} + \phi_2 y_{t-2} + \dots + \phi_p y_{t-p} + \varepsilon_t,$$
where $\varepsilon_t$ is white noise. This is like a multiple regression but with lagged values of $y_t$ as predictors. We refer to this as an AR($p$) model, an autoregressive model of order $p$.
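As a concrete illustration, the AR($p$) recursion above can be simulated directly. This is a minimal sketch in plain Python (the function name `simulate_ar` and the starting-value scheme are choices made here for illustration, not part of the text):

```python
import random

def simulate_ar(phis, c=0.0, sigma=1.0, n=200, seed=1):
    """Simulate y_t = c + phi_1*y_{t-1} + ... + phi_p*y_{t-p} + e_t,
    where e_t is Gaussian white noise with standard deviation sigma."""
    rng = random.Random(seed)
    p = len(phis)
    y = [0.0] * p  # arbitrary starting values for the recursion
    for _ in range(n):
        e = rng.gauss(0, sigma)  # the white-noise error term
        y.append(c + sum(phi * y[-1 - i] for i, phi in enumerate(phis)) + e)
    return y[p:]  # discard the artificial starting values

# An AR(2) series with phi_1 = 0.8, phi_2 = -0.3:
series = simulate_ar([0.8, -0.3], c=1.0, n=300)
```

Rescaling `sigma` stretches or shrinks the simulated series but leaves its qualitative pattern unchanged, which is the point made below about the variance of the error term.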
Autoregressive models are remarkably flexible at handling a wide range of different time series patterns. The two series in Figure 8.5 are from an AR(1) model and an AR(2) model. Changing the parameters $\phi_1, \dots, \phi_p$ results in different time series patterns. The variance of the error term $\varepsilon_t$ will only change the scale of the series, not the patterns.
For an AR(1) model:
- When $\phi_1 = 0$, $y_t$ is equivalent to white noise.
- When $\phi_1 = 1$ and $c = 0$, $y_t$ is equivalent to a random walk.
- When $\phi_1 = 1$ and $c \neq 0$, $y_t$ is equivalent to a random walk with drift.
- When $\phi_1 < 0$, $y_t$ tends to oscillate between positive and negative values.
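The first two special cases can be verified numerically. In this sketch (the helper `ar1` is defined here for illustration), the same error sequence is fed through the AR(1) recursion, so that with $\phi_1 = 0$ the output equals the errors themselves (white noise), and with $\phi_1 = 1, c = 0$ it equals their running sum (a random walk):

```python
import random

def ar1(phi, c, n=500, seed=42):
    """Generate an AR(1) series and return it with its error sequence."""
    rng = random.Random(seed)
    y, out, errs = 0.0, [], []
    for _ in range(n):
        e = rng.gauss(0, 1)
        errs.append(e)
        y = c + phi * y + e
        out.append(y)
    return out, errs

# phi_1 = 0, c = 0: each y_t is just the error e_t, i.e. white noise.
wn, errs = ar1(phi=0.0, c=0.0)
assert all(abs(y - e) < 1e-12 for y, e in zip(wn, errs))

# phi_1 = 1, c = 0: y_t is the cumulative sum of the errors, a random walk.
rw, errs = ar1(phi=1.0, c=0.0)
cum = 0.0
for y, e in zip(rw, errs):
    cum += e
    assert abs(y - cum) < 1e-9
```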
We normally restrict autoregressive models to stationary data, and then some constraints on the values of the parameters are required.
- For an AR(1) model: $-1 < \phi_1 < 1$.
- For an AR(2) model: $-1 < \phi_2 < 1$, $\phi_1 + \phi_2 < 1$, $\phi_2 - \phi_1 < 1$.
When $p \ge 3$, the restrictions are much more complicated. R takes care of these restrictions when estimating a model.
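For the AR(2) case, the three constraints above can be checked directly. A minimal sketch (the function name `ar2_is_stationary` is an assumption made here, not an R or book function):

```python
def ar2_is_stationary(phi1, phi2):
    """Check the AR(2) stationarity constraints:
    -1 < phi_2 < 1, phi_1 + phi_2 < 1, phi_2 - phi_1 < 1."""
    return abs(phi2) < 1 and phi1 + phi2 < 1 and phi2 - phi1 < 1

# The parameters used for Figure 8.5-style AR(2) simulations satisfy them:
assert ar2_is_stationary(0.8, -0.3)
# This pair violates phi_1 + phi_2 < 1, so the model is non-stationary:
assert not ar2_is_stationary(0.6, 0.5)
```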