# 8.3 Autoregressive models

In a multiple regression model, we forecast the variable of interest using a linear combination of predictors. In an autoregression model, we forecast the variable of interest using a linear combination of past values of the variable. The term autoregression indicates that it is a regression of the variable against itself.

Thus an autoregressive model of order $p$ can be written as

$$y_t = c + \phi_1 y_{t-1} + \phi_2 y_{t-2} + \dots + \phi_p y_{t-p} + \varepsilon_t,$$

where $\varepsilon_t$ is white noise. This is like a multiple regression but with lagged values of $y_t$ as predictors. We refer to this as an AR($p$) model, an autoregressive model of order $p$.
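To make the recursion concrete, an AR($p$) process can be simulated directly from the equation above. This is a minimal sketch (not from the text); the function name and parameters are illustrative, and NumPy is assumed.

```python
import numpy as np

def simulate_ar(c, phis, n, sigma=1.0, seed=0):
    """Simulate n observations from an AR(p) process
    y_t = c + phi_1*y_{t-1} + ... + phi_p*y_{t-p} + e_t,
    where e_t is Gaussian white noise with standard deviation sigma."""
    rng = np.random.default_rng(seed)
    p = len(phis)
    y = np.zeros(n + p)                 # first p entries are zero start-up values
    for t in range(p, n + p):
        lagged = y[t - p:t][::-1]       # y_{t-1}, ..., y_{t-p}
        y[t] = c + np.dot(phis, lagged) + rng.normal(0.0, sigma)
    return y[p:]                        # drop the start-up values

# Example: 200 observations from an AR(1) with c = 2, phi_1 = 0.5.
# The stationary mean is c / (1 - phi_1) = 4.
series = simulate_ar(c=2.0, phis=[0.5], n=200)
```

The start-up values are set to zero for simplicity; in practice one would discard an initial burn-in segment so the series settles around its stationary mean.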

Autoregressive models are remarkably flexible at handling a wide range of different time series patterns. The two series in Figure 8.5 show series from an AR(1) model and an AR(2) model. Changing the parameters results in different time series patterns. The variance of the error term will only change the scale of the series, not the patterns.

Figure 8.5: Two examples of data from autoregressive models with different parameters. Left: AR(1) with $y_t = 18 - 0.8y_{t-1} + \varepsilon_t$. Right: AR(2) with $y_t = 8 + 1.3y_{t-1} - 0.7y_{t-2} + \varepsilon_t$. In both cases, $\varepsilon_t$ is normally distributed white noise with mean zero and variance one.
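The two series in Figure 8.5 can be reproduced (up to the random errors) with a short simulation. The helper below is an illustrative sketch assuming NumPy, not code from the text; note how the negative $\phi_1$ in the left panel produces oscillation around the mean.

```python
import numpy as np

rng = np.random.default_rng(1)

def ar_series(c, phis, n=100, burn=50):
    """Simulate an AR(p) series with standard normal errors,
    discarding `burn` start-up observations."""
    p = len(phis)
    y = np.zeros(n + burn + p)
    for t in range(p, len(y)):
        y[t] = c + np.dot(phis, y[t - p:t][::-1]) + rng.normal()
    return y[p + burn:]

left = ar_series(18.0, [-0.8])         # AR(1): y_t = 18 - 0.8 y_{t-1} + e_t
right = ar_series(8.0, [1.3, -0.7])    # AR(2): y_t = 8 + 1.3 y_{t-1} - 0.7 y_{t-2} + e_t
```

The left series fluctuates around its mean $18/(1+0.8) = 10$ and alternates sign from one observation to the next, while the right series shows smoother, pseudo-cyclic behaviour around its mean $8/(1 - 1.3 + 0.7) = 20$.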

For an AR(1) model:

• When $\phi_1 = 0$, $y_t$ is equivalent to white noise.
• When $\phi_1 = 1$ and $c = 0$, $y_t$ is equivalent to a random walk.
• When $\phi_1 = 1$ and $c \ne 0$, $y_t$ is equivalent to a random walk with drift.
• When $\phi_1 < 0$, $y_t$ tends to oscillate between positive and negative values.
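These special cases are easy to see in a simulation sketch (the variable names are illustrative, not from the text):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500
e = rng.normal(size=n)      # white noise errors

# phi_1 = 0: y_t = c + e_t is just white noise around c.
white = 0.0 + e

# phi_1 = 1, c = 0: y_t = y_{t-1} + e_t is a random walk
# (a cumulative sum of the errors).
walk = np.cumsum(e)

# phi_1 = 1, c != 0: y_t = c + y_{t-1} + e_t is a random walk
# with drift c per step.
drift = np.cumsum(0.5 + e)

# phi_1 < 0: successive values tend to alternate in sign around the mean.
osc = np.zeros(n)
for t in range(1, n):
    osc[t] = -0.8 * osc[t - 1] + e[t]
```

The white-noise series has essentially no autocorrelation, the random walk wanders with variance growing over time, the drift series trends upward at roughly $0.5$ per step, and the oscillating series has strongly negative lag-1 autocorrelation.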

We normally restrict autoregressive models to stationary data, and then some constraints on the values of the parameters are required.

• For an AR(1) model: $-1 < \phi_1 < 1$.
• For an AR(2) model: $-1 < \phi_2 < 1$, $\phi_1 + \phi_2 < 1$, $\phi_2 - \phi_1 < 1$.

When $p \ge 3$, the restrictions are much more complicated. R takes care of these restrictions when estimating a model.
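As an illustration (a sketch; the helper name is hypothetical), the AR(2) constraints can be checked directly for given parameter values:

```python
def ar2_is_stationary(phi1, phi2):
    """Check the AR(2) stationarity constraints:
    -1 < phi_2 < 1,  phi_1 + phi_2 < 1,  phi_2 - phi_1 < 1."""
    return -1 < phi2 < 1 and phi1 + phi2 < 1 and phi2 - phi1 < 1

# The AR(2) parameters from Figure 8.5 satisfy all three constraints:
# phi_2 = -0.7 is in (-1, 1), phi_1 + phi_2 = 0.6 < 1, phi_2 - phi_1 = -2.0 < 1.
print(ar2_is_stationary(1.3, -0.7))   # True

# A unit-root case with phi_1 + phi_2 = 1 violates the second constraint:
print(ar2_is_stationary(1.5, -0.5))   # False
```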
