ARMA and ARIMA processes

Autoregressive and moving-average processes may be merged into ARMA (autoregressive moving-average) processes like:

$$ Y_t = \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \cdots + \phi_p Y_{t-p} + \epsilon_t + \theta_1 \epsilon_{t-1} + \theta_2 \epsilon_{t-2} + \cdots + \theta_q \epsilon_{t-q} \tag{11.35} $$

The model above is referred to as an ARMA(p, q) process, for self-explanatory reasons. Conditions ensuring stationarity have been developed for ARMA processes, along with identification and estimation procedures. Clearly, the ARMA modeling framework affords us plenty of opportunities to fit historical data. However, it applies only to stationary data, and it is not difficult to find real-life examples of nonstationary data processes. Just think of stock market indices; most investors really hope that such processes are not stationary.
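To make the model concrete, here is a minimal sketch of simulating an ARMA(1,1) process with standard normal white noise; the function name, parameter values, seed, and sample size are illustrative choices, not taken from the text.

```python
import random

def simulate_arma11(phi, theta, n, seed=42):
    """Simulate Y_t = phi*Y_{t-1} + eps_t + theta*eps_{t-1} with N(0,1) noise."""
    rng = random.Random(seed)
    y_prev, eps_prev = 0.0, 0.0
    path = []
    for _ in range(n):
        eps = rng.gauss(0.0, 1.0)
        y = phi * y_prev + eps + theta * eps_prev  # AR part + MA part
        path.append(y)
        y_prev, eps_prev = y, eps
    return path

# with |phi| < 1 the simulated path fluctuates around zero
path = simulate_arma11(phi=0.6, theta=0.3, n=500)
```

With |phi| < 1 the autoregressive part is stable, so the simulated path hovers around its mean instead of drifting away.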


Fig. 11.15 A sample path and the corresponding SACF for the random walk Yt = Yt−1 + 0.05ηt.

Example 11.12 (A nonstationary random walk) A quite common building block in many financial models is the random walk. An example of random walk is

$$ Y_t = Y_{t-1} + 0.05\,\eta_t \tag{11.36} $$

where ηt is a sequence of independent standard normal random variables. This is actually an AR process, but from Section 11.6.2 we know that it is nonstationary, as the autoregressive coefficient is φ = 1. A sample path of the process is shown in Fig. 11.15. In this figure, the nonstationarity of the process is pretty evident, but this need not always be the case. In Fig. 11.16 we show another sample path for the same process. A subjective comparison of the two sample paths would not suggest that they are just two realizations of the same stochastic process. However, the two autocorrelograms show a common pattern: Autocorrelation fades out slowly. Indeed, this is a common feature of nonstationary processes. Figure 11.17 shows the SPACF for the second sample path. We see a very strong partial autocorrelation at lag 1, which cuts off immediately. Again, this is a pattern corresponding to the process described by Eq. (11.36).
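The slow decay of the SACF can be checked numerically. The following sketch simulates the random walk of Eq. (11.36) and estimates its sample autocorrelations; the seed, path length, and maximum lag are arbitrary illustrative choices.

```python
import random

def sample_acf(y, max_lag):
    """Sample autocorrelations at lags 1..max_lag."""
    n = len(y)
    mean = sum(y) / n
    c0 = sum((v - mean) ** 2 for v in y) / n  # sample variance (lag-0 autocovariance)
    return [
        sum((y[t] - mean) * (y[t + k] - mean) for t in range(n - k)) / n / c0
        for k in range(1, max_lag + 1)
    ]

rng = random.Random(0)
walk = [0.0]
for _ in range(999):
    walk.append(walk[-1] + 0.05 * rng.gauss(0.0, 1.0))  # Y_t = Y_{t-1} + 0.05*eta_t

# autocorrelations stay close to 1 and fade out slowly,
# the telltale pattern of a nonstationary process
acf = sample_acf(walk, 10)
```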

Since the theory of stationary MA and AR processes is well developed, it would be nice to find a way to apply it to nonstationary processes as well. A commonly used trick to remove nonstationarity in a time series is differencing, by which we consider the time series

$$ Y'_t = Y_t - Y_{t-1}, \qquad t = 2, 3, \ldots \tag{11.37} $$

Fig. 11.16 Another sample path and the corresponding SACF for the random walk Yt = Yt−1 + 0.05ηt.


Fig. 11.17 Sample partial autocorrelation function for the random walk Yt = Yt−1 + 0.05ηt.

Applying differencing to the sample path of Fig. 11.15 results in the sample path and SACF illustrated in Fig. 11.18. The shape of the SACF is not surprising, since the differenced process is just white noise.
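A quick numerical check confirms this: differencing the simulated random walk leaves the white-noise increments, whose autocorrelation is negligible. The seed and sample size below are arbitrary.

```python
import random

rng = random.Random(0)
walk = [0.0]
for _ in range(1999):
    walk.append(walk[-1] + 0.05 * rng.gauss(0.0, 1.0))  # nonstationary random walk

# first difference: Y'_t = Y_t - Y_{t-1}, which here is just 0.05*eta_t
diffed = [walk[t] - walk[t - 1] for t in range(1, len(walk))]

# lag-1 sample autocorrelation of the differenced series
n = len(diffed)
mean = sum(diffed) / n
c0 = sum((v - mean) ** 2 for v in diffed) / n
acf1 = sum((diffed[t] - mean) * (diffed[t + 1] - mean) for t in range(n - 1)) / n / c0
# acf1 is close to zero: the differenced series behaves like white noise
```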

Example 11.13 (What is nonstationarity, anyway?) A time series with trend

$$ Y_t = a + b\,t + \epsilon_t \tag{11.38} $$

where εt is white noise, is clearly nonstationary and features a deterministic trend. A little digression is in order to clarify the nature of nonstationarity in a random walk


Fig. 11.18 The effect of differencing on the sample random walk of Fig. 11.15.

$$ Y_t = Y_{t-1} + \epsilon_t \tag{11.39} $$

The sample paths in Example 11.12 show that the random walk does not feature a deterministic trend. Recursive unfolding of Eq. (11.39) results in

$$ Y_t = Y_0 + \sum_{k=1}^{t} \epsilon_k $$

Therefore

$$ E[Y_t] = Y_0, \qquad \mathrm{Var}(Y_t) = t\,\sigma_\epsilon^2 $$

The expected value is constant, but the variance grows linearly with time.
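This can be verified with a small Monte Carlo experiment: simulating many random-walk paths with standard normal noise, the sample mean of the terminal value stays near Y0 = 0 while the sample variance is near T·σ² = T. The path count, horizon, and seed below are arbitrary illustrative choices.

```python
import random

rng = random.Random(3)
n_paths, horizon = 5000, 100
finals = []
for _ in range(n_paths):
    y = 0.0
    for _ in range(horizon):
        y += rng.gauss(0.0, 1.0)  # Y_t = Y_{t-1} + eps_t, sigma = 1
    finals.append(y)

mean = sum(finals) / n_paths
var = sum((v - mean) ** 2 for v in finals) / n_paths
# mean stays near Y_0 = 0, while the variance is near T * sigma^2 = 100
```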

Hence, we must have a different kind of nonstationarity in the random walk of Eq. (11.39) than in the process described by Eq. (11.38). To investigate the matter, let us consider the expected value of the increment Yt − Yt−1, conditional on Yt−1:

$$ E[Y_t - Y_{t-1} \mid Y_{t-1}] = E[\epsilon_t] = 0 $$

Therefore, given the last observation Yt−1, we cannot predict whether the time series will move up or down. Now, let us consider a stationary AR(1) process

$$ Y_t = \phi\, Y_{t-1} + \epsilon_t $$

where φ ∈ (−1, 1). The increment in this case is

$$ Y_t - Y_{t-1} = (\phi - 1)\, Y_{t-1} + \epsilon_t $$

Since (φ − 1) < 0, we have

$$ E[Y_t - Y_{t-1} \mid Y_{t-1}] = (\phi - 1)\, Y_{t-1}, $$

which is negative when Yt−1 is above the process mean (zero, in this case) and positive when it is below.

This suggests that a stationary AR(1) process is mean reverting, in the sense that the process tends to return to its expected value; the nonstationary random walk does not enjoy this property.
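Mean reversion can be checked empirically. The sketch below simulates a stationary AR(1) process with φ = 0.8 (an arbitrary choice) and compares the average increment when the process is above versus below its mean of zero.

```python
import random

rng = random.Random(1)
phi = 0.8
y = [0.0]
for _ in range(20000):
    y.append(phi * y[-1] + rng.gauss(0.0, 1.0))  # stationary AR(1)

# increments, split by whether the previous value was above or below the mean
inc_above = [y[t] - y[t - 1] for t in range(1, len(y)) if y[t - 1] > 0]
inc_below = [y[t] - y[t - 1] for t in range(1, len(y)) if y[t - 1] < 0]
avg_above = sum(inc_above) / len(inc_above)  # expected to be negative
avg_below = sum(inc_below) / len(inc_below)  # expected to be positive
```

Above the mean the process tends to move down, below it the process tends to move up; the random walk (φ = 1) exhibits no such pull.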

If we introduce the backshift operator B, defined by

$$ B\, Y_t = Y_{t-1}, $$

we may express the first difference in Eq. (11.37) as

$$ Y'_t = Y_t - Y_{t-1} = (1 - B)\, Y_t $$

Sometimes, differencing must be repeated in order to obtain a stationary time series. We obtain second-order differencing by repeated application of (first-order) differencing:

$$ Y''_t = (1 - B)^2\, Y_t = (1 - 2B + B^2)\, Y_t = Y_t - 2Y_{t-1} + Y_{t-2} $$
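The second-difference identity can be checked numerically: differencing twice matches the expansion Yt − 2Yt−1 + Yt−2 term by term, and it reduces a quadratic trend to a constant. The series below is an illustrative choice.

```python
# quadratic trend: Y_t = t^2, whose second difference is the constant 2
y = [float(t * t) for t in range(10)]

d1 = [y[t] - y[t - 1] for t in range(1, len(y))]        # (1 - B) Y
d2 = [d1[t] - d1[t - 1] for t in range(1, len(d1))]     # (1 - B)^2 Y, by repetition
d2_direct = [y[t] - 2 * y[t - 1] + y[t - 2] for t in range(2, len(y))]  # expanded form
# d2 == d2_direct, and every entry equals 2.0
```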

This suggests that we may formally apply the algebra of polynomials to the backshift operator, in order to find differences of arbitrary order. By introducing polynomials

$$ \Phi(B) = 1 - \phi_1 B - \phi_2 B^2 - \cdots - \phi_p B^p, \qquad \Theta(B) = 1 + \theta_1 B + \theta_2 B^2 + \cdots + \theta_q B^q, $$

we may rewrite the ARMA model of Eq. (11.35) in the compact form

$$ \Phi(B)\, Y_t = \Theta(B)\, \epsilon_t $$

The class of stationary ARMA models may be extended to allow for nonstationarity, yielding the more general class of ARIMA (autoregressive integrated moving average) processes, also known as Box–Jenkins models. An ARIMA(p, d, q) process can be represented as follows:

$$ \Phi(B)\, (1 - B)^d\, Y_t = \Theta(B)\, \epsilon_t, $$

where Φ(B) and Θ(B) are polynomials of order p and q, respectively, and d is a differencing order such that the differenced process (1 − B)d Yt is stationary, whereas differencing of order (d − 1) still leaves a nonstationary process. The name “integrated” stems from the fact that the nonstationary process is obtained by integrating a stationary one, i.e., by undoing differencing. In most business applications the order d is 0 or 1. A full account of this class of models is beyond the scope of this book; we refer the reader to the bibliography provided at the end of the chapter, where it is also shown how Box–Jenkins models can be extended to cope with seasonality.
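The meaning of “integrated” can be illustrated with a short sketch: summing (integrating) a stationary white-noise series yields a nonstationary random walk, i.e., an ARIMA(0, 1, 0) process, and first-order differencing recovers the original series. The seed and length below are arbitrary choices.

```python
import random
from itertools import accumulate

rng = random.Random(2)
noise = [rng.gauss(0.0, 1.0) for _ in range(1000)]  # stationary series (white noise)
walk = list(accumulate(noise))                       # integrated: nonstationary random walk

# differencing undoes the integration and recovers the stationary series
recovered = [walk[0]] + [walk[t] - walk[t - 1] for t in range(1, len(walk))]
```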
