A finite-order moving-average process of order q, denoted by MA(q), can be expressed as

$$Y_t = \mu + \epsilon_t + \theta_1 \epsilon_{t-1} + \theta_2 \epsilon_{t-2} + \cdots + \theta_q \epsilon_{t-q},$$

where the random variables $\epsilon_t$ are white noise, with $\mathrm{E}[\epsilon_t] = 0$ and $\mathrm{Var}(\epsilon_t) = \sigma_\epsilon^2$. These variables play the role of random shocks and drive the process. It is fairly easy to see that the process is weakly stationary. A first observation is that expected value and variance are constant:

$$\mathrm{E}[Y_t] = \mu, \qquad \mathrm{Var}(Y_t) = \sigma_\epsilon^2 \left( 1 + \theta_1^2 + \theta_2^2 + \cdots + \theta_q^2 \right).$$
The calculation of the autocovariance is a bit more involved, but we may take advantage of the lack of correlation of white noise:

$$C(k) = \mathrm{Cov}(Y_t, Y_{t+k}) = \begin{cases} \sigma_\epsilon^2 \left( \theta_k + \theta_1 \theta_{k+1} + \cdots + \theta_{q-k} \theta_q \right), & k = 1, 2, \ldots, q, \\ 0, & k > q. \end{cases}$$
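To make the role of uncorrelated shocks explicit, here is a sketch of the key step, using the convention $\theta_0 \equiv 1$: when the covariance of the two sums of shocks is expanded, only the terms with matching shocks survive, since $\mathrm{E}[\epsilon_s \epsilon_t] = 0$ for $s \neq t$. For a lag $k \le q$,

$$C(k) = \mathrm{Cov}\Big( \sum_{i=0}^{q} \theta_i \epsilon_{t-i},\ \sum_{j=0}^{q} \theta_j \epsilon_{t+k-j} \Big) = \sum_{i=0}^{q} \sum_{j=0}^{q} \theta_i \theta_j \, \mathrm{E}[\epsilon_{t-i} \epsilon_{t+k-j}] = \sigma_\epsilon^2 \sum_{i=0}^{q-k} \theta_i \theta_{i+k},$$

because the expectation vanishes unless $j = i + k$. For $k > q$ no pair of indices can match, which is why the autocovariance cuts off.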
Fig. 11.9 Sample path and corresponding SACF for the moving-average process of Example 11.9.
As a consequence, the autocorrelation function is

$$\rho(k) = \begin{cases} \dfrac{\theta_k + \theta_1 \theta_{k+1} + \cdots + \theta_{q-k} \theta_q}{1 + \theta_1^2 + \cdots + \theta_q^2}, & k = 1, 2, \ldots, q, \\ 0, & k > q. \end{cases} \qquad (11.30)$$
Thus, the autocorrelation function depends only on the lag k. We also notice that the autocorrelation function cuts off for lags larger than the order of the process. This makes sense, since the process $Y_t$ is a moving average of the driving white noise process $\epsilon_t$. Hence, by checking whether the sample autocorrelation function cuts off after some time lag, we may figure out whether a time series can be modeled as a moving average, as well as its order q. Of course, the sample autocorrelation will not be exactly zero for k > q; nevertheless, by using the autocorrelogram and its significance bands, we may get some clue.
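The theoretical autocorrelation function of Eq. (11.30) is easy to compute numerically, which makes the cutoff property concrete. The following is a minimal sketch; `ma_acf` is a hypothetical helper written for this illustration, not part of any standard library.

```python
import numpy as np

def ma_acf(theta, max_lag):
    """Theoretical ACF of an MA(q) process with coefficients
    theta = [theta_1, ..., theta_q], following Eq. (11.30)."""
    psi = np.concatenate(([1.0], np.asarray(theta, dtype=float)))  # psi_0 = 1
    q = len(psi) - 1
    denom = np.sum(psi ** 2)           # 1 + theta_1^2 + ... + theta_q^2
    rho = []
    for k in range(1, max_lag + 1):
        if k > q:
            rho.append(0.0)            # cutoff: exactly zero beyond lag q
        else:
            rho.append(np.sum(psi[:q - k + 1] * psi[k:]) / denom)
    return np.array(rho)

# MA(2) with illustrative coefficients theta_1 = 0.9, theta_2 = 0.5
print(ma_acf([0.9, 0.5], max_lag=4))   # rho(3) and rho(4) are exactly zero
```

Note that `rho(1)` and `rho(2)` are nonzero while all later lags vanish, exactly the cutoff pattern we look for in a sample autocorrelogram.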
Example 11.9 Let us consider a simple MA(1) process

$$Y_t = \mu + \epsilon_t + \theta_1 \epsilon_{t-1}, \qquad \theta_1 > 0,$$

where $\epsilon_t$ is a sequence of uncorrelated standard normal variables (Gaussian white noise). In Fig. 11.9 we show a sample path obtained by Monte Carlo simulation, and the corresponding sample autocorrelogram. The sample autocorrelation looks significant at time lag 1, which is expected, given the nature of the process. Note that, by applying Eq. (11.30), we find that the autocorrelation function for an MA(1) process is

$$\rho(1) = \frac{\theta_1}{1 + \theta_1^2}, \qquad \rho(k) = 0 \text{ for } k \ge 2.$$
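A Monte Carlo experiment of this kind is easy to replicate. The sketch below simulates an MA(1) path and estimates the lag-1 and lag-2 sample autocorrelations; the coefficient value $\theta_1 = 0.8$ is an illustrative assumption, and `sacf` is a hypothetical helper implementing the standard sample estimator.

```python
import numpy as np

rng = np.random.default_rng(42)
theta1 = 0.8                        # illustrative positive coefficient (assumption)
n = 10_000
eps = rng.standard_normal(n + 1)    # Gaussian white noise
y = eps[1:] + theta1 * eps[:-1]     # MA(1) path, taking mu = 0 for simplicity

def sacf(x, k):
    """Sample autocorrelation at lag k (simple estimator)."""
    x = x - x.mean()
    return np.dot(x[:-k], x[k:]) / np.dot(x, x)

print(sacf(y, 1))   # close to theta1 / (1 + theta1**2) = 0.4878...
print(sacf(y, 2))   # close to 0, consistent with the cutoff after lag 1
```

With a long path such as this one, the lag-1 estimate lands near the theoretical value, while later lags hover inside the significance bands.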
Fig. 11.10 Sample path and corresponding SACF for an MA(1) moving-average process with a negative coefficient θ1.
Figure 11.10 shows the sample path and autocorrelogram of a slightly different MA(1) process:

$$Y_t = \mu + \epsilon_t + \theta_1 \epsilon_{t-1}, \qquad \theta_1 < 0.$$

The change in sign of θ1 has an effect on the sample path: an upswing tends to be followed by a downswing, and vice versa. The autocorrelogram shows a cutoff after time lag 1, and a negative autocorrelation.
If we increase the order of the process, we should expect more significant autocorrelations. In Fig. 11.11, we repeat the exercise for the MA(2) process

$$Y_t = \mu + \epsilon_t + \theta_1 \epsilon_{t-1} + \theta_2 \epsilon_{t-2}.$$
We notice that, in this case, the autocorrelation function cuts off after time lag k = 2.
We should mention that sample autocorrelograms are a statistical tool: since they are estimated from a finite sample path, it may well be the case that, for the moving-average processes in the example, another simulation run yields a different picture. This is a useful experiment to carry out with the help of statistical software.
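The experiment can be sketched as follows: simulate an MA(2) path, compute the sample ACF over several lags, and flag the lags that fall outside the approximate 95% significance band $\pm 1.96/\sqrt{n}$. The coefficient values and the helper `sample_acf` are illustrative assumptions; changing the seed shows how much the picture varies from run to run.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_acf(x, max_lag):
    """Sample autocorrelations at lags 1..max_lag (simple estimator)."""
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[:-k], x[k:]) / denom
                     for k in range(1, max_lag + 1)])

n = 2_000
theta = [0.9, 0.5]                  # illustrative MA(2) coefficients (assumption)
q = len(theta)
eps = rng.standard_normal(n + q)    # Gaussian white noise
y = eps[q:].copy()                  # Y_t = eps_t + theta_1 eps_{t-1} + theta_2 eps_{t-2}
for j, th in enumerate(theta, start=1):
    y += th * eps[q - j : q - j + n]

band = 1.96 / np.sqrt(n)            # approximate 95% significance band
acf = sample_acf(y, 10)
for k, r in enumerate(acf, start=1):
    flag = "*" if abs(r) > band else " "
    print(f"lag {k:2d}: {r:+.3f} {flag}")
```

Typically, lags 1 and 2 are flagged as significant while later lags are not, but on any single run a spurious lag may cross the band; this is exactly the statistical caveat above.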