Category: Time Series Models
-
Using time series models for forecasting
Time series models may be used for forecasting purposes. As usual, we should find not only a point forecast, but also a prediction interval. Given an information set consisting of observations up to Yt, we wish to find a forecast, made at time t, with horizon h ≥ 1, that is “best” in some well-specified sense. A reasonable criterion…
-
ARMA and ARIMA processes
Autoregressive and moving-average processes may be merged into ARMA (autoregressive moving-average) processes like:

Yt = δ + φ1·Yt−1 + … + φp·Yt−p + εt + θ1·εt−1 + … + θq·εt−q.

The model above is referred to as an ARMA(p, q) process, for self-explanatory reasons. Conditions ensuring stationarity have been developed for ARMA processes, as well as identification and estimation procedures. Clearly, the ARMA modeling framework affords us plenty of opportunities to fit historical…
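A minimal sketch of what such a process looks like in practice, simulating an ARMA(1,1) path with illustrative coefficients (the values φ = 0.6, θ = 0.3, σ = 1 are assumptions, not from the text); for |φ| < 1 the process is stationary and its sample mean should hover near zero:

```python
import numpy as np

def simulate_arma11(phi, theta, sigma, n, seed=0):
    """Simulate the zero-mean ARMA(1,1) process
    Y_t = phi*Y_{t-1} + eps_t + theta*eps_{t-1}."""
    rng = np.random.default_rng(seed)
    eps = rng.normal(0.0, sigma, n)
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = phi * y[t - 1] + eps[t] + theta * eps[t - 1]
    return y

y = simulate_arma11(phi=0.6, theta=0.3, sigma=1.0, n=10_000)
# Stationarity (|phi| < 1) implies a constant mean of zero;
# the sample mean of a long path should be close to it.
print(round(float(y.mean()), 2))
```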
-
Autoregressive Processes
In finite-order moving-average processes, only a finite number of past realizations of white noise influence the value of Yt. This may be a limitation for those processes in which all of the previous realizations have an effect, even though that effect possibly fades over time. This consideration led us from forecasting using simple moving averages to exponential…
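A quick numerical illustration of this fading-memory idea, with assumed values (φ = 0.7 is not from the text): unrolling the AR(1) recursion Yt = φ·Yt−1 + εt gives Yt as a weighted sum of all past shocks, with geometrically decaying weights φ^k, so the two forms coincide:

```python
import numpy as np

phi, n = 0.7, 200
rng = np.random.default_rng(42)
eps = rng.normal(size=n)

# Recursive (autoregressive) form
y_ar = np.empty(n)
y_ar[0] = eps[0]
for t in range(1, n):
    y_ar[t] = phi * y_ar[t - 1] + eps[t]

# Unrolled (moving-average) form: every past shock contributes,
# with weight phi**k that fades as the shock recedes into the past.
y_ma = np.array([sum(phi ** k * eps[t - k] for k in range(t + 1))
                 for t in range(n)])

print(bool(np.allclose(y_ar, y_ma)))  # True: the two forms agree
```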
-
Moving-average processes
A finite-order moving-average process of order q, denoted by MA(q), can be expressed as

Yt = μ + εt + θ1·εt−1 + … + θq·εt−q,

where the random variables εt are white noise, with E[εt] = 0 and Var(εt) = σ². These variables play the role of random shocks and drive the process. It is fairly easy to see that the process is weakly stationary. A first observation is that expected value and variance are constant:

E[Yt] = μ,    Var(Yt) = σ²·(1 + θ1² + … + θq²).

The calculation of…
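As a numerical sanity check of these constant moments (the parameter values below are illustrative, not from the text), one can simulate a long MA(1) path and compare the sample mean and variance with μ and σ²·(1 + θ²):

```python
import numpy as np

# MA(1): Y_t = mu + eps_t + theta*eps_{t-1}; theory says E[Y_t] = mu and
# Var(Y_t) = sigma^2 * (1 + theta^2), independent of t.
mu, theta, sigma, n = 5.0, 0.4, 2.0, 500_000
rng = np.random.default_rng(1)
eps = rng.normal(0.0, sigma, n + 1)
y = mu + eps[1:] + theta * eps[:-1]

print(round(float(y.mean()), 2))  # close to mu = 5.0
print(round(float(y.var()), 2))   # close to 4.0 * (1 + 0.16) = 4.64
```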
-
A GLANCE AT ADVANCED TIME SERIES MODELING
The class of exponential smoothing methods was born out of heuristic intuition, even though methodological frameworks were later developed to provide them with a somewhat more solid justification. Despite these efforts, exponential smoothing methods do suffer from at least a couple of drawbacks: It is also worth noting that simple linear regression models share some…
-
Smoothing with trend and multiplicative seasonality
The last exponential smoothing approach we consider puts everything together and copes with additive trend and multiplicative seasonality. The Holt–Winters method is based on Eq. (11.13), which we repeat for convenience: The overall scheme uses three smoothing coefficients, and it proceeds as follows. All of the remarks we have made about simpler versions of exponential smoothing apply here…
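A minimal sketch of one such three-coefficient scheme, following the standard Holt–Winters updates for additive trend and multiplicative seasonality (the coefficient values, the toy data, and the crude initialization are all assumptions for illustration, not the text's own example):

```python
def holt_winters_step(y_t, level, trend, seasonals, t, s, alpha, beta, gamma):
    """One Holt-Winters update: deseasonalized level, smoothed trend,
    and a refreshed seasonal factor for the current bucket."""
    prev_level = level
    level = alpha * y_t / seasonals[t % s] + (1 - alpha) * (prev_level + trend)
    trend = beta * (level - prev_level) + (1 - beta) * trend
    seasonals[t % s] = gamma * y_t / level + (1 - gamma) * seasonals[t % s]
    return level, trend, seasonals

# Toy quarterly series with upward trend and a seasonal pattern
data = [10, 20, 15, 25, 12, 24, 18, 30, 14, 28, 21, 35]
s = 4
level, trend = data[0], 1.0          # crude initialization for the sketch
seasonals = [1.0] * s
for t, y in enumerate(data):
    level, trend, seasonals = holt_winters_step(y, level, trend, seasonals,
                                                t, s, 0.5, 0.3, 0.2)

# h-step-ahead forecast: extrapolated level times the matching seasonal factor
forecast = (level + trend) * seasonals[len(data) % s]
print(round(forecast, 1))
```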
-
Smoothing with multiplicative seasonality
In this section we consider the case of pure seasonality. Forecasts are based on the demand model of Eq. (11.13), in which the trend parameter is set to zero; here s is the length of the seasonal cycle, i.e., a whole cycle consists of s time buckets. To get a grip on this model, imagine a yearly cycle consisting of 12 monthly…
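A minimal sketch of smoothing under pure multiplicative seasonality, updating a level estimate and s seasonal factors in turn (the smoothing coefficients, the cycle length s = 4, and the data are illustrative assumptions):

```python
def seasonal_smooth(data, s, alpha=0.4, gamma=0.3):
    """Exponential smoothing with multiplicative seasonality and no trend."""
    level = sum(data[:s]) / s                 # initialize level: first-cycle mean
    factors = [y / level for y in data[:s]]   # initial seasonal factors
    for t in range(s, len(data)):
        y = data[t]
        # Deseasonalize the new observation before mixing it into the level
        level = alpha * y / factors[t % s] + (1 - alpha) * level
        # Refresh the factor for this bucket using the updated level
        factors[t % s] = gamma * y / level + (1 - gamma) * factors[t % s]
    return level, factors

# Two cycles of length 4, with a stable seasonal pattern
level, factors = seasonal_smooth([8, 12, 16, 12, 9, 13, 17, 13], s=4)
print(round(level, 2), [round(f, 2) for f in factors])
```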
-
Smoothing with trend
Demand may exhibit additive trend components that, in a static case, could be represented by the following demand model:

Yt = B + T·t + εt,

where B is the level and T is the trend. Looking at the demand model, linear regression seems a natural candidate to estimate these two parameters. However, level and trend might change over time, suggesting the opportunity of a dynamic…
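A minimal sketch of such a dynamic scheme, in the spirit of Holt's double exponential smoothing: the level and trend estimates are each updated by mixing new evidence with the previous estimate (the coefficients, the crude initialization, and the data are illustrative assumptions):

```python
def holt_smooth(data, alpha=0.5, beta=0.3):
    """Double exponential smoothing: track a time-varying level B and trend T."""
    level, trend = data[0], data[1] - data[0]   # crude initialization
    for y in data[2:]:
        prev_level = level
        # New level mixes the observation with the trend-projected old level
        level = alpha * y + (1 - alpha) * (prev_level + trend)
        # New trend mixes the observed level change with the old trend
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return level, trend

level, trend = holt_smooth([3.0, 5.1, 6.8, 9.2, 11.1, 12.9])
forecast_h2 = level + 2 * trend                 # two-step-ahead forecast
print(round(forecast_h2, 2))
```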
-
Stationary demand: initialization and choice of α
One obviously weird feature of Eq. (11.18) is that it involves an infinite sequence of observations. However, in real life we do not have an infinite number of observations; the sum must be truncated somewhere in the past, right before we started collecting information. The oldest term in the average, in practice, corresponds to the initialization of the algorithm. To…
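The effect of truncation is easy to quantify: the smoothed estimate weights the observation k periods old by α·(1 − α)^k, so the total weight left on everything older than N observations, and hence on the initialization, is (1 − α)^N. A small sketch (α = 0.2 and N = 30 are illustrative values):

```python
alpha, N = 0.2, 30

# Weights on the N most recent observations: alpha * (1 - alpha)**k
weights = [alpha * (1 - alpha) ** k for k in range(N)]

# All remaining weight falls on the truncated tail, i.e. on the initial value
tail = (1 - alpha) ** N

print(round(sum(weights) + tail, 10))  # the weights sum to 1
print(round(tail, 4))                  # weight carried by the initialization
```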
-
Stationary demand: three views of a smoother
In this section, we deal with the case of stationary demand, as represented by Eq. (11.14). In simple exponential smoothing we estimate the level parameter Bt by a mix of new and old information:

Bt = α·Yt + (1 − α)·Bt−1,    (11.16)

where α is a coefficient in the interval [0, 1]. In (11.16), the new information consists of the last observation of demand Yt, and the old information consists of…
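A minimal sketch of this recursion (α = 0.3, the demand figures, and the choice of initializing with the first observation are all illustrative assumptions):

```python
def smooth(demand, alpha=0.3):
    """Simple exponential smoothing: mix each new observation (weight alpha)
    with the previous level estimate (weight 1 - alpha)."""
    b = demand[0]                    # initialize with the first observation
    for y in demand[1:]:
        b = alpha * y + (1 - alpha) * b
    return b

print(round(smooth([100, 106, 98, 103, 110]), 2))  # 103.95 with these numbers
```

Note how the final estimate sits between the last observation (110) and the older history: α controls how quickly the level estimate chases new demand.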