Minh Nguyen

TIME SERIES MODELS

[Concept map (IHMC CmapTools): time series models use time series data and include smoothing methods (moving averages, simple exponential smoothing, Winter's exponential smoothing) and ARIMA. ARIMA uses the auto-correlation and partial auto-correlation functions, the basic tools for identification; identification involves finding the degree of differencing plus the autoregressive and moving average terms; differencing turns a non-stationary process into a stationary one for estimation; estimation generates forecasts, which can be combined.]


1. Time series methods

Time series methods or models refer to forecasting models in which no "explanatory" variables are involved. Hence, the only source of information is the past values of the variable of interest. The main objective of these methods is to forecast future values. These models include smoothing methods (moving averages, single and double exponential smoothing, Holt-Winters exponential smoothing), ARIMA (AutoRegressive Integrated Moving Average) models, and others.


2. Components of a time series

Any time series can contain some or all of the following components: 1. Trend (T) 2. Cyclical (C) 3. Seasonal (S) 4. Irregular (I)

These components may be combined in different ways. It is usually assumed that they are multiplied or added, i.e.,

yt = T × C × S × I
yt = T + C + S + I

To correct for the trend, in the multiplicative case one divides the series by the trend (T); in the additive case one subtracts it.
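As a minimal sketch of the two corrections, with made-up numbers for both the series and the fitted trend:

```python
# Hypothetical series and a hypothetical fitted trend T for the same periods.
series = [12.0, 14.5, 16.2, 19.1, 21.0, 23.8]
trend = [12.0, 14.0, 16.0, 18.0, 20.0, 22.0]

# Additive model: subtract the trend (yt = T + C + S + I).
additive_detrended = [y - t for y, t in zip(series, trend)]

# Multiplicative model: divide by the trend (yt = T x C x S x I).
multiplicative_detrended = [y / t for y, t in zip(series, trend)]
```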

Trend component The trend is the long term pattern of a time series. A trend can be positive or negative depending on whether the time series exhibits an increasing long term pattern or a decreasing long term pattern.

If a time series does not show an increasing or decreasing pattern then the series is stationary in the mean.



Cyclical component Any pattern showing an up and down movement around a given trend is identified as a cyclical pattern. The duration of a cycle depends on the type of business or industry being analyzed.

Seasonal component Seasonality occurs when the time series exhibits regular fluctuations during the same month (or months) every year, or during the same quarter every year. For instance, retail sales peak during the month of December.

Irregular component This component is unpredictable. Every time series has some unpredictable component that makes it a random variable. In prediction, the objective is to “model” all the components to the point that the only component that remains unexplained is the random component.



SMOOTHING METHODS


1. Moving averages


A moving average is a method to obtain a smoother picture of the behavior of a series. The objective of applying moving averages to a series is to eliminate the irregular component, so that the underlying process is clearer and easier to interpret.

A moving average can be calculated for the purpose of smoothing the original series, or to obtain a forecast. In the first case a “centered” moving average is calculated. In the second case, the forecast for period n is calculated with the m previous values, where m is the number of periods (the order of the moving average) that enter the calculation.

2. Simple moving average Two simple moving average processes (not centered, for forecasting purposes) of orders 3 and 5 are presented below.
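Since the worked examples did not survive in this copy, here is a small Python sketch of a (not centered) moving-average forecast of orders 3 and 5, using a hypothetical sales series:

```python
def moving_average_forecast(values, m):
    """Forecast the next period as the average of the m most recent values."""
    return sum(values[-m:]) / m

sales = [20, 21, 23, 22, 25, 26]          # hypothetical series
f3 = moving_average_forecast(sales, 3)    # order 3: (22 + 25 + 26) / 3
f5 = moving_average_forecast(sales, 5)    # order 5: (21 + 23 + 22 + 25 + 26) / 5
```

The larger the order m, the smoother the forecast series, at the cost of reacting more slowly to recent changes.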

3. Weighted moving average A weighted moving average can be produced by the repeated application of a simple moving average. For instance, for a moving average of order 3, applying the moving average again yields a weighted average of the original observations with weights 1/9, 2/9, 3/9, 2/9, 1/9.
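A quick numerical sketch of this idea on a hypothetical series: applying a centered order-3 moving average twice reproduces the weights 1/9, 2/9, 3/9, 2/9, 1/9 on the original data.

```python
def simple_ma(values, m):
    """Centered simple moving average of (odd) order m."""
    half = m // 2
    return [sum(values[i - half:i + half + 1]) / m
            for i in range(half, len(values) - half)]

x = [1.0, 2.0, 4.0, 8.0, 16.0, 32.0, 64.0]   # hypothetical series
once = simple_ma(x, 3)
twice = simple_ma(once, 3)   # equivalent weights on x: 1/9, 2/9, 3/9, 2/9, 1/9
```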


SIMPLE EXPONENTIAL SMOOTHING


Forecasts generated with this method are a weighted average of the past values of the variable. The weights decline for older observations. The rationale is that more recent observations are more influential than older observations. The forecast for period t + 1 calculated in period t is called Ft+1. Therefore, Ft is the forecast for period t calculated in period t − 1. The forecast for period t + 1 is,


Ft+1 = αAt + (1 − α)Ft

which represents a weighted average of the actual value (At) and the forecast (Ft) of the actual value (calculated at t − 1). The higher the value of alpha the more weight is given to current values (the shorter the memory of the process).
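The recursion above can be sketched in Python; the demand figures and the initial forecast f0 are hypothetical:

```python
def simple_exponential_smoothing(actuals, alpha, f0):
    """Return forecasts [F1, F2, ..., Fn+1] via F(t+1) = alpha*A(t) + (1-alpha)*F(t)."""
    forecasts = [f0]                 # F1: an initial forecast must be supplied
    for a in actuals:
        forecasts.append(alpha * a + (1 - alpha) * forecasts[-1])
    return forecasts

demand = [100, 110, 105, 115]        # hypothetical actual values A1..A4
f = simple_exponential_smoothing(demand, alpha=0.2, f0=100.0)
```

With a small alpha such as 0.2, each new observation shifts the forecast only slightly, illustrating the long memory of the process.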

FORECASTS


The forecast process The final goal of a forecast is to make decisions based on the future value(s) of some variable. The forecast process can be presented with the following outline:

1. Specify the objective.
2. Define the variable to forecast.
3. Establish the periodicity.
4. Data considerations: measurement (index, units, dollars).
5. Model selection: regression, time series models, other.
6. Model evaluation.
7. Forecast preparation.
8. Forecast presentation. Clear, nontechnical presentations are crucial in communicating the forecast results.
9. Tracking results. Tracking the performance of the forecasts helps to redefine or respecify the model, or to replace the estimation method used.

The following table presents a guideline of the different forecasting methods based on different conditions.


Combined Forecasts

When more than one forecast is available, combining them has the advantage of reducing the forecast error. Assume that two forecasts are available (say, from a linear regression model and from exponential smoothing), and denote them f1 and f2. The combined forecast fc can then be obtained as a weighted average of both forecasts. That is,

fc = w1 f1 + w2 f2, where w1 + w2 = 1

One way to select the weights w1 and w2 is to look at the past performance of both methods and assign the larger weight to the method that has performed best, for instance based on the value of the MSE (mean squared error). The weights can be made proportional to the inverse of the MSE:

w1 = (1/MSE1) / (1/MSE1 + 1/MSE2), w2 = 1 − w1
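A minimal sketch of the inverse-MSE weighting, with hypothetical forecasts and MSE values:

```python
def combine_forecasts(f1, f2, mse1, mse2):
    """Weighted average of two forecasts, weights proportional to 1/MSE."""
    w1 = (1 / mse1) / (1 / mse1 + 1 / mse2)
    w2 = 1 - w1
    return w1 * f1 + w2 * f2

# Hypothetical values: the regression model has the lower past MSE,
# so it receives the larger weight (w1 = 0.25 / 0.3125 = 0.8).
fc = combine_forecasts(f1=120.0, f2=130.0, mse1=4.0, mse2=16.0)
```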


Turning Points

Turning points - Graphical analysis

The analysis of turning points (actual and predicted) allows the manager to judge which model predicts downturns and upturns more accurately. The graphical analysis consists of plotting the actual change in the variable against the predicted change. Both changes are measured from the last observed value: the actual change is At − At−1, and the predicted (forecast) change is Ft − At−1.

The results of the forecasts exercise include a correct prediction of the direction of change (positive or negative), or a turning-point error. The possible results with the errors in forecasting changes can be categorized as follows:

1. Correct prediction of direction of change. (a) Overestimation of a positive change (b) Underestimation of a positive change (c) Overestimation of a negative change (d) Underestimation of a negative change

2. Turning-point errors. (a) Prediction of an upturn that did not occur (NT) (b) Failure to predict a downturn (TN) (c) Prediction of a downturn that did not occur (NT) (d) Failure to predict an upturn (TN)

These results on turning-point errors can be summarized in a table, as follows.

Turning-point errors


The number of correct forecasts is the sum of the main-diagonal occurrences (NN + TT). NT represents false signals and TN refers to missed turning points. The proportion of false signals can then be computed from these counts.
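The tallying of this table can be sketched as follows. The per-period turn indicators are hypothetical; the first letter of each cell refers to the actual outcome, the second to the prediction:

```python
from collections import Counter

# Hypothetical flags: did a turning point actually occur, and was one predicted?
actual_turn =    [False, False, True,  False, True,  False]
predicted_turn = [False, True,  True,  False, False, False]

cells = Counter()
for a, p in zip(actual_turn, predicted_turn):
    cells[("T" if a else "N") + ("T" if p else "N")] += 1

correct = cells["NN"] + cells["TT"]   # main diagonal
false_signals = cells["NT"]           # predicted a turn that did not occur
missed_turns = cells["TN"]            # failed to predict an actual turn
```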

Evaluation of forecasts Forecasts are evaluated using different measures based on the difference between the actual and the predicted value (the residual). Among these measures, the following are the most frequently used.

(a) MAPE: mean absolute percentage error

This method is useful when the units of measure of Yt are relatively large. (b) MSE: mean squared error


This measure is useful when managers are interested in minimizing the occurrence of a major error, since squaring magnifies large errors. Even so, one might end up selecting a model with an error pattern of 10, 1, 1, 1, 1 (MSE = (100 + 1 + 1 + 1 + 1)/5 = 20.8) rather than one with an error pattern of 5, 5, 5, 5, 5 (MSE = 25): one with a few large errors versus one with a smaller systematic error. This method also does not indicate whether the model is systematically underestimating or overestimating the actual values. (c) RMSE: root mean squared error


(d) MAD: mean absolute deviation
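The formulas for these measures did not survive in this copy, so the sketch below uses the standard textbook definitions (errors et = At − Ft, averaged over n periods), with hypothetical data:

```python
import math

def error_measures(actual, forecast):
    """MAPE, MSE, RMSE, and MAD from actual and forecast values."""
    errors = [a - f for a, f in zip(actual, forecast)]
    n = len(errors)
    mape = 100 * sum(abs(e) / abs(a) for e, a in zip(errors, actual)) / n
    mse = sum(e ** 2 for e in errors) / n
    rmse = math.sqrt(mse)
    mad = sum(abs(e) for e in errors) / n
    return {"MAPE": mape, "MSE": mse, "RMSE": rmse, "MAD": mad}

# Hypothetical actuals and forecasts; the errors are -2, 2, -5, 2.
m = error_measures([100, 110, 120, 130], [102, 108, 125, 128])
```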


Validation The goal of the forecasting exercise is to obtain accurate predictions. In measuring the accuracy of the predictions, the forecaster usually relies on the performance of the model on past information. This is essentially the assumption adopted in the previous section.

A better way to measure how accurately a model predicts is to estimate it using only part of the sample and then validate it on the hold-out sample. In this case the error measures are calculated over the points predicted for the hold-out sample. This provides a more reliable measure of the quality of the forecast for each model.
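A minimal sketch of hold-out validation, reusing the order-3 moving-average forecast idea from above on a hypothetical series, with the last three points held out:

```python
series = [20, 21, 23, 22, 25, 26, 27, 29]   # hypothetical data
holdout = 3                                  # last 3 points reserved for validation

errors = []
for t in range(len(series) - holdout, len(series)):
    forecast = sum(series[t - 3:t]) / 3      # order-3 moving-average forecast
    errors.append(series[t] - forecast)      # error on a hold-out point only

mse_holdout = sum(e ** 2 for e in errors) / holdout
```

Only the hold-out errors enter the measure, so models are compared on data they have not "seen", which is the point of the validation step.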


