Exponential Smoothing Method: A Basic Overview In 3 Points

Introduction

Exponential smoothing is a method for forecasting univariate time series data. It supports data with a systematic trend or a seasonal component and uses past observations to make predictions. It is a popular alternative to the widely used Box-Jenkins ARIMA family of methods.

  1. What Is Exponential Smoothing?
  2. Types of Exponential Smoothing
  3. How to Configure Exponential Smoothing

1. What Is Exponential Smoothing?

The Box-Jenkins ARIMA methods build a model in which the prediction is a weighted linear sum of the most recent observations, or lags. Exponential smoothing is also used to forecast univariate time series, but its model makes predictions using weights that decrease exponentially as the observations get older.

These past observations receive geometrically decreasing weights, and the most recent observation always has the highest weight. Because the weights decay exponentially, the smoothing methods amount to summing exponentially weighted averages of past observations. These methods predate the Box-Jenkins ARIMA class of methods, and the smoothing models are also collectively known as ETS (Error, Trend, Seasonality) models.
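To make the weighting idea concrete, here is a minimal plain-Python sketch of a forecast formed as an exponentially weighted sum of past observations; the alpha value and the toy series are illustrative assumptions, not figures from this article.

    # Minimal sketch: a forecast as an exponentially weighted sum of past observations.
    # The alpha value and the toy series are illustrative assumptions.
    alpha = 0.5
    observations = [12.0, 13.0, 12.5, 14.0, 13.5]  # ordered oldest ... newest

    # Weight for the observation k steps in the past: alpha * (1 - alpha) ** k
    weights = [alpha * (1 - alpha) ** k for k in range(len(observations))]

    # Pair the newest observation with the largest weight and sum the products.
    forecast = sum(w * y for w, y in zip(weights, reversed(observations)))
    print(weights)   # geometrically decaying weights: 0.5, 0.25, 0.125, ...
    print(forecast)  # the weighted-sum forecast for the next step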

2. Types of Exponential Smoothing

ETS models explicitly model the error, trend, and seasonality components of a series, ranging from no systematic structure beyond the level, to a trend, to a trend plus seasonality. The 3 main categories of exponential smoothing forecasting methods for time series are explained below.

  • SES- Single Exponential Smoothing: This exponential smoothing method is used for a single variable without seasonality or trend. Its single parameter/hyperparameter is the smoothing coefficient or smoothing factor for the level, denoted by (a) or alpha, which controls the rate at which the influence of past observations decays exponentially. It is set to a value between zero and one, where larger values mean the predictive model focuses on the most recent observations and smaller values give more weight to the longer history. Learning is fastest when alpha is near one and slowest when it is near zero.
  • DES- Double Exponential Smoothing: This exponential smoothing method adds trend support to the SES model. The hyperparameters here are the smoothing factor for the level, denoted by (a) or alpha, and an additional factor that controls smoothing and decay of changes in the trend, denoted by (b) or beta. The trend can be handled additively when it is linear or multiplicatively when it is exponential. The additive model was developed by Charles Holt and is also known as Holt’s linear trend model. When forecasting over many steps, the trend tends to become unrealistic, so trend damping is often applied in time-series models (a minimal sketch of these updates follows this list). The 5 types of hyperparameters here are
      • Level smoothing factor Alpha
      • Trend smoothing factor Beta
      • Multiplicative or Additive Trend Type
      • Multiplicative or Additive Dampening Type
      • Damping coefficient Phi
  • TES- Triple Exponential Smoothing: This exponential smoothing method adds a seasonality component for the univariate time series and is called Holt-Winters Exponential Smoothing, after Charles Holt and Peter Winters. The seasonality coefficient, (g) or gamma, controls the seasonal component, which can be handled additively for linear seasonal changes or multiplicatively for exponential ones. The same framework also covers the SES and DES models as special cases. Note that the number of time steps in a seasonal period must be specified when working with seasonal models. Its hyperparameters are, in addition to the DES parameters, the seasonal period length and the seasonality type.
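As a rough illustration of the DES hyperparameters listed above, the following sketch applies Holt’s additive-trend updates with a damping coefficient; the alpha, beta, and phi values and the toy series are illustrative assumptions rather than anything prescribed by the method.

    def holt_damped_additive(series, alpha, beta, phi, horizon):
        """Sketch of double exponential smoothing with an additive, damped trend."""
        level, trend = series[0], series[1] - series[0]   # simple initialization
        for y in series[2:]:
            prev_level = level
            level = alpha * y + (1 - alpha) * (prev_level + phi * trend)
            trend = beta * (level - prev_level) + (1 - beta) * phi * trend
        # h-step-ahead forecast: current level plus the damped, accumulated trend
        return [level + sum(phi ** i for i in range(1, h + 1)) * trend
                for h in range(1, horizon + 1)]

    # Illustrative values only: alpha and beta smooth the level and trend,
    # while phi < 1 flattens the trend over longer horizons.
    print(holt_damped_additive([10, 12, 13, 15, 16, 18],
                               alpha=0.8, beta=0.2, phi=0.9, horizon=3))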

3. How to Configure Exponential Smoothing

Exponential smoothing configuration can be quite complex and requires the model’s hyperparameters to be explicitly specified. In practice, the approach that has evolved as best is to use numerical optimization to search for the model’s smoothing coefficients (up to 4 of them: alpha, beta, gamma, and phi) and find the values that result in the lowest error. The trend and seasonality types should still be stated explicitly for a robust model; the coefficients are then easily estimated in Python by minimizing the SSE (sum of squared errors).
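As a sketch of this idea (and not of the exact optimizer Statsmodels uses), a simple grid search over alpha can pick the value that minimizes the in-sample SSE; the toy series and the coarse grid below are illustrative assumptions.

    # Sketch: choose alpha by minimizing the in-sample sum of squared errors (SSE).
    # The toy series and the coarse grid of alpha values are illustrative assumptions.
    series = [30.0, 32.0, 31.0, 34.0, 33.0, 35.0, 36.0]

    def sse_for_alpha(series, alpha):
        level = series[0]                    # initialize the level with the first value
        sse = 0.0
        for y in series[1:]:
            sse += (y - level) ** 2          # one-step-ahead forecast error
            level = alpha * y + (1 - alpha) * level
        return sse

    best_alpha = min((a / 100 for a in range(1, 100)),
                     key=lambda a: sse_for_alpha(series, a))
    print(best_alpha)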

  • Exponential Smoothing in Python:

An Exponential Smoothing implementation in Python is available in the Statsmodels library. The methods are described by Rob Hyndman and George Athanasopoulos in their 2013 book Forecasting: Principles and Practice. An R implementation is also available in the “forecast” package.

  • Single Exponential Smoothing

Simple exponential smoothing in Python uses the Statsmodels SimpleExpSmoothing class. The class is instantiated with the training data, and the fit() function is then called to configure the model. The smoothing_level argument specifies the alpha coefficient; if it is left unspecified (None), the model optimizes it automatically. fit() returns a HoltWintersResults instance containing the learned coefficient values. Call the predict() or forecast() function on that object to produce a forecast.
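A minimal sketch of that workflow is shown below; the toy series and the alpha value are illustrative assumptions, and defaults may differ slightly between Statsmodels versions.

    from statsmodels.tsa.holtwinters import SimpleExpSmoothing

    data = [15.0, 14.0, 16.0, 17.0, 16.5, 18.0, 17.5]   # toy univariate series

    model = SimpleExpSmoothing(data)              # instantiate with the training data
    fit = model.fit(smoothing_level=0.3,          # fix alpha explicitly ...
                    optimized=False)
    # fit = model.fit()                           # ... or omit it and let Statsmodels optimize

    print(fit.params["smoothing_level"])          # the coefficient used by the fit
    print(fit.forecast(3))                        # 3-step-ahead forecast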

  • Double and Triple Exponential Smoothing

Single, Double and Triple exponential smoothing methods can be implemented in Python using the Statsmodels ExponentialSmoothing class. Beyond the steps in the single exponential smoothing example, the additional work is configuring the following parameters:

  • trend: The trend component, set to additive (“add”) or multiplicative (“mul”), or disabled with the None value.
  • damped: Set to True to damp the trend component, or False for an undamped trend.
  • seasonal: The seasonal component, set to additive (“add”) or multiplicative (“mul”), or disabled with the None value.
  • seasonal_periods: The number of time steps in a seasonal period, e.g. 12 for monthly data with a yearly cycle.

The model is then fit to the training data, and the coefficients can be specified in fit(): alpha (smoothing_level), beta (smoothing_slope), gamma (smoothing_seasonal), and phi (damping_slope). If a coefficient is left as None, it is learned automatically during fitting. fit() can also transform the series via use_boxcox, with lambda as the transform coefficient; transformation here means the raw data is prepared or rescaled according to the specified coefficient before fitting. fit() returns a HoltWintersResults instance containing the learned coefficient values. Call the predict() or forecast() function on that object to produce a forecast.
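A minimal sketch of a triple exponential smoothing (Holt-Winters) fit along these lines is shown below; the toy monthly-style series is an illustrative assumption, and note that recent Statsmodels versions name the damping argument damped_trend rather than damped.

    from statsmodels.tsa.holtwinters import ExponentialSmoothing

    # Toy series covering three 12-step seasonal cycles (illustrative values only).
    data = [112, 118, 132, 129, 121, 135, 148, 148, 136, 119, 104, 118,
            115, 126, 141, 135, 125, 149, 170, 170, 158, 133, 114, 140,
            145, 150, 178, 163, 172, 178, 199, 199, 184, 162, 146, 166]

    model = ExponentialSmoothing(
        data,
        trend="add",            # additive trend component
        damped_trend=True,      # damp the trend ("damped" in older Statsmodels versions)
        seasonal="add",         # additive seasonal component
        seasonal_periods=12,    # number of time steps in one seasonal cycle
    )
    fit = model.fit()           # coefficients left unspecified are optimized automatically
    print(fit.params)           # learned smoothing_level, smoothing_trend, etc.
    print(fit.forecast(6))      # 6-step-ahead forecast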

Conclusion

Thus we see how exponential smoothing is a very effective tool for forecasting univariate time series data. The various exponential smoothing methods described above, together with their Python and R implementations, make it straightforward to produce forecasts automatically once the method’s coefficients or hyperparameters are properly specified.

There are no right or wrong ways of learning AI and ML technologies – the more, the better! These valuable resources can be the starting point for your journey on how to learn Artificial Intelligence and Machine Learning. Does pursuing AI and ML interest you? If you want to step into the world of emerging tech, you can accelerate your career with these Machine Learning and AI courses by Jigsaw Academy.
