ML Methods
by 吳俊逸, 2020-05-07
Ref: https://thenewstack.io/when-holt-winters-is-better-than-machine-learning/

Machine Learning (ML) gets a lot of hype, but its classical predecessors are still immensely powerful, especially in the time-series space. Error, Trend, Seasonality Forecast (ETS), Autoregressive Integrated Moving Average (ARIMA) and Holt-Winters are three classical methods that are not only incredibly popular but are also excellent time-series predictors.

In fact, according to Statistical and Machine Learning forecasting methods: Concerns and ways forward, ETS outperforms several ML methods, including Long Short-Term Memory (LSTM) networks and Recurrent Neural Networks (RNNs), in one-step forecasting. Indeed, every statistical method in the study has a lower prediction error than the ML methods do.

[Figure: a bar chart comparing errors for one-step forecasts. Taken from "Statistical and Machine Learning forecasting methods: Concerns and ways forward."]

Anais Dotis-Georgiou
Anais is a developer advocate for InfluxData with a passion for making data beautiful with the use of data analytics, AI and machine learning. She takes the data that she collects and does a mix of research, exploration and engineering to translate it into something of function, value and beauty. When she is not behind a screen, you can find her outside drawing, stretching or chasing after a soccer ball.

My hope is that after finishing this three-part blog post series, you'll have a strong conceptual and mathematical understanding of how Holt-Winters works. I focus on Holt-Winters for three reasons. First, Holt-Winters, or Triple Exponential Smoothing, is a sibling of ETS. If you understand Holt-Winters, then you will easily be able to understand the most powerful prediction method for time-series data (among the methods above). Second, you can use Holt-Winters out of the box with InfluxDB. Finally, the InfluxData community has requested an explanation of Holt-Winters in GitHub issue 459. Luckily for us, Holt-Winters is fairly simple, and applying it with InfluxDB is even easier.

In this post, I’ll show you:

  1. When to use Holt-Winters.
  2. How Single Exponential Smoothing works.
  3. A conceptual overview of optimization for Single Exponential Smoothing.
  4. Extra: The Proof for Optimization of Residual Sum of Squares (RSS) for Linear Regression.
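
The core of Single Exponential Smoothing is one recursion: each smoothed value is a weighted average of the newest observation and the previous smoothed value, s_t = α·x_t + (1 − α)·s_{t−1}. Here is a minimal Python sketch; the function name and the choice to seed with the first observation are my own (implementations vary in how they initialize):

```python
def single_exponential_smoothing(series, alpha):
    """Smooth a series: each output is a weighted blend of the newest
    observation (weight alpha) and the previous smoothed value."""
    smoothed = [series[0]]  # seed with the first observation
    for x in series[1:]:
        smoothed.append(alpha * x + (1 - alpha) * smoothed[-1])
    return smoothed

data = [3.0, 10.0, 12.0, 13.0, 12.0, 10.0, 12.0]
print(single_exponential_smoothing(data, alpha=0.5))
```

A larger alpha weights recent observations more heavily, so the smoothed curve tracks the raw data more closely; a smaller alpha smooths more aggressively.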

In part two, I’ll show you:

  1. How Single Exponential Smoothing relates to Triple Exponential Smoothing/Holt-Winters.
  2. How RSS relates to Root Mean Square Error (RMSE).
  3. How RMSE is optimized for Holt-Winters using the Nelder-Mead method.
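
To make the optimization idea concrete ahead of part two: the one-step RMSE of Single Exponential Smoothing can be computed directly, and the smoothing parameter chosen to minimize it. The sketch below substitutes a plain grid search for the Nelder-Mead method the series covers; the function names and the grid granularity are illustrative assumptions:

```python
import math

def ses_one_step_rmse(series, alpha):
    """RMSE of one-step-ahead forecasts: the forecast for each point
    is the smoothed level built from all earlier points."""
    level = series[0]
    squared_error = 0.0
    for x in series[1:]:
        squared_error += (x - level) ** 2
        level = alpha * x + (1 - alpha) * level
    return math.sqrt(squared_error / (len(series) - 1))

def best_alpha(series, steps=99):
    """Pick the alpha in (0, 1) that minimizes one-step RMSE via a
    simple grid search (a stand-in for a real optimizer)."""
    candidates = [i / (steps + 1) for i in range(1, steps + 1)]
    return min(candidates, key=lambda a: ses_one_step_rmse(series, a))
```

The grid search makes the objective easy to see, but it scales poorly as parameters are added; that is why Holt-Winters, with three parameters, calls for a derivative-free optimizer such as Nelder-Mead instead.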

In part three, I’ll show you:

  1. How you can use InfluxDB’s built-in Multiplicative Holt-Winters function to generate predictions on your time-series data.
  2. A list of learning resources.

When to Use Holt-Winters

Before you select any prediction method, you need to evaluate the characteristics of your dataset. To determine whether your time-series data is a good candidate for Holt-Winters, make sure that your data:

  • Isn’t stochastic. If your data is random, Holt-Winters isn’t appropriate, but the data may still be a good candidate for Single Exponential Smoothing.
  • Has a trend.
  • Has seasonality. In other words, your data has patterns at regular intervals. For example, if you were monitoring traffic data, you would see a spike in the middle of the day and a decrease in activity during the night. In this case, your seasonal period might be one day.
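
Putting those three characteristics together: Holt-Winters tracks a level, a trend, and one factor per position in the seasonal period. Below is a minimal multiplicative sketch, previewing the method part three applies through InfluxDB; the naive initialization, parameter values, and function name are my own assumptions, not InfluxDB's implementation:

```python
def holt_winters_multiplicative(series, m, alpha, beta, gamma, horizon):
    """Multiplicative Holt-Winters: an additive level + trend, scaled by
    a seasonal factor for each of the m positions in the season."""
    # Naive seeds from the first two seasons (requires len(series) >= 2*m);
    # production implementations use more careful initialization.
    level = sum(series[:m]) / m
    trend = (sum(series[m:2 * m]) - sum(series[:m])) / (m * m)
    seasonal = [series[i] / level for i in range(m)]

    for t in range(m, len(series)):
        x = series[t]
        last_level = level
        # Deseasonalize the observation before updating level and trend.
        level = alpha * (x / seasonal[t % m]) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        seasonal[t % m] = gamma * (x / level) + (1 - gamma) * seasonal[t % m]

    # Forecast h steps ahead: extend the trend, re-apply the seasonal factor.
    n = len(series)
    return [(level + h * trend) * seasonal[(n + h - 1) % m]
            for h in range(1, horizon + 1)]
```

For the traffic-monitoring example above, m would be the number of samples per day, so the daily spike-and-lull pattern is carried forward into every forecast step.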