Neural forecasting at scale

20 Sep 2021 · Philippe Chatigny, Shengrui Wang, Jean-Marc Patenaude, Boris N. Oreshkin

We study the problem of efficiently scaling ensemble-based deep neural networks for multi-step time series (TS) forecasting on a large set of series. Current state-of-the-art deep ensemble models have high memory and computational requirements, hampering their use for forecasting millions of TS in practical scenarios. We propose N-BEATS(P), a global parallel variant of the N-BEATS model designed to allow simultaneous training of multiple univariate TS forecasting models. Our model addresses the practical limitations of related models, reducing training time by half and memory requirements by a factor of 5 while keeping the same level of accuracy across all TS forecasting settings. We report multiple experiments covering the different ways to train our model, and the results demonstrate its capacity to generalize across various forecasting conditions and setups.

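The abstract describes N-BEATS(P) as a global model: a single univariate network trained simultaneously on windows drawn from many series, rather than one model per series. The sketch below illustrates that global-training idea in PyTorch under simplifying assumptions; the generic fully connected blocks, layer widths, and the `sample_windows` batching helper are illustrative stand-ins, not the authors' implementation or the parallel ensemble machinery described in the paper.

```python
# Minimal sketch (not the authors' code) of a global univariate forecaster:
# one N-BEATS-style network trained on windows sampled from many series.
# Block structure, widths, and the sampling scheme are assumptions.
import torch
import torch.nn as nn

class GenericBlock(nn.Module):
    """One generic block: shared MLP -> (backcast, forecast) heads."""
    def __init__(self, backcast_len, forecast_len, hidden=256):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(backcast_len, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.backcast = nn.Linear(hidden, backcast_len)
        self.forecast = nn.Linear(hidden, forecast_len)

    def forward(self, x):
        h = self.mlp(x)
        return self.backcast(h), self.forecast(h)

class GlobalNBeats(nn.Module):
    """Stack of blocks: residual backcasts, summed forecasts."""
    def __init__(self, backcast_len, forecast_len, n_blocks=4):
        super().__init__()
        self.blocks = nn.ModuleList(
            [GenericBlock(backcast_len, forecast_len) for _ in range(n_blocks)]
        )

    def forward(self, x):
        residual, forecast = x, 0.0
        for block in self.blocks:
            backcast, block_forecast = block(residual)
            residual = residual - backcast
            forecast = forecast + block_forecast
        return forecast

def sample_windows(series_list, backcast_len, forecast_len, batch_size):
    """Draw (input, target) windows uniformly across many univariate series."""
    xs, ys = [], []
    for _ in range(batch_size):
        s = series_list[torch.randint(len(series_list), (1,)).item()]
        t = torch.randint(backcast_len, len(s) - forecast_len + 1, (1,)).item()
        xs.append(s[t - backcast_len:t])
        ys.append(s[t:t + forecast_len])
    return torch.stack(xs), torch.stack(ys)

if __name__ == "__main__":
    backcast_len, forecast_len = 24, 6
    # Toy stand-in for a large TS collection: a few synthetic series.
    series = [torch.sin(torch.linspace(0, 20, 200) * f) for f in (0.5, 1.0, 2.0)]
    model = GlobalNBeats(backcast_len, forecast_len)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for step in range(200):
        x, y = sample_windows(series, backcast_len, forecast_len, batch_size=64)
        loss = nn.functional.mse_loss(model(x), y)
        opt.zero_grad()
        loss.backward()
        opt.step()
```

Because every window passes through the same parameters, a single training run amortizes cost over all series, which is the basic lever behind the reported reductions in training time and memory; the paper's parallel ensembling of multiple such models is not shown here.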
