A decoder-only foundation model for time-series forecasting

14 Oct 2023 · Abhimanyu Das, Weihao Kong, Rajat Sen, Yichen Zhou

Motivated by recent advances in large language models for Natural Language Processing (NLP), we design a time-series foundation model for forecasting whose out-of-the-box zero-shot performance on a variety of public datasets comes close to the accuracy of state-of-the-art supervised forecasting models for each individual dataset. Our model is based on pretraining a patched-decoder style attention model on a large time-series corpus, and can work well across different forecasting history lengths, prediction lengths and temporal granularities.
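The core idea described in the abstract is to split the input series into non-overlapping patches, embed each patch as a token, run the tokens through a causal (decoder-only) attention stack, and have each token predict a longer output patch so that long horizons need fewer autoregressive steps. The sketch below illustrates that structure; it is a minimal PyTorch rendition under assumed names and hyperparameters (PatchedDecoderForecaster, input_patch_len, output_patch_len, a single linear patch embedding), not the authors' released TimesFM implementation.

```python
# Minimal sketch of a patched decoder-only forecaster (illustrative only;
# names and sizes are assumptions, not the paper's released code).
import torch
import torch.nn as nn


class PatchedDecoderForecaster(nn.Module):
    def __init__(self, input_patch_len=32, output_patch_len=128,
                 d_model=256, n_heads=4, n_layers=4):
        super().__init__()
        self.input_patch_len = input_patch_len
        self.output_patch_len = output_patch_len
        # Patch embedding: one linear layer here for brevity.
        self.embed = nn.Linear(input_patch_len, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True)
        self.decoder = nn.TransformerEncoder(layer, n_layers)
        # Each decoded token is projected to a longer output patch.
        self.head = nn.Linear(d_model, output_patch_len)

    def forward(self, history):
        # history: (batch, context_len), context_len divisible by input_patch_len.
        b, t = history.shape
        patches = history.view(b, t // self.input_patch_len, self.input_patch_len)
        tokens = self.embed(patches)          # (batch, num_patches, d_model)
        # Causal mask so each patch token attends only to earlier patches.
        mask = nn.Transformer.generate_square_subsequent_mask(tokens.size(1))
        hidden = self.decoder(tokens, mask=mask)
        # The last token's state predicts the next output_patch_len points.
        return self.head(hidden[:, -1])       # (batch, output_patch_len)


# Toy zero-shot-style usage: forecast 128 future points from 512 past points.
model = PatchedDecoderForecaster()
context = torch.randn(8, 512)   # 8 series, 512 past points each
forecast = model(context)       # shape (8, 128)
```

Because every token emits an output patch that is longer than its input patch, varying history lengths and prediction lengths can be handled by changing how many patches are fed in and how many decoding steps are taken, which matches the flexibility the abstract claims.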


Datasets

ETTh1
Results from the Paper


Task | Dataset | Model | Metric | Value | Global Rank
Time Series Forecasting | ETTh1 (336) Multivariate | TimesFM | MAE | 0.436 | #29
