Extrapolatable Transformer Pre-training for Ultra Long Time-Series Forecasting

29 Nov 2023 · Ziyang Song, Qincheng Lu, Hao Xu, David L. Buckeridge, Yue Li

Large-scale pre-trained models (PTMs) such as BERT and GPT have recently achieved great success in the Natural Language Processing and Computer Vision domains. However, the development of PTMs for time-series data lags behind, underscoring the limitations of existing transformer-based architectures, particularly their limited scalability to large-scale data and their difficulty in capturing long-term temporal dependencies. In this study, we present Timely Generative Pre-trained Transformer (TimelyGPT). TimelyGPT employs an extrapolatable position (xPos) embedding to encode trend and periodic patterns into time-series representations. It also integrates recurrent attention and temporal convolution modules to effectively capture global-local temporal dependencies. Our experiments show that TimelyGPT excels in modeling continuously monitored biosignals and irregularly-sampled time series data commonly observed in longitudinal electronic health records (EHRs). In an ultra-long-term forecasting experiment, TimelyGPT achieves accurate extrapolation of body temperature up to 6,000 timesteps ahead during sleep stage transitions, given a short look-up window (i.e., prompt) containing only 2,000 timesteps. We further demonstrate TimelyGPT's forecasting capabilities on a preprocessed longitudinal healthcare administrative database, PopHR, comprising 489,000 patients randomly sampled from the Montreal population. Together, we envision TimelyGPT being useful in a broad spectrum of health domains, including long-term patient health state forecasting and patient risk trajectory prediction.
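The xPos embedding mentioned in the abstract combines a rotary phase term, which encodes periodic structure, with a per-dimension exponential decay that attenuates attention with relative distance; this combination is what allows attention scores to extrapolate beyond the training length. The NumPy sketch below illustrates the generic xPos mechanism (rotary rotation plus position-dependent scaling of queries and keys). It is a minimal illustration under assumed defaults (e.g., gamma = 0.4 and base-10000 rotary frequencies), not the authors' released TimelyGPT implementation.

```python
import numpy as np

def rotary_angles(seq_len, head_dim, base=10000.0):
    # Standard RoPE angles: theta_i = base^(-2i/d), shared within each dimension pair.
    i = np.arange(head_dim // 2)
    theta = base ** (-2.0 * i / head_dim)
    angles = np.arange(seq_len)[:, None] * theta[None, :]   # (T, d/2)
    return np.repeat(angles, 2, axis=-1)                     # (T, d)

def xpos_scale(seq_len, head_dim, gamma=0.4):
    # Per-dimension decay zeta_i = (i/(d/2) + gamma) / (1 + gamma), raised to the
    # position index n. gamma = 0.4 is an assumed default, not taken from the paper.
    i = np.arange(head_dim // 2)
    zeta = (i / (head_dim / 2) + gamma) / (1 + gamma)        # (d/2,)
    scale = zeta[None, :] ** np.arange(seq_len)[:, None]     # (T, d/2)
    return np.repeat(scale, 2, axis=-1)                       # (T, d)

def rotate_half(x):
    # Pairwise (x1, x2) -> (-x2, x1), the rotation helper used by RoPE.
    out = np.empty_like(x)
    out[..., 0::2], out[..., 1::2] = -x[..., 1::2], x[..., 0::2]
    return out

def apply_xpos(q, k):
    # q, k: (T, d). Queries are scaled by zeta^n and keys by zeta^(-m), so a causal
    # attention logit q_n . k_m carries a zeta^(n - m) decay (trend/distance term)
    # on top of the rotary phase difference (periodic term).
    T, d = q.shape
    ang, scale = rotary_angles(T, d), xpos_scale(T, d)
    cos, sin = np.cos(ang), np.sin(ang)
    q_out = (q * cos + rotate_half(q) * sin) * scale
    k_out = (k * cos + rotate_half(k) * sin) / scale
    return q_out, k_out

# Toy usage: logits between distant (query, key) positions shrink in magnitude.
rng = np.random.default_rng(0)
q, k = rng.standard_normal((2, 256, 64))
q_x, k_x = apply_xpos(q, k)
logits = np.tril(q_x @ k_x.T)   # causal mask: only keys at m <= n are used
```

Because the per-position scales cancel into a single zeta^(n - m) factor on each causal logit, far-apart timesteps contribute progressively less, while the rotary phase preserves the periodic component of the signal; this is the property that supports extrapolation well beyond the look-up window.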
