1 code implementation • 10 Dec 2023 • Li Shen, Yuning Wei, Yangzhu Wang, Hongguang Li
With the development of Internet of Things (IoT) systems, precise long-term forecasting methods are essential for decision makers to evaluate current statuses and formulate future policies.
1 code implementation • 17 Jul 2023 • Li Shen, Yuning Wei, Yangzhu Wang
It decouples the prediction process of TSFT into two stages, an Auto-Regression stage and a Self-Regression stage, to tackle the problem of differing statistical properties between input and prediction sequences. The prediction results of the Auto-Regression stage serve as a Good Beginning, i.e., a better initialization for the inputs of the Self-Regression stage.
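The two-stage decoupling described above can be sketched roughly as follows. This is a minimal illustration of the control flow only: the stage functions here are hypothetical stand-ins (a naive last-value roll-forward and a simple blending refinement), not the paper's actual Transformer stages.

```python
import numpy as np

def auto_regression_stage(x, horizon):
    # Stage 1 (stand-in): roll the last observed value forward to
    # produce a rough "Good Beginning" for the prediction window.
    return np.full(horizon, x[-1])

def self_regression_stage(init, x):
    # Stage 2 (stand-in): refine the stage-1 initialization, here by
    # blending it with the input sequence's mean. The real method uses
    # a learned model; this only illustrates "refine from a better init".
    return 0.5 * init + 0.5 * x.mean()

x = np.array([1.0, 2.0, 3.0, 4.0])
good_beginning = auto_regression_stage(x, horizon=3)  # [4., 4., 4.]
forecast = self_regression_stage(good_beginning, x)
print(forecast)
```

The point of the structure is that stage 2 never starts from scratch: it always conditions on a forecast whose statistics already resemble the target window.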
1 code implementation • 19 Jun 2023 • Li Shen, Yuning Wei, Yangzhu Wang, Huaxin Qiu
Moreover, we propose a focal input sequence decomposition method, which decomposes the input sequence in a focal manner for efficient and robust forecasting when facing the Long Sequence Time-series Input (LSTI) problem.
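One plausible reading of a "focal" decomposition is sketched below: keep the most recent history at full resolution and summarize progressively older segments at coarser strides, so a long input is covered with focus on the near past. This interpretation, the `base` parameter, and the doubling stride schedule are all assumptions for illustration, not the paper's exact algorithm.

```python
import numpy as np

def focal_decompose(x, base=8):
    # Walk backwards from the newest samples; each older segment covers
    # more raw points but is subsampled with a doubled stride.
    segments, end, stride = [], len(x), 1
    while end > 0:
        start = max(0, end - base * stride)
        segments.append(x[start:end:stride])  # older => coarser
        end = start
        stride *= 2
    return segments[::-1]  # oldest (coarsest) segment first

x = np.arange(64.0)
parts = focal_decompose(x)
print([len(p) for p in parts])  # 64 raw points summarized by far fewer
```

With this schedule a length-64 input is represented by 25 points, most of them from the recent past, which is the kind of compression that makes very long inputs tractable.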
1 code implementation • 22 Jul 2022 • Li Shen, Yuning Wei, Yangzhu Wang
Thanks to its core idea of respecting time series properties, RTNet shows clearly superior forecasting performance, in whichever forecasting format, compared with dozens of other SOTA time series forecasting baselines on three real-world benchmark datasets.
2 code implementations • 29 Aug 2021 • Li Shen, Yangzhu Wang
To address this issue, we propose the concept of the tightly-coupled convolutional Transformer (TCCT) and three TCCT architectures which apply transformed CNN architectures to the Transformer: (1) CSPAttention: by fusing CSPNet with the self-attention mechanism, the computation cost of self-attention is reduced by 30% and its memory usage by 50%, while achieving equivalent or better prediction accuracy.
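The CSPNet-style split behind CSPAttention can be sketched as: run self-attention on only part of the channels, pass the remaining channels through as a cross-stage bypass, and concatenate. The sketch below is an assumption about the general idea with identity Q/K/V projections for brevity; it is not the paper's implementation.

```python
import numpy as np

def softmax(z, axis=-1):
    e = np.exp(z - z.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x):
    # Minimal scaled dot-product attention; Q = K = V = x here,
    # so no learned projections clutter the sketch.
    scores = softmax(x @ x.T / np.sqrt(x.shape[-1]))
    return scores @ x

def csp_attention(x):
    half = x.shape[-1] // 2
    attended = self_attention(x[:, :half])  # attention on half the channels
    bypass = x[:, half:]                    # cross-stage partial: untouched
    return np.concatenate([attended, bypass], axis=-1)

x = np.random.randn(16, 32)  # (sequence length, channels)
y = csp_attention(x)
print(y.shape)  # (16, 32)
```

Because attention runs on half the channel width, its quadratic score computation and intermediate buffers shrink accordingly, which is consistent with the reported compute and memory savings.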