Modeling Variable Space with Residual Tensor Networks for Multivariate Time Series

29 Sep 2021 · Jing Zhang, Peng Zhang, Yupeng He, Siwei Rao, Jun Wang, Guangjian Tian

Multivariate time series underpin many valuable real-world applications, whose basic premise is that the variables are interdependent. However, the relationships between variables in the latent space are dynamic and complex, and the size of that space grows exponentially with the length of the time window. To fully exploit the dependencies in the variable space, we propose Modeling Variable Space with Residual Tensor Networks (MVSRTN) for multivariate time series. In this framework, we derive a mathematical representation of the variable space and then model it with a tensor network based on low-rank approximation. The tensor components are shared to ensure the translation invariance of the network. To improve the ability to model long sequences, we propose an N-order residual connection and couple it to the space-approximating tensor network. Moreover, a series-variable encoder is designed to improve the quality of the variable space, and a skip-connection layer propagates information such as scale. Experimental results verify the effectiveness of our proposed method on four multivariate time series forecasting benchmark datasets.
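
The abstract does not give implementation details, so the following is only a minimal PyTorch sketch of the core idea as described: an MPS/tensor-train-style network whose 3-way core tensor is shared across time steps (giving translation invariance) and coupled to a first-order residual connection (the N = 1 case of an N-order residual). The class name, dimensions, boundary vector, and readout layer are illustrative assumptions, not the authors' implementation.

    import torch
    import torch.nn as nn

    class SharedCoreTensorNetwork(nn.Module):
        """Low-rank (MPS-style) approximation of the exponentially large
        variable space of a length-T window. A single 3-way core tensor
        (bond x feature x bond) is shared across all time steps."""

        def __init__(self, in_dim, bond_dim, out_dim):
            super().__init__()
            self.core = nn.Parameter(torch.randn(bond_dim, in_dim, bond_dim) * 0.1)
            self.left = nn.Parameter(torch.randn(bond_dim) * 0.1)  # boundary vector
            self.readout = nn.Linear(bond_dim, out_dim)

        def forward(self, x):
            # x: (batch, T, in_dim) encoded series-variable features
            batch, seq_len, _ = x.shape
            h = self.left.expand(batch, -1)  # (batch, bond_dim)
            for t in range(seq_len):
                # contract the shared core with the step-t feature vector
                step = torch.einsum('bd,dfe,bf->be', h, self.core, x[:, t])
                h = h + step  # residual coupling to the contracted state
            return self.readout(h)

    # Usage: a window of T=24 steps with 8 encoded features per step.
    net = SharedCoreTensorNetwork(in_dim=8, bond_dim=16, out_dim=1)
    y = net(torch.randn(32, 24, 8))  # -> (32, 1) forecast per series in the batch

Sharing the core across time steps is what the abstract calls translation invariance; without the residual term, repeated contraction with the same core tends to vanish or explode over long windows, which is the motivation given for coupling a residual connection to the tensor network.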
