Multi-variable LSTM neural network for autoregressive exogenous model

In this paper, we propose a multi-variable LSTM capable of accurate forecasting and variable-importance interpretation for time series with exogenous variables. Current attention mechanisms in recurrent neural networks mostly focus on the temporal aspect of the data and fall short of characterizing variable importance. To this end, we develop a multi-variable LSTM equipped with tensorized hidden states, which learns a separate hidden state for each input variable and gives rise to our mixture attention over both the temporal and variable dimensions. Based on this attention mechanism, we infer and quantify variable importance. Extensive experiments on real datasets with Granger-causality tests and on a synthetic dataset with ground truth demonstrate the prediction performance and interpretability of the multi-variable LSTM in comparison with a variety of baselines. These results show the prospect of the multi-variable LSTM as an end-to-end framework for both forecasting and knowledge discovery.
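The core idea sketched in the abstract — per-variable hidden states plus mixture attention over time and variables — can be illustrated with a minimal NumPy sketch. All dimensions, scoring vectors, and the random stand-in for the encoder's hidden states below are hypothetical illustrations, not the paper's actual parameterization:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: N input variables, T time steps, d hidden units per
# variable. The "tensorized" hidden state at each step is then an N x d
# matrix, one row per variable.
N, T, d = 3, 5, 4

# Stand-in for per-variable hidden states from a multi-variable LSTM
# encoder, shape (T, N, d). In the real model these come from an LSTM
# whose gates are factored so that variables do not share hidden units.
H = rng.standard_normal((T, N, d))

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

# Temporal attention: one score per (time step, variable), using a
# hypothetical per-variable scoring vector w_t.
w_t = rng.standard_normal((N, d))
temporal_scores = np.einsum('tnd,nd->tn', H, w_t)  # (T, N)
alpha = softmax(temporal_scores, axis=0)           # sums to 1 over time

# Per-variable context: attention-weighted sum over time, shape (N, d).
context = np.einsum('tn,tnd->nd', alpha, H)

# Variable attention: mixture weights over the N variables, from a
# hypothetical scoring vector w_v. beta doubles as a variable-importance
# readout, which is what enables interpretation.
w_v = rng.standard_normal(d)
beta = softmax(context @ w_v)  # (N,), sums to 1

# Mixture prediction: each variable's context yields a component
# prediction; beta mixes the components into the final forecast.
W_out = rng.standard_normal((N, d))
component_preds = np.einsum('nd,nd->n', context, W_out)
y_hat = float(beta @ component_preds)
```

Ranking the entries of `beta` (averaged over a dataset) is, in spirit, how variable importance would be read off such a model; the paper's exact attention parameterization and training objective differ from this toy version.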
