Multi-Time Attention Networks for Irregularly Sampled Time Series

25 Jan 2021 · Satya Narayan Shukla, Benjamin M. Marlin

Irregular sampling occurs in many time series modeling applications, where it presents a significant challenge to standard deep learning models. This work is motivated by the analysis of physiological time series data in electronic health records, which are sparse, irregularly sampled, and multivariate. In this paper, we propose a new deep learning framework for this setting that we call Multi-Time Attention Networks. Multi-Time Attention Networks learn an embedding of continuous-time values and use an attention mechanism to produce a fixed-length representation of a time series containing a variable number of observations. We investigate the performance of this framework on interpolation and classification tasks using multiple datasets. Our results show that the proposed approach performs as well as or better than a range of baseline and recently proposed models, while offering significantly faster training times than current state-of-the-art methods.
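The sketch below illustrates the idea described in the abstract: scalar time values are mapped to learned embeddings, and an attention mechanism uses those embeddings to interpolate irregularly observed values onto a fixed set of reference time points, yielding a fixed-length representation. This is a minimal, hypothetical PyTorch rendering under our own assumptions about shapes and layer names (`TimeEmbedding`, `MultiTimeAttention`, `d_time`, `ref_times`, etc.), not the authors' released code.

```python
# Hedged sketch of a multi-time attention module. All names and dimensions
# are illustrative assumptions, not the paper's official implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TimeEmbedding(nn.Module):
    """Embed scalar time values: one linear term plus learned sinusoidal features."""
    def __init__(self, d_time: int):
        super().__init__()
        self.linear = nn.Linear(1, 1)
        self.periodic = nn.Linear(1, d_time - 1)

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        # t: (batch, num_times) -> (batch, num_times, d_time)
        t = t.unsqueeze(-1)
        return torch.cat([self.linear(t), torch.sin(self.periodic(t))], dim=-1)


class MultiTimeAttention(nn.Module):
    """Attend from fixed reference times to irregular observation times,
    producing a fixed-length representation at each reference time."""
    def __init__(self, d_time: int, num_features: int, d_out: int):
        super().__init__()
        self.query_emb = TimeEmbedding(d_time)
        self.key_emb = TimeEmbedding(d_time)
        self.out = nn.Linear(num_features, d_out)

    def forward(self, ref_times, obs_times, obs_values, obs_mask):
        # ref_times: (batch, R); obs_times: (batch, T)
        # obs_values, obs_mask: (batch, T, num_features)
        q = self.query_emb(ref_times)                  # (batch, R, d_time)
        k = self.key_emb(obs_times)                    # (batch, T, d_time)
        scores = torch.matmul(q, k.transpose(-2, -1))  # (batch, R, T)
        scores = scores / q.size(-1) ** 0.5
        # Broadcast scores per feature and mask out missing observations.
        scores = scores.unsqueeze(-1).expand(-1, -1, -1, obs_mask.size(-1))
        scores = scores.masked_fill(obs_mask.unsqueeze(1) == 0, float('-inf'))
        attn = F.softmax(scores, dim=2)                # weights over obs times
        attn = torch.nan_to_num(attn)                  # features with no observations
        interp = (attn * obs_values.unsqueeze(1)).sum(dim=2)  # (batch, R, num_features)
        return self.out(interp)                        # (batch, R, d_out)
```

In an encoder-style use of such a module, the fixed-length outputs at the reference time points could be fed to a standard sequence model or classifier head; the details of that downstream model are not shown here.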

| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|---|---|---|---|---|---|
| Time Series Classification | PhysioNet Challenge 2012 | mTAND-Full | AUC | 85.8% | #5 |
| Time Series Classification | PhysioNet Challenge 2012 | mTAND-Enc | AUC | 85.4% | #6 |

Methods

No methods listed for this paper.