Insights into LSTM Fully Convolutional Networks for Time Series Classification

Long Short Term Memory Fully Convolutional Neural Networks (LSTM-FCN) and Attention LSTM-FCN (ALSTM-FCN) have been shown to achieve state-of-the-art performance on the task of classifying time series signals from the old University of California, Riverside (UCR) time series repository. However, there has been no study on why LSTM-FCN and ALSTM-FCN perform well...

Methods used in the Paper


METHOD | TYPE
Max Pooling | Pooling Operations
Sigmoid Activation | Activation Functions
Tanh Activation | Activation Functions
Convolution | Convolutions
Concatenated Skip Connection | Skip Connections
Batch Normalization | Normalization
ReLU | Activation Functions
GRU | Recurrent Neural Networks
Dense Block | Image Model Blocks
FCN | Semantic Segmentation Models
LSTM | Recurrent Neural Networks
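
The method tags above are extracted automatically, so not every entry (e.g., GRU, Dense Block, Max Pooling) necessarily appears in LSTM-FCN itself. For orientation, the sketch below shows how the core listed components (1D convolutions, batch normalization, ReLU activations, global pooling, an LSTM branch, and branch concatenation) typically fit together in an LSTM-FCN-style classifier. This is a minimal Keras sketch, not the authors' implementation: the input shape, number of classes, layer widths, kernel sizes, and dropout rate are illustrative assumptions, and the dimension-shuffle step used in the original models is omitted for brevity.

```python
# Minimal LSTM-FCN-style classifier sketch (illustrative, not the paper's code).
import tensorflow as tf
from tensorflow.keras import layers, models

def build_lstm_fcn(seq_len=128, n_classes=10, lstm_units=8):
    inp = layers.Input(shape=(seq_len, 1))

    # LSTM branch: the series is passed through an LSTM with heavy dropout.
    # (The original models apply a dimension shuffle before the LSTM; omitted here.)
    x_lstm = layers.LSTM(lstm_units)(inp)
    x_lstm = layers.Dropout(0.8)(x_lstm)

    # FCN branch: stacked 1D convolutions, each followed by batch normalization
    # and a ReLU activation, then global average pooling over time.
    x_fcn = inp
    for filters, kernel in [(128, 8), (256, 5), (128, 3)]:
        x_fcn = layers.Conv1D(filters, kernel, padding="same")(x_fcn)
        x_fcn = layers.BatchNormalization()(x_fcn)
        x_fcn = layers.Activation("relu")(x_fcn)
    x_fcn = layers.GlobalAveragePooling1D()(x_fcn)

    # The two branch outputs are concatenated and classified with softmax.
    out = layers.Concatenate()([x_lstm, x_fcn])
    out = layers.Dense(n_classes, activation="softmax")(out)
    return models.Model(inp, out)

model = build_lstm_fcn()
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
```

The ALSTM-FCN variant differs only in the recurrent branch, where the plain LSTM is replaced by an attention-augmented LSTM; the convolutional branch and the final concatenation are unchanged.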