Long Short-Term Memory

An LSTM is a type of recurrent neural network that addresses the vanishing gradient problem of vanilla RNNs by adding a memory cell together with input, output, and forget gates. Intuitively, the cell state is updated additively, with the forget gate controlling how much of the previous state is retained, so gradients can flow through long sequences without vanishing as quickly.

Introduced by Hochreiter and Schmidhuber (1997).
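
To make the gating mechanism concrete, here is a minimal sketch of a single LSTM step in NumPy. The function name lstm_step, the stacked weight matrix W, the bias vector b, and the gate ordering are illustrative assumptions, not a reference implementation from any particular library.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    # x:      input at the current time step, shape (input_size,)
    # h_prev: previous hidden state, shape (hidden_size,)
    # c_prev: previous cell state, shape (hidden_size,)
    # W:      stacked gate weights, shape (4 * hidden_size, input_size + hidden_size)
    # b:      stacked gate biases, shape (4 * hidden_size,)
    H = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    i = sigmoid(z[0 * H:1 * H])   # input gate: how much new information enters the cell
    f = sigmoid(z[1 * H:2 * H])   # forget gate: how much of the old cell state is kept
    o = sigmoid(z[2 * H:3 * H])   # output gate: how much of the cell is exposed
    g = np.tanh(z[3 * H:4 * H])   # candidate cell update
    c = f * c_prev + i * g        # additive cell-state update
    h = o * np.tanh(c)            # new hidden state
    return h, c

The additive update c = f * c_prev + i * g is the key to the gradient behaviour: as long as the forget gate stays close to one, gradients can pass through many time steps largely unchanged.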

Tasks


Task                      Papers   Share
Time Series                   63   9.72%
Language Modelling            24   3.70%
Speech Recognition            19   2.93%
Sentiment Analysis            18   2.78%
Text Classification           15   2.31%
Machine Translation           15   2.31%
Time Series Forecasting       14   2.16%
Text Generation               12   1.85%
Anomaly Detection             11   1.70%

Categories

Recurrent Neural Networks