Recurrent Neural Networks

A Gated Recurrent Unit, or GRU, is a type of recurrent neural network. It is similar to an LSTM, but it has only two gates - a reset gate and an update gate - and notably lacks an output gate. With fewer parameters, GRUs are generally faster and easier to train than comparable LSTMs.
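As a sketch of the two-gate mechanism, one GRU time step can be written out directly. This is a toy NumPy implementation for illustration only; the parameter names, sizes, and random initialization are assumptions, not taken from the source paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h_prev, params):
    """One GRU step: reset gate r, update gate z, candidate state h_tilde.

    Follows the convention h_t = z * h_prev + (1 - z) * h_tilde,
    so z close to 1 means "keep the old state".
    """
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(Wz @ x + Uz @ h_prev + bz)              # update gate
    r = sigmoid(Wr @ x + Ur @ h_prev + br)              # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev) + bh)  # candidate state
    return z * h_prev + (1.0 - z) * h_tilde             # new hidden state

# Toy sizes (illustrative): input dim 3, hidden dim 4.
rng = np.random.default_rng(0)
n_in, n_h = 3, 4
params = (
    rng.standard_normal((n_h, n_in)), rng.standard_normal((n_h, n_h)), np.zeros(n_h),
    rng.standard_normal((n_h, n_in)), rng.standard_normal((n_h, n_h)), np.zeros(n_h),
    rng.standard_normal((n_h, n_in)), rng.standard_normal((n_h, n_h)), np.zeros(n_h),
)

# Run the cell over a length-5 random sequence, carrying the hidden state.
h = np.zeros(n_h)
for x in rng.standard_normal((5, n_in)):
    h = gru_cell(x, h, params)
```

Because each new state is a convex combination (weighted by z) of the previous state and a tanh-bounded candidate, every hidden unit stays in (-1, 1); there is no separate cell state or output gate as in an LSTM.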


Source: Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation (Cho et al., 2014)




Task                     Papers  Share
Time Series Analysis     49      6.07%
Decoder                  41      5.08%
Speech Synthesis         39      4.83%
Language Modelling       25      3.10%
Sentence                 25      3.10%
Time Series Forecasting  19      2.35%
Sentiment Analysis       19      2.35%
General Classification   19      2.35%
Classification           18      2.23%

