Gated Recurrent Unit

A Gated Recurrent Unit, or GRU, is a type of recurrent neural network. It is similar to an LSTM, but has only two gates - a reset gate and an update gate - and notably lacks an output gate. With fewer parameters, GRUs are generally easier and faster to train than their LSTM counterparts.
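To make the gating concrete, here is a minimal NumPy sketch of a single GRU step. The weight names (`W_*`, `U_*`, `b_*`), the dictionary layout, and the toy dimensions are illustrative choices, not part of any particular library's API; it follows the common convention where the update gate interpolates between the previous hidden state and the candidate state.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, params):
    """One GRU step: returns the new hidden state h_t.

    x: input vector (input_size,); h_prev: previous hidden state (hidden_size,).
    params: weight matrices W_* (hidden x input), U_* (hidden x hidden), and
    biases b_* (hidden,) for the update (z), reset (r), and candidate (h) paths.
    """
    z = sigmoid(params["W_z"] @ x + params["U_z"] @ h_prev + params["b_z"])  # update gate
    r = sigmoid(params["W_r"] @ x + params["U_r"] @ h_prev + params["b_r"])  # reset gate
    # candidate state: the reset gate decides how much of h_prev to expose
    h_tilde = np.tanh(params["W_h"] @ x + params["U_h"] @ (r * h_prev) + params["b_h"])
    # update gate interpolates between the old state and the candidate
    return (1.0 - z) * h_prev + z * h_tilde

# toy usage: random weights, 3-dim input, 4-dim hidden state
rng = np.random.default_rng(0)
params = {}
for g in ("z", "r", "h"):
    params[f"W_{g}"] = rng.standard_normal((4, 3))
    params[f"U_{g}"] = rng.standard_normal((4, 4))
    params[f"b_{g}"] = np.zeros(4)

h = np.zeros(4)
for t in range(5):
    h = gru_step(rng.standard_normal(3), h, params)
print(h.shape)  # (4,)
```

Because each step is a convex combination of the previous state and a tanh candidate, the hidden state stays bounded in (-1, 1) when initialized at zero, which is part of what makes GRUs stable to train.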


Source: Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation

Tasks


Task Papers Share
Time Series Analysis 41 4.52%
Decoder 36 3.96%
Text to Speech 34 3.74%
Speech Synthesis 33 3.63%
Prediction 26 2.86%
Deep Learning 24 2.64%
Time Series Forecasting 20 2.20%
Language Modelling 20 2.20%
Sentiment Analysis 18 1.98%
