Recurrent Dropout

Introduced by Semeniuta et al. in Recurrent Dropout without Memory Loss

Recurrent Dropout is a regularization method for recurrent neural networks. Rather than dropping parts of the hidden state or the memory cell itself, dropout is applied only to the update vector that is added to the LSTM memory cell (or to the GRU state): c_t = f_t * c_{t-1} + i_t * d(g_t), where d(.) is the dropout function and g_t are the candidate values modulated by the input/update gate. Because the previous cell state flows through unchanged, the network retains its long-term memory while the per-step updates are regularized.

Source: Recurrent Dropout without Memory Loss
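
A minimal PyTorch sketch of one LSTM step with recurrent dropout applied to the candidate update, as described above. The class and variable names are illustrative, not from the paper, and the paper does not prescribe a specific implementation:

```python
import torch
import torch.nn as nn


class RecurrentDropoutLSTMCell(nn.Module):
    """LSTM cell with recurrent dropout on the candidate update g_t.

    Dropout is applied to the update added to the memory cell, not to
    the cell state itself, so stored information is never erased.
    """

    def __init__(self, input_size, hidden_size, dropout=0.25):
        super().__init__()
        self.dropout = dropout
        # Fused projections for the four gates: input (i), forget (f),
        # candidate (g), and output (o).
        self.ih = nn.Linear(input_size, 4 * hidden_size)
        self.hh = nn.Linear(hidden_size, 4 * hidden_size)

    def forward(self, x, state):
        h, c = state
        gates = self.ih(x) + self.hh(h)
        i, f, g, o = gates.chunk(4, dim=-1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        g = torch.tanh(g)
        # Recurrent dropout: drop the update, keep the old cell intact.
        # c_t = f_t * c_{t-1} + i_t * d(g_t)
        g = nn.functional.dropout(g, p=self.dropout, training=self.training)
        c = f * c + i * g
        h = o * torch.tanh(c)
        return h, c


# Example: one step over a batch of 8 inputs.
cell = RecurrentDropoutLSTMCell(input_size=16, hidden_size=32)
x = torch.randn(8, 16)
h = c = torch.zeros(8, 32)
h, c = cell(x, (h, c))
```

Dropout is active only in training mode (`self.training`); at evaluation time the update passes through unscaled, matching standard inverted-dropout behavior.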

Tasks

Task                     Papers   Share
General Classification   1        33.33%
Text Classification      1        33.33%
Language Modelling       1        33.33%

Categories

Regularization