Recurrent Dropout is a regularization method for recurrent neural networks. Dropout is applied to the updates of the LSTM memory cell (or the GRU state) rather than to the hidden state itself, i.e. it masks the candidate values passing through the input/update gate of the LSTM/GRU, so information already stored in the cell is never dropped.
Source: Recurrent Dropout without Memory Loss (Semeniuta et al., 2016)
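As a sketch of the mechanism, the snippet below implements a single LSTM step in NumPy and applies an inverted-dropout mask only to the candidate cell update g_t (the quantity gated by the input gate), while c_{t-1} passes through unmasked. Function and parameter names are illustrative, not from the paper's code, and the sketch samples a fresh mask at every step (the paper also considers per-sequence masks).

```python
# Minimal sketch of one LSTM step with recurrent dropout applied to the
# candidate cell update, as described above. All names are hypothetical.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step_recurrent_dropout(x_t, h_prev, c_prev, W, U, b,
                                p=0.25, training=True, rng=None):
    """One LSTM step; W (D,4H), U (H,4H), b (4H,) pack the i,f,o,g gates."""
    rng = rng or np.random.default_rng()
    H = h_prev.shape[-1]
    z = x_t @ W + h_prev @ U + b          # pre-activations, shape (..., 4H)
    i = sigmoid(z[..., 0*H:1*H])          # input gate
    f = sigmoid(z[..., 1*H:2*H])          # forget gate
    o = sigmoid(z[..., 2*H:3*H])          # output gate
    g = np.tanh(z[..., 3*H:4*H])          # candidate cell update

    if training:
        # Mask only the *update*; c_prev itself is never masked,
        # which is what avoids erasing stored memory.
        mask = (rng.random(g.shape) >= p) / (1.0 - p)  # inverted dropout
        g = g * mask

    c_t = f * c_prev + i * g              # masked update enters the cell
    h_t = o * np.tanh(c_t)
    return h_t, c_t

# Toy usage with hypothetical sizes.
D, H = 8, 16
rng = np.random.default_rng(0)
x = rng.standard_normal((1, D))
h, c = np.zeros((1, H)), np.zeros((1, H))
W = rng.standard_normal((D, 4 * H)) * 0.1
U = rng.standard_normal((H, 4 * H)) * 0.1
b = np.zeros(4 * H)
h, c = lstm_step_recurrent_dropout(x, h, c, W, U, b, p=0.25, rng=rng)
```

At inference time (`training=False`) the mask is skipped; the inverted-dropout scaling by 1/(1-p) during training keeps the expected update unchanged, so no rescaling is needed at test time.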
| Task | Papers | Share |
|---|---|---|
| General Classification | 1 | 25.00% |
| Text Classification | 1 | 25.00% |
| Language Modeling | 1 | 25.00% |
| Language Modelling | 1 | 25.00% |