Regularization

Fraternal Dropout

Introduced by Zolna et al. in Fraternal Dropout

Fraternal Dropout is a regularization method for recurrent neural networks that trains two identical copies of an RNN (sharing the same parameters) with different dropout masks, while minimizing the difference between their (pre-softmax) predictions. This encourages the RNN's representations to be invariant to the particular dropout mask, making them more robust.

Source: Fraternal Dropout
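The idea above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the network, sizes, and the weight `kappa` on the consistency penalty are all hypothetical stand-ins, and the usual task loss (e.g. cross-entropy for each sibling) is omitted so that only the fraternal regularizer is shown.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy shared parameters (hypothetical stand-ins for the RNN weights).
W_in = rng.normal(size=(8, 16))   # input -> hidden
W_out = rng.normal(size=(16, 5))  # hidden -> pre-softmax logits
p_drop = 0.5
kappa = 0.1  # weight of the fraternal (prediction-consistency) penalty

def forward(x, mask):
    """One pass through the shared network with a given dropout mask."""
    h = np.tanh(x @ W_in) * mask / (1.0 - p_drop)  # inverted dropout
    return h @ W_out  # pre-softmax logits

x = rng.normal(size=(4, 8))  # batch of 4 toy inputs

# Two sibling passes: identical parameters, independent dropout masks.
mask1 = (rng.random((4, 16)) > p_drop).astype(float)
mask2 = (rng.random((4, 16)) > p_drop).astype(float)
z1 = forward(x, mask1)
z2 = forward(x, mask2)

# Fraternal penalty: squared difference of the two pre-softmax predictions.
fraternal_loss = np.mean((z1 - z2) ** 2)

# Total training loss would be: task loss of each sibling
# plus kappa * fraternal_loss (task loss omitted in this sketch).
regularizer = kappa * fraternal_loss
```

Because the two passes share every weight, minimizing `fraternal_loss` pushes the network toward predictions that do not depend on which units happen to be dropped.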

Papers


Tasks


Task                Papers  Share
Text Generation     1       33.33%
Image Captioning    1       33.33%
Language Modelling  1       33.33%

Components


Component Type
No components found.

Categories

Regularization