An Exploration of Dropout with RNNs for Natural Language Inference

22 Oct 2018 · Amit Gajbhiye, Sardar Jaf, Noura Al Moubayed, A. Stephen McGough, Steven Bradley

Dropout is a crucial regularization technique for Recurrent Neural Network (RNN) models of Natural Language Inference (NLI). However, the effectiveness of dropout at different layers of an NLI model, and at different dropout rates, has not been systematically evaluated...
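The question the paper studies — where in the model dropout is applied and at what rate — can be illustrated with a minimal inverted-dropout sketch. This is a generic illustration in NumPy, not the paper's implementation; the layer names in the example are hypothetical:

```python
import numpy as np

def dropout(x, rate, training=True, rng=None):
    """Inverted dropout: zero each unit with probability `rate` and
    rescale survivors by 1/(1-rate), so the expected activation is
    unchanged and no rescaling is needed at test time."""
    if not training or rate == 0.0:
        return x
    rng = rng if rng is not None else np.random.default_rng()
    mask = rng.random(x.shape) >= rate  # True = keep the unit
    return x * mask / (1.0 - rate)

# Applying dropout at different points in a (toy) NLI pipeline,
# each with its own rate -- the design choice the paper evaluates.
x = np.ones((4, 8))                    # e.g. a batch of embeddings
h_embed = dropout(x, rate=0.5)         # after the embedding layer
h_recur = dropout(h_embed, rate=0.2)   # on the recurrent outputs
```

At inference (`training=False`) the function is the identity, which is why the keep-time rescaling by `1/(1-rate)` is applied during training.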



