Sentence Embeddings in NLI with Iterative Refinement Encoders

27 Aug 2018 · Aarne Talman, Anssi Yli-Jyrä, Jörg Tiedemann

Sentence-level representations are necessary for various NLP tasks. Recurrent neural networks have proven to be very effective in learning distributed representations and can be trained efficiently on natural language inference tasks...
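The model evaluated below, HBMP, builds its sentence embedding from a hierarchy of BiLSTM and max-pooling layers. The following sketch illustrates that idea in PyTorch; it is not the authors' released implementation, and the class name, dimensions, and three-layer depth are illustrative assumptions. Each BiLSTM layer is initialized with the previous layer's final states, and the max-pooled outputs of all layers are concatenated into a single sentence vector.

```python
# Minimal sketch of a hierarchical BiLSTM max-pooling sentence encoder.
# Names and hyperparameters are illustrative, not the paper's exact setup.
import torch
import torch.nn as nn


class HierarchicalBiLSTMMaxPool(nn.Module):
    def __init__(self, embed_dim: int = 300, hidden_dim: int = 600, num_layers: int = 3):
        super().__init__()
        self.layers = nn.ModuleList(
            [nn.LSTM(embed_dim, hidden_dim, bidirectional=True, batch_first=True)
             for _ in range(num_layers)]
        )

    def forward(self, embedded: torch.Tensor) -> torch.Tensor:
        # embedded: (batch, seq_len, embed_dim) word embeddings
        state = None        # (h, c) handed from one BiLSTM layer to the next
        pooled = []
        for lstm in self.layers:
            outputs, state = lstm(embedded, state)    # (batch, seq_len, 2 * hidden_dim)
            pooled.append(outputs.max(dim=1).values)  # max pooling over time
        # Concatenate the pooled outputs of all layers into one sentence vector
        return torch.cat(pooled, dim=1)               # (batch, num_layers * 2 * hidden_dim)


# Example: encode a batch of two 10-token "sentences" of random embeddings.
encoder = HierarchicalBiLSTMMaxPool()
print(encoder(torch.randn(2, 10, 300)).shape)  # torch.Size([2, 3600])
```

For NLI, such sentence vectors for premise and hypothesis would typically be combined (e.g. concatenation, difference, element-wise product) and fed to a classifier; that step is omitted here.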


Results from the Paper


TASK                         DATASET  MODEL                                              METRIC            VALUE  GLOBAL RANK
Natural Language Inference   SciTail  Hierarchical BiLSTM Max Pooling                    Accuracy          86.0   #4
Natural Language Inference   SNLI     600D Hierarchical BiLSTM with Max Pooling (HBMP)   % Test Accuracy   86.6   #26
Natural Language Inference   SNLI     600D Hierarchical BiLSTM with Max Pooling (HBMP)   % Train Accuracy  89.9   #35
Natural Language Inference   SNLI     600D Hierarchical BiLSTM with Max Pooling (HBMP)   Parameters        22m    #2