Shortcut-Stacked Sentence Encoders for Multi-Domain Inference

WS 2017 · Yixin Nie, Mohit Bansal

We present a simple sequential sentence encoder for multi-domain natural language inference. Our encoder is based on stacked bidirectional LSTM-RNNs with shortcut connections and fine-tuning of word embeddings. The overall supervised model uses this encoder to encode the two input sentences into two vectors, and then applies a classifier to a combination of these vectors to label the relationship between the sentences as entailment, contradiction, or neutral. Our shortcut-stacked sentence encoders achieve strong improvements over existing encoders on matched and mismatched multi-domain natural language inference (top non-ensemble single-model result in the EMNLP RepEval 2017 Shared Task (Nangia et al., 2017)). Moreover, they achieve the new state-of-the-art encoding result on the original SNLI dataset (Bowman et al., 2015).
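A minimal PyTorch sketch of the encoder/classifier structure described above. The layer sizes (`hidden_dims`, `mlp_dim`), the max-pooling over time, and the [u; v; |u − v|; u ⊙ v] feature combination follow common practice for sentence-encoder NLI models and are assumptions here, not necessarily the paper's exact configuration.

```python
import torch
import torch.nn as nn


class ShortcutStackedEncoder(nn.Module):
    """Stacked BiLSTM encoder with shortcut connections: each layer's
    input is the word embeddings concatenated with the outputs of all
    previous layers; the sentence vector is a max over time steps."""

    def __init__(self, vocab_size, emb_dim=300, hidden_dims=(512, 512, 512)):
        super().__init__()
        # Embeddings are fine-tuned during training, as in the paper.
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.layers = nn.ModuleList()
        input_dim = emb_dim
        for h in hidden_dims:  # hidden_dims values are illustrative
            self.layers.append(
                nn.LSTM(input_dim, h, batch_first=True, bidirectional=True)
            )
            # Shortcut: the next layer also sees this layer's output.
            input_dim += 2 * h

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) of word indices
        outputs = [self.embedding(token_ids)]
        for lstm in self.layers:
            out, _ = lstm(torch.cat(outputs, dim=-1))
            outputs.append(out)
        # Row-max pooling over the last layer's hidden states.
        sentence_vec, _ = outputs[-1].max(dim=1)
        return sentence_vec  # (batch, 2 * hidden_dims[-1])


class NLIClassifier(nn.Module):
    """Encodes premise and hypothesis with a shared encoder, then
    classifies their combined representation into three labels."""

    def __init__(self, encoder, sent_dim, mlp_dim=800, num_classes=3):
        super().__init__()
        self.encoder = encoder
        self.mlp = nn.Sequential(
            nn.Linear(4 * sent_dim, mlp_dim),
            nn.ReLU(),
            nn.Linear(mlp_dim, num_classes),
        )

    def forward(self, premise_ids, hypothesis_ids):
        u = self.encoder(premise_ids)
        v = self.encoder(hypothesis_ids)
        # Assumed feature combination: [u; v; |u - v|; u * v].
        features = torch.cat([u, v, torch.abs(u - v), u * v], dim=-1)
        return self.mlp(features)  # logits: entailment / contradiction / neutral
```

A quick usage check with random token ids:

```python
encoder = ShortcutStackedEncoder(vocab_size=30000)
model = NLIClassifier(encoder, sent_dim=2 * 512)
premise = torch.randint(0, 30000, (4, 20))
hypothesis = torch.randint(0, 30000, (4, 15))
logits = model(premise, hypothesis)  # shape: (4, 3)
```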


Datasets

SNLI · MultiNLI

Results from the Paper


| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Natural Language Inference | SNLI | 600D Residual stacked encoders | % Test Accuracy | 86.0 | #62 |
| Natural Language Inference | SNLI | 600D Residual stacked encoders | % Train Accuracy | 91.0 | #40 |
| Natural Language Inference | SNLI | 600D Residual stacked encoders | Parameters | 29m | #4 |
| Natural Language Inference | SNLI | 300D Residual stacked encoders | % Test Accuracy | 85.7 | #67 |
| Natural Language Inference | SNLI | 300D Residual stacked encoders | % Train Accuracy | 89.8 | #47 |
| Natural Language Inference | SNLI | 300D Residual stacked encoders | Parameters | 9.7m | #4 |

Methods


No methods listed for this paper.