Cell-aware Stacked LSTMs for Modeling Sentences

7 Sep 2018 · Jihun Choi, Taeuk Kim, Sang-goo Lee

We propose a method of stacking multiple long short-term memory (LSTM) layers for modeling sentences. In contrast to conventional stacked LSTMs, where only hidden states are fed as input to the next layer, the suggested architecture accepts both the hidden and the memory cell states of the preceding layer and fuses information from the left (previous time step) and the lower (previous layer) context using the soft gating mechanism of LSTMs. The architecture thus modulates the amount of information delivered not only through horizontal recurrence but also through vertical connections, so that useful features extracted in lower layers are effectively conveyed to upper layers. We dub this architecture Cell-aware Stacked LSTM (CAS-LSTM) and show through experiments that our models bring significant performance gains over standard stacked LSTMs on benchmark datasets for natural language inference, paraphrase detection, sentiment classification, and machine translation. We also conduct extensive qualitative analysis to understand the internal behavior of the suggested approach.
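To make the idea concrete, the sketch below shows one plausible PyTorch implementation of an upper-layer cell that, in addition to the usual inputs, receives the lower layer's cell state and gates it into its own cell state. The gate parameterization (a single fused linear layer producing five gates, including an extra "vertical" forget gate) is an illustrative assumption based on the abstract's description, not necessarily the exact equations from the paper.

```python
import torch
import torch.nn as nn


class CASLSTMCell(nn.Module):
    """Illustrative cell-aware stacked LSTM cell for a layer l > 1.

    Besides the lower layer's hidden state and its own recurrent state,
    the cell also consumes the lower layer's cell state c_t^{l-1} and
    softly gates it into its own cell state (assumed parameterization).
    """

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        # Five gates: input, horizontal forget, vertical forget, output, candidate.
        self.linear = nn.Linear(input_size + 2 * hidden_size, 5 * hidden_size)

    def forward(self, x, h_prev, c_prev, c_below):
        # x       : h_t^{l-1}, hidden state from the layer below     (batch, input_size)
        # h_prev  : h_{t-1}^l, previous hidden state of this layer   (batch, hidden_size)
        # c_prev  : c_{t-1}^l, previous cell state of this layer     (batch, hidden_size)
        # c_below : c_t^{l-1}, current cell state of the layer below (batch, hidden_size)
        z = self.linear(torch.cat([x, h_prev, c_below], dim=-1))
        i, f, f_v, o, g = z.chunk(5, dim=-1)
        i, f, f_v, o = map(torch.sigmoid, (i, f, f_v, o))
        g = torch.tanh(g)
        # Both the left (horizontal) and the lower (vertical) contexts are gated,
        # so the cell controls how much lower-layer information is passed upward.
        c = f * c_prev + f_v * c_below + i * g
        h = o * torch.tanh(c)
        return h, c
```

The bidirectional variant reported below (Bi-CAS-LSTM) would run such cells in both directions per layer and concatenate the resulting hidden states, in the usual bidirectional-LSTM fashion.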

| Task | Dataset | Model | Metric | Value | Global Rank |
| --- | --- | --- | --- | --- | --- |
| Paraphrase Identification | Quora Question Pairs | Bi-CAS-LSTM | Accuracy | 88.6 | #14 |
| Natural Language Inference | SNLI | 300D 2-layer Bi-CAS-LSTM | % Test Accuracy | 87 | #47 |
| Sentiment Analysis | SST-2 (binary classification) | Bi-CAS-LSTM | Accuracy | 91.3 | #54 |
| Sentiment Analysis | SST-5 (fine-grained classification) | Bi-CAS-LSTM | Accuracy | 53.6 | #10 |
