Cell-aware Stacked LSTMs for Modeling Sentences

7 Sep 2018 · Jihun Choi, Taeuk Kim, Sang-goo Lee

We propose a method of stacking multiple long short-term memory (LSTM) layers for modeling sentences. In contrast to conventional stacked LSTMs, where only hidden states are fed as input to the next layer, the proposed architecture accepts both the hidden and memory cell states of the preceding layer and fuses information from the left and lower contexts using the soft gating mechanism of LSTMs.
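The fusion described above can be sketched as a single recurrent step. This is a minimal NumPy illustration, not the authors' implementation: it assumes the lower layer's cell state `c_below` is weighted by an extra sigmoid gate alongside the usual forget gate over the left-context cell state; all names and the exact gate layout are illustrative.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cas_lstm_step(x, h_prev, c_prev, c_below, W, b):
    """One step of a cell-aware stacked LSTM layer (illustrative sketch).

    In addition to the standard input/forget/output gates and candidate,
    an extra gate g controls how much of the lower layer's memory cell
    c_below is fused into the new cell state, alongside the left-context
    cell c_prev.
    """
    d = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b  # (5d,) pre-activations
    i = sigmoid(z[0 * d:1 * d])   # input gate
    f = sigmoid(z[1 * d:2 * d])   # forget gate (left context)
    o = sigmoid(z[2 * d:3 * d])   # output gate
    g = sigmoid(z[3 * d:4 * d])   # gate for the lower layer's cell
    u = np.tanh(z[4 * d:5 * d])   # candidate cell content
    c = f * c_prev + g * c_below + i * u  # fuse left and lower context
    h = o * np.tanh(c)
    return h, c
```

A conventional stacked LSTM is recovered by dropping the `g * c_below` term; the extra gate is what lets each layer modulate vertical information flow with the same soft gating used horizontally.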

| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Paraphrase Identification | Quora Question Pairs | Bi-CAS-LSTM | Accuracy | 88.6 | #6 |
| Natural Language Inference | SNLI | 300D 2-layer Bi-CAS-LSTM | % Test Accuracy | 87 | #24 |
| Sentiment Analysis | SST-2 (binary classification) | Bi-CAS-LSTM | Accuracy | 91.3 | #23 |
| Sentiment Analysis | SST-5 (fine-grained classification) | Bi-CAS-LSTM | Accuracy | 53.6 | #6 |
