Multi-Granular Sequence Encoding via Dilated Compositional Units for Reading Comprehension

EMNLP 2018  ·  Yi Tay, Anh Tuan Luu, Siu Cheung Hui

Sequence encoders are crucial components in many neural architectures for learning to read and comprehend. This paper presents a new compositional encoder for reading comprehension (RC). Our proposed encoder is aimed at being not only fast but also expressive. Specifically, the key novelty behind our encoder is that it explicitly models the sequence across multiple granularities using a new dilated composition mechanism. In our approach, gating functions are learned by modeling relationships and reasoning over multi-granular sequence information, enabling compositional learning that is aware of both long- and short-term information. We conduct experiments on three RC datasets, showing that our proposed encoder demonstrates very promising results both as a standalone encoder and as a complementary building block. Empirical results show that simple Bi-Attentive architectures augmented with our proposed encoder not only achieve state-of-the-art / highly competitive results but are also considerably faster than other published works.
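The abstract describes gates computed from multi-granular views of the sequence. Below is a minimal sketch of that idea, not the paper's actual DCU formulation: each "dilation" is approximated here by non-overlapping block-wise mean pooling, and the pooled summaries are fused into a sigmoid gate over the input. All function and parameter names (`dcu_encode`, `dilations`) are hypothetical.

```python
import numpy as np

def dcu_encode(x, dilations=(1, 2, 4), seed=0):
    """Sketch of multi-granular gated composition (assumed, simplified form).

    x: (seq_len, d) input sequence.
    For each rate r, summarize the sequence with block-wise mean pooling
    of window r (coarser granularity as r grows), then fuse the summaries
    into a sigmoid gate applied to the input.
    """
    rng = np.random.default_rng(seed)
    n, d = x.shape
    summaries = []
    for r in dilations:
        # pad so seq_len is divisible by the block size r
        pad = (-n) % r
        xp = np.concatenate([x, np.zeros((pad, d))], axis=0)
        # mean-pool over non-overlapping blocks of size r ...
        pooled = xp.reshape(-1, r, d).mean(axis=1)
        # ... then upsample back to the original sequence length
        summaries.append(np.repeat(pooled, r, axis=0)[:n])
    multi = np.concatenate(summaries, axis=-1)   # (n, d * len(dilations))
    # random projection stands in for a learned gating layer
    W = rng.standard_normal((multi.shape[1], d)) / np.sqrt(multi.shape[1])
    gate = 1.0 / (1.0 + np.exp(-multi @ W))      # sigmoid gate in (0, 1)
    return gate * x                              # gated composition
```

In the paper the gating parameters are learned end-to-end and the unit is combined with an LSTM (DCU-LSTM); the random projection above only illustrates the multi-granularity of the gate's inputs.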


Results from the Paper


Task                            Dataset      Model                    Metric       Value  Global Rank
Question Answering              NarrativeQA  Bi-Attention + DCU-LSTM  BLEU-1       36.55  # 6
                                                                      BLEU-4       19.79  # 6
                                                                      METEOR       17.87  # 6
                                                                      Rouge-L      41.44  # 7
Open-Domain Question Answering  SearchQA     Bi-Attention + DCU-LSTM  Unigram Acc  49.4   # 2
                                                                      N-gram F1    59.5   # 2
                                                                      EM           -      # 10
                                                                      F1           -      # 5

Methods


No methods listed for this paper.