Ruminating Reader: Reasoning with Gated Multi-Hop Attention

WS 2018  ·  Yichen Gong, Samuel R. Bowman

To answer questions in the machine comprehension (MC) task, a model needs to establish interactions between the question and the context. To address the limitation that a single-pass model cannot reflect on and correct its answer, we present Ruminating Reader. Ruminating Reader adds a second pass of attention and a novel information fusion component to the Bi-Directional Attention Flow model (BiDAF). We propose novel layer structures that construct a query-aware context vector representation and fuse the encoding representation with an intermediate representation on top of the BiDAF model. We show that a multi-hop attention mechanism can be applied to a bi-directional attention structure. In experiments on SQuAD, we find that Ruminating Reader outperforms the BiDAF baseline by a substantial margin, and matches or surpasses the performance of all other published systems.
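The fusion component described above can be pictured as a learned gate that interpolates between the original encoding and a query-aware intermediate representation. The sketch below is only an illustration of that general gated-fusion idea; the module name `GatedFusion`, the dimensions, and the exact gating equations are assumptions for readability, not the paper's precise layer definitions.

```python
import torch
import torch.nn as nn

class GatedFusion(nn.Module):
    """Fuse an original encoding `c` with an intermediate representation `s`
    via a learned sigmoid gate, keeping the output in the same space as `c`.
    (Illustrative sketch; not the paper's exact formulation.)"""
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.candidate = nn.Linear(2 * hidden_dim, hidden_dim)  # proposes fused content
        self.gate = nn.Linear(2 * hidden_dim, hidden_dim)       # decides how much of `c` to keep

    def forward(self, c: torch.Tensor, s: torch.Tensor) -> torch.Tensor:
        # c, s: (batch, seq_len, hidden_dim)
        joint = torch.cat([c, s], dim=-1)
        z = torch.tanh(self.candidate(joint))    # candidate fused representation
        f = torch.sigmoid(self.gate(joint))      # element-wise gate in [0, 1]
        return f * c + (1.0 - f) * z             # interpolate between original and candidate

# Usage: fuse a first-pass context encoding with a query-aware summary
fusion = GatedFusion(hidden_dim=100)
c = torch.randn(2, 50, 100)   # first-pass context encoding
s = torch.randn(2, 50, 100)   # intermediate (query-aware) representation
c_tilde = fusion(c, s)        # refined encoding, which a second attention pass could consume
print(c_tilde.shape)          # torch.Size([2, 50, 100])
```

The gating lets the model preserve parts of the original encoding it is confident about while revising the rest with information gathered in the first attention pass, which is what enables the "second look" behavior described in the abstract.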


Datasets

SQuAD

Results from the Paper


Task                 Dataset        Model                              Metric  Value    Global Rank
Question Answering   SQuAD1.1       Ruminating Reader (single model)   EM      70.639   # 159
Question Answering   SQuAD1.1       Ruminating Reader (single model)   F1      79.456   # 162
Question Answering   SQuAD1.1 dev   Ruminating Reader                  EM      70.6     # 36
Question Answering   SQuAD1.1 dev   Ruminating Reader                  F1      79.5     # 38

Methods


BiDAF, Gated Multi-Hop Attention