Gated-Attention Readers for Text Comprehension

ACL 2017 • Bhuwan Dhingra • Hanxiao Liu • Zhilin Yang • William W. Cohen • Ruslan Salakhutdinov

In this paper we study the problem of answering cloze-style questions over documents. Our model, the Gated-Attention (GA) Reader, integrates a multi-hop architecture with a novel attention mechanism based on multiplicative interactions between the query embedding and the intermediate states of a recurrent neural network document reader. This enables the reader to build query-specific representations of document tokens for accurate answer selection.
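As a rough illustration of the multiplicative gating described above, the sketch below computes, for each document token, an attention distribution over query token representations, summarizes the query into a token-specific vector, and gates the document token by elementwise multiplication. This is a minimal NumPy sketch; the function name, shapes, and toy data are illustrative, not the authors' implementation.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def gated_attention(doc, query):
    """Gated-attention sketch: doc is (n_doc, h), query is (n_query, h).
    Each document token is gated by a token-specific query summary."""
    out = np.empty_like(doc)
    for i, d in enumerate(doc):
        alpha = softmax(query @ d)   # attention of d over query tokens
        q_tilde = alpha @ query      # token-specific query vector
        out[i] = d * q_tilde         # multiplicative gating
    return out

# toy example: 4 document tokens, 3 query tokens, hidden size 5
rng = np.random.default_rng(0)
D = rng.normal(size=(4, 5))
Q = rng.normal(size=(3, 5))
X = gated_attention(D, Q)
```

In the full model this gating is applied between the layers of a multi-hop reader, so each hop refines the query-specific token representations.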


Evaluation


| Task | Dataset | Model | Metric name | Metric value | Global rank |
|---|---|---|---|---|---|
| Question Answering | Children's Book Test | GA reader | Accuracy-CN | 69.4% | # 5 |
| Question Answering | Children's Book Test | GA reader | Accuracy-NE | 71.9% | # 5 |
| Question Answering | Children's Book Test | GA + feature + fix L(w) | Accuracy-CN | 70.7% | # 3 |
| Question Answering | Children's Book Test | GA + feature + fix L(w) | Accuracy-NE | 74.9% | # 3 |
| Question Answering | Children's Book Test | NSE | Accuracy-CN | 71.9% | # 2 |
| Question Answering | Children's Book Test | NSE | Accuracy-NE | 73.2% | # 2 |
| Question Answering | CNN / Daily Mail | GA Reader | Accuracy (CNN) | 77.9 | # 2 |
| Question Answering | CNN / Daily Mail | GA Reader | Accuracy (Daily Mail) | 80.9 | # 2 |
| Open-Domain Question Answering | Quasar | GA | EM (Quasar-T) | 26.4 | # 4 |
| Open-Domain Question Answering | Quasar | GA | F1 (Quasar-T) | 26.4 | # 4 |