Attention-over-Attention Neural Networks for Reading Comprehension

Cloze-style queries are representative problems in reading comprehension. Over the past few months, much progress has been made in applying neural network approaches to Cloze-style questions. In this paper, we present a novel model called the attention-over-attention reader for the Cloze-style reading comprehension task. Our model places another attention mechanism over the document-level attention and induces "attended attention" for final predictions. Unlike previous works, our neural network model requires fewer pre-defined hyper-parameters and uses an elegant architecture for modeling. Experimental results show that the proposed attention-over-attention model significantly outperforms various state-of-the-art systems on public datasets such as the CNN and Children's Book Test datasets.
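The "attention-over-attention" idea from the abstract can be sketched in a few lines: from a pairwise matching matrix between document and query word representations, compute a column-wise softmax (document-level attention per query word) and a row-wise softmax averaged over document words (query-level attention), then combine them into a single "attended attention" over document words. The following is a minimal NumPy sketch under these assumptions; the function name and random inputs are illustrative, not from the paper's code.

```python
import numpy as np

def softmax(x, axis):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_over_attention(M):
    """M: (doc_len, query_len) matching scores, e.g. dot products of
    contextual document and query embeddings (illustrative input)."""
    alpha = softmax(M, axis=0)       # document-level attention, one column per query word
    beta = softmax(M, axis=1)        # query-level attention, one row per document word
    beta_avg = beta.mean(axis=0)     # averaged query-level attention, shape (query_len,)
    s = alpha @ beta_avg             # "attended attention" over document words
    return s

# Toy example with random scores: 5 document words, 3 query words.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 3))
s = attention_over_attention(M)
```

Because each column of `alpha` and the averaged `beta_avg` are both probability distributions, the result `s` is itself a valid distribution over document words, which is what makes the final pointer-style prediction straightforward.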

Published at ACL 2017.
| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|---|---|---|---|---|---|
| Question Answering | Children's Book Test | AoA Reader | Accuracy-CN | 69.4% | # 3 |
| Question Answering | Children's Book Test | AoA Reader | Accuracy-NE | 72% | # 3 |
| Question Answering | CNN / Daily Mail | AoA Reader | CNN | 74.4 | # 8 |
