# Gated-Attention Readers for Text Comprehension

Bhuwan Dhingra • Hanxiao Liu • Zhilin Yang • William W. Cohen • Ruslan Salakhutdinov

In this paper we study the problem of answering cloze-style questions over documents. Our model, the Gated-Attention (GA) Reader, integrates a multi-hop architecture with a novel attention mechanism based on multiplicative interactions between the query embedding and the intermediate states of a recurrent neural network document reader. This enables the reader to build query-specific representations of document tokens for accurate answer selection.
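The core gated-attention operation can be sketched as follows: each document token attends over the query tokens to form a token-specific query summary, which then gates the document token's representation element-wise. This is a minimal NumPy illustration under our own variable names; in the full model the operation is applied between stacked bidirectional GRU layers at every hop.

```python
import numpy as np

def gated_attention(doc, query):
    """Gate document token states with query-specific summaries.

    doc:   (T_d, h) array of document token states
    query: (T_q, h) array of query token states
    Returns a (T_d, h) array of gated document states.
    """
    scores = doc @ query.T  # (T_d, T_q) pairwise similarities
    # Softmax over query tokens, computed stably per document token
    alphas = np.exp(scores - scores.max(axis=1, keepdims=True))
    alphas /= alphas.sum(axis=1, keepdims=True)
    q_tilde = alphas @ query  # (T_d, h) per-token query summaries
    return doc * q_tilde      # multiplicative (element-wise) gating

rng = np.random.default_rng(0)
doc = rng.standard_normal((7, 4))    # 7 document tokens, hidden size 4
query = rng.standard_normal((3, 4))  # 3 query tokens
out = gated_attention(doc, query)
print(out.shape)  # (7, 4)
```

The multiplicative (rather than additive or concatenative) interaction is what lets each hop filter the document representation toward query-relevant features before the next recurrent layer reads it.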
