A Question-Focused Multi-Factor Attention Network for Question Answering

25 Jan 2018  ·  Souvik Kundu, Hwee Tou Ng

Neural network models recently proposed for question answering (QA) primarily focus on capturing the passage-question relation. However, they have minimal capability to link relevant facts distributed across multiple sentences, which is crucial for deeper understanding, such as performing multi-sentence reasoning and co-reference resolution. They also do not explicitly model the question and answer types, which often play a critical role in QA. In this paper, we propose a novel end-to-end question-focused multi-factor attention network for answer extraction. Multi-factor attentive encoding using a tensor-based transformation aggregates meaningful facts even when they are located in multiple sentences. To implicitly infer the answer type, we also propose a max-attentional question aggregation mechanism that encodes a question vector based on the important words in the question. During prediction, we incorporate sequence-level encoding of the first wh-word and its immediately following word as an additional source of question type information. Our proposed model achieves significant improvements over prior state-of-the-art results on three large-scale, challenging QA datasets: NewsQA, TriviaQA, and SearchQA.
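The two question-focused mechanisms named in the abstract lend themselves to a compact sketch. Below is a minimal NumPy illustration under assumed shapes: V is a (T, h) passage encoding, Q a (U, h) question encoding, and W_f an (h, M, h) tensor holding M bilinear factors. All names and shapes are illustrative, and the dot-product passage-question affinity is an assumption; the paper's full model wraps these steps with additional encoding, gating, and answer-pointer layers.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_factor_attentive_encoding(V, W_f):
    """Tensor-based multi-factor self-attention over the passage (sketch).

    V   : (T, h) sequence-level passage encoding
    W_f : (h, M, h) bilinear tensor with M factors
    Returns (T, h) vectors that aggregate facts across sentences.
    """
    # F[m, i, j] = v_i^T W_f[:, m, :] v_j  -- one affinity matrix per factor
    F = np.einsum('ih,hmk,jk->mij', V, W_f, V)   # (M, T, T)
    # element-wise max pooling over the M factors
    F_max = F.max(axis=0)                        # (T, T)
    # row-wise normalization, then attend over the whole passage
    A = softmax(F_max, axis=-1)                  # (T, T)
    return A @ V                                 # (T, h)

def max_attentional_question_aggregation(P, Q):
    """Encode the question by its most salient words (sketch).

    P : (T, h) passage encoding, Q : (U, h) question encoding.
    Returns an (h,) aggregated question vector.
    """
    A = P @ Q.T            # (T, U) passage-question affinity (assumed dot product)
    k = A.max(axis=0)      # (U,) strongest passage match per question word
    w = softmax(k)         # attention distribution over question words
    return w @ Q           # (h,) max-attentional question vector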

Results from the Paper


Task                            Dataset   Model   Metric       Value  Global Rank
Question Answering              NewsQA    AMANDA  F1           63.7   #6
Question Answering              NewsQA    AMANDA  EM           48.4   #4
Open-Domain Question Answering  SearchQA  AMANDA  Unigram Acc  46.8   #3
Open-Domain Question Answering  SearchQA  AMANDA  N-gram F1    56.6   #3
Open-Domain Question Answering  SearchQA  AMANDA  EM           -      #10
Open-Domain Question Answering  SearchQA  AMANDA  F1           -      #5

Methods


No methods listed for this paper.