Question Directed Graph Attention Network for Numerical Reasoning over Text

Numerical reasoning over text, such as addition, subtraction, sorting, and counting, is a challenging machine reading comprehension task, since it requires both natural language understanding and arithmetic computation. To address this challenge, we propose a heterogeneous graph representation for the context of the passage and question needed for such reasoning, and design a question directed graph attention network to drive multi-step numerical reasoning over this context graph. The code is available at: https://github.com/emnlp2020qdgat/QDGAT
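The sketch below illustrates the general idea of one question-directed attention step over a heterogeneous context graph: attention scores on typed edges (e.g., between number and entity nodes) are conditioned on a pooled question representation, so the question steers which neighbors each node aggregates from. This is a minimal illustration, not the authors' implementation; the class name, tensor shapes, and edge-type scheme are assumptions made for the example.

# Minimal sketch (assumed shapes and names, not the QDGAT reference code):
# one question-directed graph attention step over a typed context graph.
import torch
import torch.nn as nn
import torch.nn.functional as F


class QuestionDirectedGATLayer(nn.Module):
    """One message-passing step whose attention scores are conditioned on
    the question representation and the edge (relation) type."""

    def __init__(self, hidden_dim: int, num_edge_types: int):
        super().__init__()
        self.node_proj = nn.Linear(hidden_dim, hidden_dim)
        self.edge_embed = nn.Embedding(num_edge_types, hidden_dim)
        # Score each edge from [source node; target node; edge type; question].
        self.attn = nn.Linear(4 * hidden_dim, 1)

    def forward(self, node_feats, edge_index, edge_types, question_vec):
        # node_feats:   (num_nodes, hidden_dim)  number/entity node features
        # edge_index:   (2, num_edges)           source and target node ids
        # edge_types:   (num_edges,)             relation id per edge
        # question_vec: (hidden_dim,)            pooled question representation
        src, dst = edge_index
        h = self.node_proj(node_feats)
        q = question_vec.unsqueeze(0).expand(src.size(0), -1)
        e = self.edge_embed(edge_types)
        scores = self.attn(torch.cat([h[src], h[dst], e, q], dim=-1)).squeeze(-1)

        # Softmax over the incoming edges of each target node.
        scores = scores - scores.max()
        alpha = torch.exp(scores)
        denom = torch.zeros(node_feats.size(0)).index_add_(0, dst, alpha)
        alpha = alpha / (denom[dst] + 1e-9)

        # Aggregate question-weighted messages into the target nodes.
        messages = alpha.unsqueeze(-1) * h[src]
        out = torch.zeros_like(h).index_add_(0, dst, messages)
        return F.relu(out + h)  # residual connection


if __name__ == "__main__":
    layer = QuestionDirectedGATLayer(hidden_dim=8, num_edge_types=3)
    nodes = torch.randn(5, 8)                                # toy node features
    edges = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 4]])       # 4 directed edges
    etypes = torch.tensor([0, 1, 2, 0])                      # relation ids
    question = torch.randn(8)
    print(layer(nodes, edges, etypes, question).shape)       # torch.Size([5, 8])

Stacking several such layers would correspond to the multi-step reasoning described in the abstract, with the same question vector directing attention at every step.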


Results from the Paper


Task                 Dataset     Model              Metric   Value   Global Rank
Question Answering   DROP Test   QDGAT (ensemble)   F1       88.38   #1
