FlowQA: Grasping Flow in History for Conversational Machine Comprehension

ICLR 2019 • Hsin-Yuan Huang • Eunsol Choi • Wen-tau Yih

To enable traditional, single-turn models to encode the history comprehensively, we introduce Flow, a mechanism that can incorporate intermediate representations generated during the process of answering previous questions, through an alternating parallel processing structure. Compared to shallow approaches that concatenate previous questions/answers as input, Flow integrates the latent semantics of the conversation history more deeply. Our model, FlowQA, shows superior performance on two recently proposed conversational challenges (+7.2% F1 on CoQA and +4.0% on QuAC). The effectiveness of Flow also shows in other tasks: by reducing sequential instruction understanding to conversational machine comprehension, FlowQA outperforms the best models on all three domains in SCONE, with +1.8% to +4.4% improvement in accuracy.
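The sketch below illustrates the alternating parallel processing idea in a minimal PyTorch layer; it is not the authors' implementation, and the layer name, tensor shapes, and hyperparameters are illustrative assumptions. Within each question turn, a BiLSTM reasons over the context words; then a unidirectional GRU runs over the dialog turns at every context position, so intermediate representations from earlier turns "flow" into later ones.

```python
# Minimal sketch of an alternating context/flow layer (assumption: one passage,
# all turns stacked as a batch; not the authors' FlowQA code).
import torch
import torch.nn as nn


class FlowLayer(nn.Module):
    def __init__(self, hidden_size: int):
        super().__init__()
        # Reasoning over the context within a single turn (bidirectional).
        self.context_rnn = nn.LSTM(hidden_size, hidden_size // 2,
                                   bidirectional=True, batch_first=True)
        # "Flow": reasoning over the dialog turns at each context position
        # (unidirectional, so information only moves forward in the dialog).
        self.flow_rnn = nn.GRU(hidden_size, hidden_size, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: [num_turns, context_len, hidden_size] for one passage.
        # 1) Integrate along the context dimension, all turns in parallel.
        ctx_out, _ = self.context_rnn(x)                    # [T, L, H]

        # 2) Make the turn dimension the sequence dimension and let hidden
        #    states flow across turns at each context position.
        flow_in = ctx_out.transpose(0, 1)                   # [L, T, H]
        flow_out, _ = self.flow_rnn(flow_in)                # [L, T, H]

        # Back to [num_turns, context_len, hidden_size], with a residual.
        return ctx_out + flow_out.transpose(0, 1)


if __name__ == "__main__":
    layer = FlowLayer(hidden_size=128)
    # 4 question turns over a 50-token context, 128-dim token representations.
    turns = torch.randn(4, 50, 128)
    print(layer(turns).shape)  # torch.Size([4, 50, 128])
```

In the full model, several such layers alternate with attention over the question, but the key design choice shown here is that the per-turn reasoning and the cross-turn flow operate over orthogonal dimensions of the same tensor, so both can be computed in parallel rather than unrolling the dialog sequentially.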

Full paper

Evaluation


| Task | Dataset | Model | Metric | Value | Global rank |
| --- | --- | --- | --- | --- | --- |
| Question Answering | CoQA | FlowQA (single model) | F1 (in-domain) | 76.3 | #5 |
| Question Answering | CoQA | FlowQA (single model) | F1 (out-of-domain) | 71.8 | #5 |
| Question Answering | CoQA | FlowQA (single model) | F1 (overall) | 75.0 | #5 |
| Question Answering | QuAC | FlowQA (single model) | F1 | 64.1 | #1 |
| Question Answering | QuAC | FlowQA (single model) | HEQ-Q | 59.6 | #1 |
| Question Answering | QuAC | FlowQA (single model) | HEQ-D | 5.8 | #1 |