ReasonBERT: Pre-trained to Reason with Distant Supervision

EMNLP 2021  ·  Xiang Deng, Yu Su, Alyssa Lees, You Wu, Cong Yu, Huan Sun

We present ReasonBERT, a pre-training method that augments language models with the ability to reason over long-range relations and multiple, possibly hybrid contexts. Unlike existing pre-training methods that only harvest learning signals from the local contexts of naturally occurring texts, we propose a generalized notion of distant supervision to automatically connect multiple pieces of text and tables, creating pre-training examples that require long-range reasoning. Different types of reasoning are simulated, including intersecting multiple pieces of evidence, bridging from one piece of evidence to another, and detecting unanswerable cases. We conduct a comprehensive evaluation on a variety of extractive question answering datasets, ranging from single-hop to multi-hop and from text-only to table-only to hybrid, that require various reasoning capabilities, and show that ReasonBERT achieves remarkable improvement over an array of strong baselines. Few-shot experiments further demonstrate that our pre-training method substantially improves sample efficiency.

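To make the distant-supervision idea in the abstract more concrete, the sketch below pairs a query sentence with other pieces of evidence that mention the same entity, then masks that entity in the query so a model would have to recover it from the linked evidence. This is an illustrative reconstruction, not the authors' released code: the `Evidence` structure, the `[QUESTION]` placeholder token, and the entity annotations are assumptions made for the example.

```python
from collections import defaultdict
from dataclasses import dataclass
import random


@dataclass
class Evidence:
    """A piece of text (sentence or linearized table row) with its entity mentions.
    The entity annotations are assumed to be available, e.g. from hyperlinks."""
    text: str
    entities: set


def build_pretraining_examples(query_sentences, evidence_pool, mask_token="[QUESTION]"):
    """Create distantly supervised examples: for each entity in a query sentence,
    find other evidence pieces that mention the same entity, mask the entity in
    the query, and keep the masked entity as the answer to be recovered."""
    # Index evidence by entity for fast lookup.
    by_entity = defaultdict(list)
    for ev in evidence_pool:
        for ent in ev.entities:
            by_entity[ent].append(ev)

    examples = []
    for query in query_sentences:
        for answer_entity in query.entities:
            linked = [ev for ev in by_entity.get(answer_entity, [])
                      if ev.text != query.text]
            if not linked:
                continue  # no distant supervision available for this entity
            # Sampling more than one evidence piece lets an example require
            # combining or bridging across contexts rather than simple lookup.
            evidence = random.sample(linked, k=min(2, len(linked)))
            examples.append({
                "query": query.text.replace(answer_entity, mask_token),
                "evidence": [ev.text for ev in evidence],
                "answer": answer_entity,
            })
    return examples


if __name__ == "__main__":
    pool = [
        Evidence("Barack Obama was born in Honolulu.", {"Barack Obama", "Honolulu"}),
        Evidence("Honolulu is the capital of Hawaii.", {"Honolulu", "Hawaii"}),
    ]
    queries = [Evidence("Barack Obama was born in Honolulu.", {"Honolulu"})]
    print(build_pretraining_examples(queries, pool))
```

Linking evidence through shared entities is what lets this kind of pre-training data be generated automatically, without any manually labeled question-answer pairs.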

Results from the Paper


Task                 Dataset         Model         Metric Name   Metric Value   Global Rank
Semantic Parsing     GraphQuestions  ReasonBERT-R  F1 Score      41.3           #1
Question Answering   TriviaQA        ReasonBERT-R  F1            45.5           #11
Question Answering   TriviaQA        ReasonBERT-B  F1            37.2           #12

Methods