Learning Dense Representations of Phrases at Scale

ACL 2021  ·  Jinhyuk Lee, Mujeen Sung, Jaewoo Kang, Danqi Chen

Open-domain question answering can be reformulated as a phrase retrieval problem, without the need for processing documents on-demand during inference (Seo et al., 2019). However, current phrase retrieval models heavily depend on sparse representations and still underperform retriever-reader approaches. In this work, we show for the first time that we can learn dense representations of phrases alone that achieve much stronger performance in open-domain QA. We present an effective method to learn phrase representations from the supervision of reading comprehension tasks, coupled with novel negative sampling methods. We also propose a query-side fine-tuning strategy, which can support transfer learning and reduce the discrepancy between training and inference. On five popular open-domain QA datasets, our model DensePhrases improves over previous phrase retrieval models by 15%-25% absolute accuracy and matches the performance of state-of-the-art retriever-reader models. Our model is easy to parallelize due to pure dense representations and processes more than 10 questions per second on CPUs. Finally, we directly use our pre-indexed dense phrase representations for two slot filling tasks, showing the promise of utilizing DensePhrases as a dense knowledge base for downstream tasks.
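The core idea of the abstract is that answering reduces to nearest-neighbor search over pre-indexed dense phrase vectors, so inference is a single maximum inner-product search rather than on-demand document reading. A minimal, illustrative sketch of that retrieval step is below, using NumPy and toy random vectors; the names (`encode_query`, `retrieve`) and the encoder are placeholders, not the actual DensePhrases API, which uses learned BERT-based encoders and a compressed phrase index.

```python
# Toy sketch of phrase retrieval as maximum inner-product search (MIPS).
# All encoders and data here are illustrative stand-ins, not DensePhrases itself.
import numpy as np

rng = np.random.default_rng(0)
dim = 8

# Pretend these are pre-computed dense phrase vectors for an indexed corpus.
phrases = ["Barack Obama", "Honolulu", "August 4, 1961"]
phrase_vecs = rng.normal(size=(len(phrases), dim))

def encode_query(question: str) -> np.ndarray:
    # Deterministic stand-in for a learned query encoder (hypothetical).
    seed = sum(ord(c) for c in question)
    return np.random.default_rng(seed).normal(size=dim)

def retrieve(question: str, k: int = 1) -> list[str]:
    q = encode_query(question)
    scores = phrase_vecs @ q           # inner-product similarity to every phrase
    top = np.argsort(-scores)[:k]      # indices of the k highest-scoring phrases
    return [phrases[i] for i in top]

print(retrieve("Where was Obama born?", k=2))
```

Because the index side is pure dense vectors, this search parallelizes trivially and can be handed to an approximate-MIPS library at scale, which is what makes the reported >10 questions/second on CPUs plausible.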

Task                Dataset                   Model         Metric    Value   Global Rank
Slot Filling        KILT: T-REx               DensePhrases  KILT-AC   27.84   #8
Slot Filling        KILT: T-REx               DensePhrases  R-Prec    37.62   #12
Slot Filling        KILT: T-REx               DensePhrases  Recall@5  40.07   #12
Slot Filling        KILT: T-REx               DensePhrases  Accuracy  53.9    #10
Slot Filling        KILT: T-REx               DensePhrases  F1        61.74   #9
Slot Filling        KILT: T-REx               DensePhrases  KILT-F1   32.34   #8
Slot Filling        KILT: Zero Shot RE        DensePhrases  KILT-AC   41.34   #7
Slot Filling        KILT: Zero Shot RE        DensePhrases  R-Prec    57.43   #12
Slot Filling        KILT: Zero Shot RE        DensePhrases  Recall@5  60.47   #12
Slot Filling        KILT: Zero Shot RE        DensePhrases  Accuracy  47.42   #7
Slot Filling        KILT: Zero Shot RE        DensePhrases  F1        54.75   #7
Slot Filling        KILT: Zero Shot RE        DensePhrases  KILT-F1   46.79   #7
Question Answering  Natural Questions (long)  DensePhrases  F1        79.6    #1
Question Answering  Natural Questions (long)  DensePhrases  EM        71.9    #1
Question Answering  SQuAD1.1 dev              DensePhrases  EM        78.3    #19
Question Answering  SQuAD1.1 dev              DensePhrases  F1        86.3    #20

Methods


No methods listed for this paper.