About

Open-domain question answering is the task of answering questions using a large open knowledge source such as Wikipedia, rather than a single provided context passage.

Greatest papers with code

Dense Passage Retrieval for Open-Domain Question Answering

EMNLP 2020 huggingface/transformers

Open-domain question answering relies on efficient passage retrieval to select candidate contexts, where traditional sparse vector space models, such as TF-IDF or BM25, are the de facto method.

Ranked #6 on Question Answering on TriviaQA (F1 metric)

OPEN-DOMAIN QUESTION ANSWERING
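The sparse retrieval baselines mentioned in the DPR abstract (TF-IDF, BM25) score passages by weighted term overlap with the question. A minimal pure-Python BM25 scorer, as a sketch of that idea (the toy documents and parameter defaults are illustrative, not from the paper):

```python
import math
from collections import Counter

def bm25_scores(query, docs, k1=1.5, b=0.75):
    """Score each tokenized doc against the tokenized query with BM25."""
    N = len(docs)
    avgdl = sum(len(d) for d in docs) / N
    # document frequency of each term
    df = Counter()
    for d in docs:
        for t in set(d):
            df[t] += 1
    scores = []
    for d in docs:
        tf = Counter(d)
        s = 0.0
        for t in query:
            if t not in tf:
                continue
            idf = math.log(1 + (N - df[t] + 0.5) / (df[t] + 0.5))
            # term frequency saturation (k1) and length normalization (b)
            s += idf * tf[t] * (k1 + 1) / (
                tf[t] + k1 * (1 - b + b * len(d) / avgdl))
        scores.append(s)
    return scores

docs = [
    "the cat sat on the mat".split(),
    "passage retrieval with sparse vectors".split(),
    "dense passage retrieval for question answering".split(),
]
query = "passage retrieval".split()
scores = bm25_scores(query, docs)
```

Dense retrievers like DPR replace this term-matching score with an inner product between learned question and passage embeddings, which lets them match passages that share no exact terms with the question.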

Reformer: The Efficient Transformer

ICLR 2020 huggingface/transformers

Large Transformer models routinely achieve state-of-the-art results on a number of tasks, but training these models can be prohibitively costly, especially on long sequences.

LANGUAGE MODELLING OPEN-DOMAIN QUESTION ANSWERING

Knowledge Guided Text Retrieval and Reading for Open Domain Question Answering

10 Nov 2019 huggingface/transformers

We introduce an approach for open-domain question answering (QA) that retrieves and reads a passage graph, where vertices are passages of text and edges represent relationships that are derived from an external knowledge base or co-occurrence in the same article.

OPEN-DOMAIN QUESTION ANSWERING READING COMPREHENSION TEXT MATCHING
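The passage graph described in the abstract above can be illustrated with a toy adjacency structure, where passages drawn from the same article get an edge (one of the edge types the paper mentions; the helper and example IDs here are hypothetical):

```python
from collections import defaultdict

def build_passage_graph(passages):
    """passages: list of (passage_id, article_id) pairs.
    Returns an adjacency map linking passages from the same article."""
    by_article = defaultdict(list)
    for pid, article in passages:
        by_article[article].append(pid)
    graph = defaultdict(set)
    for pids in by_article.values():
        for a in pids:
            for b in pids:
                if a != b:
                    graph[a].add(b)  # co-occurrence edge within one article
    return graph

passages = [("p1", "Paris"), ("p2", "Paris"), ("p3", "Louvre")]
g = build_passage_graph(passages)
```

In the paper, edges also come from an external knowledge base, and retrieval and reading then operate over this graph rather than over isolated passages.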

Reading Wikipedia to Answer Open-Domain Questions

ACL 2017 facebookresearch/ParlAI

This paper proposes to tackle open-domain question answering using Wikipedia as the unique knowledge source: the answer to any factoid question is a text span in a Wikipedia article.

OPEN-DOMAIN QUESTION ANSWERING READING COMPREHENSION

End-to-End Training of Neural Retrievers for Open-Domain Question Answering

2 Jan 2021 NVIDIA/Megatron-LM

We also explore two approaches for end-to-end supervised training of the reader and retriever components in OpenQA models.

OPEN-DOMAIN QUESTION ANSWERING UNSUPERVISED PRE-TRAINING

REALM: Retrieval-Augmented Language Model Pre-Training

10 Feb 2020 deepset-ai/haystack

Language model pre-training has been shown to capture a surprising amount of world knowledge, crucial for NLP tasks such as question answering.

LANGUAGE MODELLING OPEN-DOMAIN QUESTION ANSWERING

Bidirectional Attention Flow for Machine Comprehension

5 Nov 2016 allenai/bi-att-flow

Machine comprehension (MC), answering a query about a given context paragraph, requires modeling complex interactions between the context and the query.

OPEN-DOMAIN QUESTION ANSWERING READING COMPREHENSION

Generating Long Sequences with Sparse Transformers

Preprint 2019 openai/sparse_attention

Transformers are powerful sequence models, but require time and memory that grows quadratically with the sequence length.

IMAGE GENERATION LANGUAGE MODELLING OPEN-DOMAIN QUESTION ANSWERING
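The quadratic cost noted in the abstract above comes from every query position attending to every key position. Sparse Transformers replace the full pattern with factorized ones; a sketch of a strided pattern (local window plus every stride-th earlier position) and its cost relative to dense attention, with illustrative sizes chosen here rather than taken from the paper:

```python
def strided_pattern(n, stride):
    """For each query position i, the set of positions it attends to:
    the previous `stride` local positions plus every stride-th earlier one."""
    attend = []
    for i in range(n):
        local = set(range(max(0, i - stride + 1), i + 1))
        strided = set(range(i % stride, i + 1, stride))
        attend.append(local | strided)
    return attend

n, stride = 64, 8  # stride chosen near sqrt(n)
pattern = strided_pattern(n, stride)
sparse_cost = sum(len(s) for s in pattern)  # roughly O(n * sqrt(n)) entries
dense_cost = n * n                          # O(n^2) entries
```

With stride near the square root of the sequence length, each position attends to about 2*sqrt(n) others, which is what makes much longer sequences tractable.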

Latent Retrieval for Weakly Supervised Open Domain Question Answering

ACL 2019 google-research/language

We show for the first time that it is possible to jointly learn the retriever and reader from question-answer string pairs and without any IR system.

INFORMATION RETRIEVAL OPEN-DOMAIN QUESTION ANSWERING