Reading Comprehension

149 papers with code · Natural Language Processing

Most current question answering datasets frame the task as reading comprehension, where the question is about a paragraph or document and the answer is often a span in the document. The Machine Reading group at UCL also provides an overview of reading comprehension tasks.
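
As a rough illustration of this span-extraction framing (not tied to any particular paper listed below), here is a minimal sketch assuming the Hugging Face transformers question-answering pipeline and one commonly used SQuAD-finetuned checkpoint:

    from transformers import pipeline

    # Extractive QA: the model returns a span of the given paragraph.
    # The checkpoint name is just one commonly used example.
    qa = pipeline("question-answering",
                  model="distilbert-base-cased-distilled-squad")

    context = ("Reading comprehension datasets pair a question with a paragraph "
               "or document, and the answer is usually a span of that text.")
    result = qa(question="What is the answer usually?", context=context)

    # 'answer' is the extracted span; 'start'/'end' are character offsets.
    print(result["answer"], result["start"], result["end"], result["score"])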

Latest papers without code

Undersensitivity in Neural Reading Comprehension

ICLR 2020

Neural reading comprehension models have recently achieved impressive generalisation results, yet still perform poorly when given adversarially selected input.

ADVERSARIAL ATTACK DATA AUGMENTATION READING COMPREHENSION

Incorporating BERT into Neural Machine Translation

ICLR 2020

The recently proposed BERT (Devlin et al., 2018) has shown great power on a variety of natural language understanding tasks, such as text classification and reading comprehension.

READING COMPREHENSION TEXT CLASSIFICATION UNSUPERVISED MACHINE TRANSLATION

GraphFlow: Exploiting Conversation Flow with Graph Neural Networks for Conversational Machine Comprehension

ICLR 2020

Visualization experiments show that our proposed model better mimics the human reasoning process for conversational MC than existing models.

READING COMPREHENSION

ReClor: A Reading Comprehension Dataset Requiring Logical Reasoning

ICLR 2020

Empirical results show that state-of-the-art models have an outstanding ability to capture biases contained in the dataset, achieving high accuracy on the EASY set.

READING COMPREHENSION

ASGen: Answer-containing Sentence Generation to Pre-Train Question Generator for Scale-up Data in Question Answering

ICLR 2020

We evaluate the question generation capability of our method by comparing its BLEU score with that of existing methods, and test our method by fine-tuning the MRC model on the downstream MRC data after training on synthetic data. A rough sketch of such a BLEU comparison follows this entry.

LANGUAGE MODELLING MACHINE READING COMPREHENSION QUESTION ANSWERING QUESTION GENERATION
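
As a hedged illustration of the BLEU comparison mentioned in the ASGen snippet above (the strings are placeholders, not data from the paper), a corpus-level BLEU score over generated questions can be computed with sacrebleu:

    import sacrebleu

    # Hypothetical generated questions and their reference questions.
    generated = ["what is the capital of france ?", "who wrote hamlet ?"]
    references = [["What is the capital of France?", "Who wrote Hamlet?"]]

    # corpus_bleu expects a list of hypotheses and a list of reference streams.
    bleu = sacrebleu.corpus_bleu(generated, references)
    print(f"BLEU = {bleu.score:.2f}")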

NeurQuRI: Neural Question Requirement Inspector for Answerability Prediction in Machine Reading Comprehension

ICLR 2020

Real-world question answering systems often retrieve documents potentially relevant to a given question through a keyword search, followed by a machine reading comprehension (MRC) step to find the exact answer within them; a minimal retrieve-then-read sketch follows this entry.

MACHINE READING COMPREHENSION QUESTION ANSWERING
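
The retrieve-then-read setup described in the NeurQuRI snippet above can be sketched as follows. This is a generic illustration, not the paper's system, and it assumes BM25 keyword retrieval (rank_bm25) plus an off-the-shelf extractive reader:

    from rank_bm25 import BM25Okapi
    from transformers import pipeline

    # Toy document collection and question (placeholders, not from the paper).
    documents = [
        "The Amazon is the largest rainforest in the world.",
        "Mount Everest is the highest mountain above sea level.",
    ]
    question = "Which mountain is the highest above sea level?"

    # Step 1: keyword-style retrieval with BM25.
    bm25 = BM25Okapi([doc.lower().split() for doc in documents])
    scores = bm25.get_scores(question.lower().split())
    best_doc = documents[int(scores.argmax())]

    # Step 2: an MRC reader extracts the exact answer span from the top document.
    reader = pipeline("question-answering",
                      model="distilbert-base-cased-distilled-squad")
    print(reader(question=question, context=best_doc)["answer"])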

Neural Symbolic Reader: Scalable Integration of Distributed and Symbolic Representations for Reading Comprehension

ICLR 2020

Integrating distributed representations with symbolic operations is essential for reading comprehension that requires complex reasoning, such as counting, sorting and arithmetic, but most existing approaches are hard to scale to more domains or more complex reasoning.

DATA AUGMENTATION READING COMPREHENSION

An Exploration of Data Augmentation and Sampling Techniques for Domain-Agnostic Question Answering

WS 2019

To produce a domain-agnostic question answering model for the Machine Reading Question Answering (MRQA) 2019 Shared Task, we investigate the relative benefits of large pre-trained language models, various data sampling strategies, and query and context paraphrases generated by back-translation; a minimal back-translation sketch follows this entry.

DATA AUGMENTATION QUESTION ANSWERING READING COMPREHENSION
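
As a minimal sketch of generating query and context paraphrases by back-translation (a generic recipe, not necessarily what this paper uses), one can round-trip through a pivot language with MarianMT models from transformers; the English-German checkpoints below are commonly used examples:

    from transformers import MarianMTModel, MarianTokenizer

    def _translate(texts, model_name):
        tok = MarianTokenizer.from_pretrained(model_name)
        model = MarianMTModel.from_pretrained(model_name)
        batch = tok(texts, return_tensors="pt", padding=True, truncation=True)
        return [tok.decode(t, skip_special_tokens=True)
                for t in model.generate(**batch)]

    def back_translate(sentences):
        # en -> de -> en round trip yields paraphrases of the input sentences.
        german = _translate(sentences, "Helsinki-NLP/opus-mt-en-de")
        return _translate(german, "Helsinki-NLP/opus-mt-de-en")

    print(back_translate(["Where was the treaty signed?"]))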

Integrate Image Representation to Text Model on Sentence Level: a Semi-supervised Framework

1 Dec 2019

Integrating visual features has proven useful in language representation learning.

READING COMPREHENSION REPRESENTATION LEARNING