Question-Generation
197 papers with code • 1 benchmark • 1 dataset
Benchmarks
These leaderboards are used to track progress in Question-Generation.
Libraries
Use these libraries to find Question-Generation models and implementations.
Most implemented papers
Diverse Beam Search: Decoding Diverse Solutions from Neural Sequence Models
We observe that our method consistently outperforms standard beam search (BS) and previously proposed techniques for diverse decoding from neural sequence models.
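As a concrete illustration, diverse beam search is available in the Hugging Face transformers generate API via num_beam_groups and diversity_penalty. The sketch below is illustrative, not a reproduction of the paper's setup; the t5-small checkpoint and prompt are arbitrary choices.

```python
# Minimal sketch: diverse beam search with Hugging Face transformers.
# num_beam_groups / diversity_penalty implement group beam search; the
# checkpoint and input text are illustrative assumptions, not from the paper.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

inputs = tokenizer("summarize: The quick brown fox jumps over the lazy dog.",
                   return_tensors="pt")
outputs = model.generate(
    **inputs,
    num_beams=6,
    num_beam_groups=3,       # split the 6 beams into 3 groups decoded in turn
    diversity_penalty=1.0,   # penalize tokens already chosen by earlier groups
    num_return_sequences=3,  # return one hypothesis per group
)
for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))
```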
Learning to Ask: Neural Question Generation for Reading Comprehension
We study automatic question generation for sentences from text passages in reading comprehension.
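A minimal sketch of answer-aware question generation with a seq2seq model. The checkpoint name is hypothetical, and the <hl> highlight convention for marking the answer span is an assumption borrowed from common community QG models, not this paper's exact input format.

```python
# Sketch: generate a question about a highlighted answer span.
# "your-org/t5-question-generation" is a hypothetical fine-tuned checkpoint.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("your-org/t5-question-generation")
model = AutoModelForSeq2SeqLM.from_pretrained("your-org/t5-question-generation")

passage = ("generate question: Oxygen is a chemical element with symbol "
           "<hl> O <hl> and atomic number 8.")
inputs = tokenizer(passage, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```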
Unified Language Model Pre-training for Natural Language Understanding and Generation
This paper presents a new Unified pre-trained Language Model (UniLM) that can be fine-tuned for both natural language understanding and generation tasks.
Neural Question Generation from Text: A Preliminary Study
Automatic question generation aims to generate questions from a text passage where the generated questions can be answered by certain sub-spans of the given passage.
Machine Comprehension by Text-to-Text Neural Question Generation
We propose a recurrent neural model that generates natural-language questions from documents, conditioned on answers.
Synthetic QA Corpora Generation with Roundtrip Consistency
We introduce a novel method of generating synthetic question answering corpora by combining models of question generation and answer extraction, and by filtering the results to ensure roundtrip consistency.
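The filtering step can be sketched with an off-the-shelf extractive QA pipeline: keep a synthetic (question, answer) pair only if a QA model, given the question and passage, recovers the original answer. The exact-match criterion and the model choice below are simplifying assumptions, not the paper's exact filter.

```python
# Sketch of a roundtrip-consistency filter for synthetic QA pairs.
from transformers import pipeline

qa = pipeline("question-answering",
              model="distilbert-base-cased-distilled-squad")

def roundtrip_consistent(passage: str, question: str, answer: str) -> bool:
    # Re-answer the generated question and check it matches the source answer.
    predicted = qa(question=question, context=passage)["answer"]
    return predicted.strip().lower() == answer.strip().lower()

passage = "Marie Curie won the Nobel Prize in Physics in 1903."
print(roundtrip_consistent(
    passage, "When did Marie Curie win the Nobel Prize in Physics?", "1903"))
```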
ProphetNet: Predicting Future N-gram for Sequence-to-Sequence Pre-training
This paper presents a new sequence-to-sequence pre-training model called ProphetNet, which introduces a novel self-supervised objective named future n-gram prediction and the proposed n-stream self-attention mechanism.
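A toy sketch of the future n-gram objective: alongside ordinary next-token prediction, extra predicting streams are trained to predict tokens further ahead. The alignment and equal-weight averaging below are assumptions for illustration; ProphetNet's actual n-stream self-attention is more involved.

```python
import torch
import torch.nn.functional as F

def future_ngram_loss(stream_logits, targets):
    # stream_logits: list of n tensors, each (batch, seq_len, vocab), where
    # stream i at position t predicts the token at position t + i + 1.
    # targets: (batch, seq_len) token ids. Equal weighting over streams is
    # an assumption; the paper weights and masks streams more carefully.
    losses = []
    for i, logits in enumerate(stream_logits):
        future = targets[:, i + 1:]              # tokens i+1 steps ahead
        preds = logits[:, : future.size(1)]      # drop positions with no target
        losses.append(F.cross_entropy(
            preds.reshape(-1, preds.size(-1)), future.reshape(-1)))
    return torch.stack(losses).mean()

# Toy usage: vocab 10, batch 2, length 5, bigram prediction (n = 2).
logits = [torch.randn(2, 5, 10), torch.randn(2, 5, 10)]
targets = torch.randint(0, 10, (2, 5))
print(future_ngram_loss(logits, targets))
```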
ERNIE-GEN: An Enhanced Multi-Flow Pre-training and Fine-tuning Framework for Natural Language Generation
Current pre-training work in natural language generation pays little attention to the problem of exposure bias on downstream tasks.
Simplifying Paragraph-level Question Generation via Transformer Language Models
Question generation (QG) is a natural language generation task where a model is trained to ask questions corresponding to some input text.
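The general recipe of casting QG as left-to-right language modeling can be sketched as follows: serialize each example as a context/answer/question string and let a causal LM continue the prefix. The delimiter format is hypothetical, not the paper's exact scheme, and an off-the-shelf gpt2 checkpoint would need fine-tuning on such sequences before producing sensible questions.

```python
# Sketch: QG as plain causal language modeling with a prompt format.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Hypothetical serialization; at inference the string stops before the question.
prompt = "context: Oxygen has atomic number 8. answer: 8 question:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20, do_sample=False,
                         pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```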
Exploring Models and Data for Image Question Answering
A suite of baseline results on this new dataset is also presented.