Conditional Text Generation
24 papers with code • 1 benchmark • 4 datasets
The task of generating text according to some pre-specified conditioning (e.g., a topic, a sentiment, or a lexical constraint).
These leaderboards are used to track progress in Conditional Text Generation.
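As a minimal sketch of what "conditioning" means here, the toy sampler below re-weights a token distribution by a condition label (a stand-in for how, e.g., control codes or class-conditional decoding steer a real model). The vocabulary and weights are invented for illustration only:

```python
import random

# Toy per-condition token weights: conditioning simply changes
# the sampling distribution used at generation time.
VOCAB = {
    "positive": {"great": 5, "wonderful": 4, "film": 1, "plot": 1},
    "negative": {"dull": 5, "tedious": 4, "film": 1, "plot": 1},
}

def generate(condition: str, length: int = 5, seed: int = 0) -> str:
    """Sample `length` tokens under the given condition's distribution."""
    rng = random.Random(seed)
    weights = VOCAB[condition]
    tokens = rng.choices(list(weights), weights=list(weights.values()), k=length)
    return " ".join(tokens)
```

A real conditional generator replaces this lookup table with a learned model, but the interface is the same: condition in, text out.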
Most implemented papers
Pragmatically Informative Text Generation
We improve the informativeness of models for conditional text generation using techniques from computational pragmatics.
Extract, Denoise and Enforce: Evaluating and Improving Concept Preservation for Text-to-Text Generation
In this paper, we present a systematic analysis of whether current seq2seq models, especially pre-trained language models, are good at preserving important input concepts, and to what extent explicitly guiding generation with those concepts as lexical constraints is beneficial.
GENIUS: Sketch-based Language Model Pre-training via Extreme and Selective Masking for Text Generation and Augmentation
We introduce GENIUS: a conditional text generation model using sketches as input, which can fill in the missing contexts for a given sketch (key information consisting of textual spans, phrases, or words, concatenated by mask tokens).
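To illustrate the sketch format described above (key spans concatenated by mask tokens), the helper below builds such an input string; the function name and mask token are hypothetical — GENIUS itself is a pretrained model that fills in the masked contexts:

```python
def make_sketch(key_spans: list[str], mask_token: str = "<mask>") -> str:
    """Join key information spans with mask tokens; a sketch-based
    model is then asked to fill in the missing contexts."""
    return f" {mask_token} ".join(key_spans)

sketch = make_sketch(["conditional text generation", "control codes", "fluent output"])
# "conditional text generation <mask> control codes <mask> fluent output"
```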
Generating Text through Adversarial Training using Skip-Thought Vectors
Attempts have been made to utilize GANs with word embeddings for text generation.
Encoder-Agnostic Adaptation for Conditional Language Generation
Large pretrained language models have changed the way researchers approach discriminative natural language understanding tasks, leading to the dominance of approaches that adapt a pretrained model for arbitrary downstream tasks.
Pre-train and Plug-in: Flexible Conditional Text Generation with Variational Auto-Encoders
Conditional text generation has drawn much attention as a topic in Natural Language Generation (NLG), since it allows humans to control the properties of generated content.
ToTTo: A Controlled Table-To-Text Generation Dataset
We present ToTTo, an open-domain English table-to-text dataset with over 120,000 training examples that proposes a controlled generation task: given a Wikipedia table and a set of highlighted table cells, produce a one-sentence description.
Token Manipulation Generative Adversarial Network for Text Generation
MaskGAN frames conditional language modeling as filling in the blanks between given tokens.
ETC-NLG: End-to-end Topic-Conditioned Natural Language Generation
We first test the effectiveness of our approach in a low-resource setting for Italian, evaluating the conditioning for both topic models and gold annotations.
Plug and Play Autoencoders for Conditional Text Generation
Text autoencoders are commonly used for conditional generation tasks such as style transfer.