650 papers with code • 13 benchmarks • 67 datasets
Text generation is the task of producing text that is indistinguishable from human-written text. This task is more formally known as "natural language generation" (NLG) in the literature.
Text generation can be addressed with Markov processes or deep generative models such as LSTMs. More recently, the most advanced methods include Transformer-based models such as BART and GPT, as well as GAN-based approaches. Text generation systems are evaluated either through human ratings or automatic evaluation metrics like METEOR, ROUGE, and BLEU.
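As a minimal illustration of the Markov-process approach mentioned above, the sketch below trains a word-level Markov chain on a corpus and samples new text from it. All function names (`build_chain`, `generate`) are hypothetical, chosen for this example; real systems use far larger corpora and higher-order models.

```python
import random
from collections import defaultdict

def build_chain(text, order=1):
    """Map each tuple of `order` consecutive words to the words observed after it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def generate(chain, length=10, seed=0):
    """Sample a word sequence by repeatedly drawing a successor of the current state."""
    rng = random.Random(seed)
    state = rng.choice(list(chain))
    out = list(state)
    for _ in range(length - len(state)):
        successors = chain.get(state)
        if not successors:  # dead end: no observed continuation
            break
        word = rng.choice(successors)
        out.append(word)
        state = state[1:] + (word,)
    return " ".join(out)

corpus = "the cat sat on the mat and the dog sat on the rug"
chain = build_chain(corpus, order=1)
print(generate(chain, length=8, seed=42))
```

Because the model only conditions on the previous `order` words, its output is locally fluent but globally incoherent, which is precisely the limitation that LSTM- and Transformer-based generators address.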
(Image credit: Adversarial Ranking for Language Generation)
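The automatic metrics named in the description can also be made concrete. The sketch below implements a simplified sentence-level BLEU score (modified n-gram precision with a brevity penalty, no smoothing) in plain Python; it is an illustrative assumption of the standard formulation, not a replacement for a maintained implementation such as sacreBLEU or NLTK's `sentence_bleu`.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Count the n-grams of a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, reference, max_n=4):
    """Simplified sentence-level BLEU: geometric mean of modified
    n-gram precisions (n = 1..max_n) times a brevity penalty."""
    cand, ref = candidate.split(), reference.split()
    precisions = []
    for n in range(1, max_n + 1):
        cand_counts = ngrams(cand, n)
        ref_counts = ngrams(ref, n)
        overlap = sum((cand_counts & ref_counts).values())  # clipped matches
        total = sum(cand_counts.values())
        if total == 0 or overlap == 0:
            return 0.0  # unsmoothed: any zero precision zeroes the score
        precisions.append(overlap / total)
    # Brevity penalty discourages candidates shorter than the reference.
    bp = 1.0 if len(cand) > len(ref) else math.exp(1 - len(ref) / max(len(cand), 1))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)

print(bleu("the cat sat on the mat", "the cat sat on the mat"))
```

A perfect match scores 1.0, while a candidate sharing no unigrams with the reference scores 0.0; production metrics add smoothing so that near-misses do not collapse to zero.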
- Dialogue Generation
- Data-to-Text Generation
- Multi-Document Summarization
- Text Style Transfer
- Paraphrase Generation
- Story Generation
- Spelling Correction
- Table-to-Text Generation
- Conditional Text Generation
- Visual Storytelling
- Text Infilling
- Distractor Generation
- Story Completion
- News Generation
- Paper Generation
- Concept-To-Text Generation
- Sonnet Generation
- Fact-based Text Editing
- Natural Language Landmark Navigation Instructions Generation
We present a novel corpus of 445 human- and computer-generated documents, comprising about 27,000 clauses, annotated for semantic clause types and coherence relations that allow for nuanced comparison of artificial and natural discourse modes.
Large, pre-trained transformer-based language models such as BERT have drastically changed the Natural Language Processing (NLP) field.
Extracting relation triplets from raw text is a crucial task in Information Extraction, enabling multiple applications such as populating or validating knowledge bases, fact-checking, and other downstream tasks.
Ranked #1 on Relation Extraction on NYT (using extra training data)
Pretrained bidirectional Transformers, such as BERT, have achieved significant improvements in a wide variety of language understanding tasks, but it is not straightforward to apply them directly to natural language generation.
We adopt Transformer as our unified architecture for its strong performance and task-agnostic design.
Permutation invariant graph-to-sequence model for template-free retrosynthesis and reaction prediction
Synthesis planning and reaction outcome prediction are two fundamental problems in computer-aided organic chemistry for which a variety of data-driven approaches have emerged.