Concept-To-Text Generation

4 papers with code • 1 benchmark • 1 dataset

Generating natural language text from a conceptualized representation, such as an ontology.
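To make the task concrete, here is a toy illustration (not from any paper listed on this page): a minimal template-based generator that verbalizes subject-predicate-object triples drawn from a small, hypothetical ontology. The predicate names and templates are invented for the example.

```python
# Hypothetical verbalization templates, keyed by ontology predicate.
TEMPLATES = {
    "capital_of": "{subj} is the capital of {obj}.",
    "located_in": "{subj} is located in {obj}.",
}

def verbalize(triples):
    """Turn (subject, predicate, object) triples into English sentences."""
    sentences = []
    for subj, pred, obj in triples:
        # Fall back to a generic pattern for unknown predicates.
        template = TEMPLATES.get(pred, "{subj} {pred} {obj}.")
        sentences.append(template.format(subj=subj, pred=pred, obj=obj))
    return " ".join(sentences)

print(verbalize([("Paris", "capital_of", "France"),
                 ("Paris", "located_in", "Europe")]))
# -> Paris is the capital of France. Paris is located in Europe.
```

Real concept-to-text systems replace the hand-written templates with learned or automatically extracted linguistic resources, which is what several of the papers below address.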

Latest papers with no code

Informed Sampling for Diversity in Concept-to-Text NLG

no code yet • Findings (EMNLP) 2021

In this work, we propose to reduce this cost by using an Imitation Learning approach to explore the level of diversity that a language generation model can reliably produce.

Enhancing Topic-to-Essay Generation with External Commonsense Knowledge

no code yet • ACL 2019

Experiments show that with external commonsense knowledge and adversarial training, the generated essays are more novel, diverse, and topic-consistent than existing methods in terms of both automatic and human evaluation.

Extracting Linguistic Resources from the Web for Concept-to-Text Generation

no code yet • 31 Oct 2018

Many concept-to-text generation systems require domain-specific linguistic resources to produce high quality texts, but manually constructing these resources can be tedious and costly.

Generating Texts with Integer Linear Programming

no code yet • 31 Oct 2018

Content selection, for example, may greedily pick the most important facts, but these may require too many words to express, which is undesirable when space is limited or expensive.
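The trade-off described above can be sketched as a tiny budgeted-selection problem (all fact names, importance scores, and word costs below are invented for illustration): a greedy pass over the most important facts can lose to the exhaustive optimum that an ILP solver would find.

```python
from itertools import combinations

# Invented toy data: (name, importance score, word cost).
facts = [
    ("fact_a", 10, 8),
    ("fact_b", 6, 4),
    ("fact_c", 5, 4),
]
BUDGET = 8  # maximum words available

def greedy_select(facts, budget):
    """Take facts in order of importance, skipping any that overflow."""
    chosen, used = [], 0
    for name, importance, words in sorted(facts, key=lambda f: -f[1]):
        if used + words <= budget:
            chosen.append(name)
            used += words
    return chosen

def exact_select(facts, budget):
    """Brute-force the best feasible subset (what an ILP solver returns)."""
    best, best_importance = [], -1
    for r in range(len(facts) + 1):
        for combo in combinations(facts, r):
            if sum(f[2] for f in combo) <= budget:
                importance = sum(f[1] for f in combo)
                if importance > best_importance:
                    best_importance = importance
                    best = [f[0] for f in combo]
    return best

print(greedy_select(facts, BUDGET))  # -> ['fact_a']            (importance 10)
print(exact_select(facts, BUDGET))   # -> ['fact_b', 'fact_c']  (importance 11)
```

Greedy grabs the single most important fact and exhausts the budget; the exact search finds that two cheaper facts carry more total importance within the same word limit.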

Deep Attentive Sentence Ordering Network

no code yet • EMNLP 2018

In this paper, we propose a novel deep attentive sentence ordering network (referred to as ATTOrderNet) which integrates a self-attention mechanism with LSTMs in the encoding of input sentences.
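As a rough sketch of the self-attention component (our illustration, not the paper's code): each sentence vector attends to every vector in the set via scaled dot-product attention, producing a representation of each sentence in the context of all the others. The toy "sentence encodings" below stand in for final LSTM states.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def self_attention(vectors):
    """Scaled dot-product self-attention: one attended vector per input."""
    d = len(vectors[0])
    out = []
    for query in vectors:
        scores = [dot(query, key) / math.sqrt(d) for key in vectors]
        weights = softmax(scores)  # weights sum to 1
        # Attended vector = convex combination of all input vectors.
        out.append([sum(w * v[i] for w, v in zip(weights, vectors))
                    for i in range(d)])
    return out

# Three invented 2-d "sentence encodings".
sents = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
attended = self_attention(sents)
```

Because every sentence attends to every other, the attended set is insensitive to input order, which is what makes this encoder a natural fit for sentence ordering.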