KG-to-Text Generation
16 papers with code • 11 benchmarks • 9 datasets
Knowledge-graph-to-text (KG-to-text) generation aims to generate high-quality text that is consistent with the input graph.
Description from: JointGT: Graph-Text Joint Representation Learning for Text Generation from Knowledge Graphs
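As an illustration of the task input, a KG (or KG subgraph) is typically a set of (subject, relation, object) triples that the model must verbalize. A common preprocessing step in many of the systems listed below is to linearize the graph into a token sequence before feeding it to a sequence model. A minimal sketch, assuming hypothetical function and marker names (the `<H>/<R>/<T>` tagging convention appears in several KG-to-text systems, but details vary per paper):

```python
def linearize_triples(triples):
    """Flatten a KG subgraph into a single input string.

    Each (head, relation, tail) triple is tagged with special
    markers so a sequence model can recover the graph structure.
    """
    parts = []
    for head, relation, tail in triples:
        parts.extend(["<H>", head, "<R>", relation, "<T>", tail])
    return " ".join(parts)

graph = [
    ("Alan Turing", "birthPlace", "London"),
    ("Alan Turing", "field", "computer science"),
]
print(linearize_triples(graph))
# <H> Alan Turing <R> birthPlace <T> London <H> Alan Turing <R> field <T> computer science
```

The resulting string can then serve as input to any encoder-decoder model; graph-aware approaches instead encode the triples with structure-sensitive architectures.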
Most implemented papers
Text Generation from Knowledge Graphs with Graph Transformers
Generating texts that express complex ideas spanning multiple sentences requires a structured representation of their content (a document plan), but such representations are prohibitively expensive to produce manually.
Investigating Pretrained Language Models for Graph-to-Text Generation
We show that the PLMs BART and T5 achieve new state-of-the-art results and that task-adaptive pretraining strategies improve their performance even further.
Deep Graph Convolutional Encoders for Structured Data to Text Generation
Most previous work on neural text generation from graph-structured data relies on standard sequence-to-sequence methods.
Handling Rare Items in Data-to-Text Generation
Neural approaches to data-to-text generation generally handle rare input items using either delexicalisation or a copy mechanism.
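The two strategies mentioned above can be sketched concretely. Delexicalisation replaces rare entity strings with placeholder slots before generation and restores them afterwards, so the model never has to produce the rare tokens itself. A minimal sketch, with hypothetical function and slot names (a copy mechanism, by contrast, lives inside the model and is not shown here):

```python
def delexicalise(triples):
    """Replace concrete entity strings with placeholder slots.

    Returns the delexicalised triples plus a slot->entity mapping
    used to restore ('relexicalise') entities after generation.
    """
    mapping, out = {}, []
    for head, rel, tail in triples:
        for ent in (head, tail):
            if ent not in mapping:
                mapping[ent] = f"ENTITY_{len(mapping)}"
        out.append((mapping[head], rel, mapping[tail]))
    return out, {slot: ent for ent, slot in mapping.items()}

def relexicalise(text, slot_to_entity):
    """Substitute placeholders in generated text with entity names."""
    for slot, ent in slot_to_entity.items():
        text = text.replace(slot, ent)
    return text

triples, slots = delexicalise([("Aarhus Airport", "cityServed", "Aarhus")])
generated = "ENTITY_1 is served by ENTITY_0."  # stand-in for model output
print(relexicalise(generated, slots))
# Aarhus is served by Aarhus Airport.
```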
Modeling Global and Local Node Contexts for Text Generation from Knowledge Graphs
Recent graph-to-text models generate text from graph-based data using either global or local aggregation to learn node representations.
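The global/local distinction can be illustrated with a toy aggregation step. In a local scheme each node pools information only from its direct neighbours; in a global scheme every node sees a summary of the whole graph. A minimal sketch of mean aggregation, with hypothetical function names (real models use learned, attention-based variants of these operations):

```python
import numpy as np

def local_aggregate(features, adjacency):
    """Each node averages the features of its direct neighbours."""
    deg = adjacency.sum(axis=1, keepdims=True)
    return adjacency @ features / np.maximum(deg, 1)

def global_aggregate(features):
    """Each node receives the mean of all node features in the graph."""
    return np.repeat(features.mean(axis=0, keepdims=True),
                     len(features), axis=0)

# Two nodes connected by one edge, with one-hot features.
feats = np.eye(2)
adj = np.array([[0.0, 1.0], [1.0, 0.0]])
print(local_aggregate(feats, adj))   # each node takes its neighbour's feature
print(global_aggregate(feats))       # both nodes see the graph-wide mean
```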
Toward Subgraph-Guided Knowledge Graph Question Generation with Graph Neural Networks
In this work, we focus on a more realistic setting where we aim to generate questions from a KG subgraph and target answers.
ENT-DESC: Entity Description Generation by Exploring Knowledge Graph
Previous works on knowledge-to-text generation take as input a few RDF triples or key-value pairs conveying the knowledge of some entities to generate a natural language description.
KGPT: Knowledge-Grounded Pre-Training for Data-to-Text Generation
We propose knowledge-grounded pre-training (KGPT), which consists of two parts: 1) a general knowledge-grounded generation model to generate knowledge-enriched text, and 2) a pre-training paradigm on a massive knowledge-grounded text corpus.
How to Train Your Agent to Read and Write
Typically, this requires an agent to fully understand the knowledge from the given text materials and generate correct and fluent novel paragraphs, which is very challenging in practice.
Few-shot Knowledge Graph-to-Text Generation with Pretrained Language Models
This paper studies how to automatically generate natural language text that describes the facts in a knowledge graph (KG).