KB-to-Language Generation

3 papers with code • 3 benchmarks • 2 datasets

Given information from a knowledge base, typically a set of subject-predicate-object triples, generate a fluent natural-language description of that information.
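
In practice, the KB input is usually linearized into a plain string before being fed to a sequence-to-sequence model. A minimal sketch of that input/output format, assuming WebNLG-style subject-predicate-object triples; the `<H>/<R>/<T>` marker scheme and the example data are illustrative choices, not a fixed standard:

```python
# Sketch of the KB-to-language task format: triples in, one sentence out.
# The <H>/<R>/<T> markers and the example triples are illustrative.

def linearize(triples):
    """Flatten subject-predicate-object triples into one source string."""
    return " ".join(f"<H> {s} <R> {p} <T> {o}" for s, p, o in triples)

kb_facts = [
    ("Alan_Bean", "occupation", "Test_pilot"),
    ("Alan_Bean", "mission", "Apollo_12"),
]
source = linearize(kb_facts)
# source == "<H> Alan_Bean <R> occupation <T> Test_pilot <H> Alan_Bean <R> mission <T> Apollo_12"
# A reference target: "Alan Bean, a former test pilot, was a crew member of Apollo 12."
```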

Greatest papers with code

Investigating Pretrained Language Models for Graph-to-Text Generation

UKPLab/plms-graph2text • 16 Jul 2020

We show that the PLMs BART and T5 achieve new state-of-the-art results and that task-adaptive pretraining strategies improve their performance even further. (A minimal interface sketch follows this entry.)

Tasks: AMR-to-Text Generation • Data-to-Text Generation • +2
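
The abstract above reports that BART and T5 reach new state of the art once the graph is serialized as text. A minimal sketch of that interface with the Hugging Face transformers library, using the generic t5-base checkpoint rather than the paper's fine-tuned models, so the output only illustrates the API, not the reported results:

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

# Off-the-shelf checkpoint; the paper's numbers come from fine-tuned models.
tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

# Linearized graph input (the marker scheme is illustrative; graph-to-text
# systems typically register such markers as special tokens before training).
source = ("<H> Alan_Bean <R> occupation <T> Test_pilot "
          "<H> Alan_Bean <R> mission <T> Apollo_12")

inputs = tokenizer(source, return_tensors="pt")
output_ids = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```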

Stage-wise Fine-tuning for Graph-to-Text Generation

EagleW/Stage-wise-Fine-tuning • 17 May 2021

Graph-to-text generation has benefited from pre-trained language models (PLMs), which achieve better performance than structured graph encoders. (A two-stage fine-tuning sketch follows this entry.)

Ranked #1 on Data-to-Text Generation on WebNLG (using extra training data)

Tasks: Data-to-Text Generation • KB-to-Language Generation • +1
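
Stage-wise fine-tuning trains the PLM in successive stages, for example first on auxiliary or augmented pairs and then on the clean target training set. A minimal two-stage loop in PyTorch; the stage contents, placeholder data, and hyperparameters here are assumptions for illustration, not the paper's exact recipe:

```python
import torch
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)

def fine_tune(pairs, epochs):
    """One fine-tuning stage over (linearized graph, reference text) pairs."""
    model.train()
    for _ in range(epochs):
        for source, target in pairs:
            batch = tokenizer(source, return_tensors="pt")
            labels = tokenizer(target, return_tensors="pt").input_ids
            loss = model(**batch, labels=labels).loss
            loss.backward()
            optimizer.step()
            optimizer.zero_grad()

# Stage 1: adapt on auxiliary/augmented pairs (placeholder data).
fine_tune([("<H> Alan_Bean <R> mission <T> Apollo_12",
            "Alan Bean flew on Apollo 12.")], epochs=1)

# Stage 2: continue on the clean target training set (placeholder data).
fine_tune([("<H> Alan_Bean <R> occupation <T> Test_pilot",
            "Alan Bean was a test pilot.")], epochs=1)
```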