Data-to-text Generation with Variational Sequential Planning

28 Feb 2022 · Ratish Puduppully, Yao Fu, Mirella Lapata

We consider the task of data-to-text generation, which aims to create textual output from non-linguistic input. We focus on generating long-form text, i.e., documents with multiple paragraphs, and propose a neural model enhanced with a planning component responsible for organizing high-level information in a coherent and meaningful way. We infer latent plans sequentially with a structured variational model, while interleaving the steps of planning and generation. Text is generated by conditioning on previous variational decisions and previously generated text. Experiments on two data-to-text benchmarks (RotoWire and MLB) show that our model outperforms strong baselines and is sample efficient in the face of limited training data (e.g., a few hundred instances).
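The abstract describes interleaved planning and generation with sequentially inferred latent plans, but leaves the mechanics implicit. Below is a minimal, self-contained sketch of that idea: at each paragraph step, a discrete latent plan is inferred by a variational posterior that sees the gold paragraph, scored against a prior conditioned only on previous plans and previously generated text, and then used to condition the paragraph decoder. Every name here (`InterleavedPlanner`, `plan_emb`, the GRU encoders) is an illustrative assumption, not the paper's actual SeqPlan architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class InterleavedPlanner(nn.Module):
    """Toy sketch of variational sequential planning for data-to-text.

    One step per paragraph: infer plan z_t (posterior), score it under a
    prior that only conditions on z_<t and y_<t, decode y_t given z_t.
    Hypothetical architecture; the real SeqPlan model differs.
    """

    def __init__(self, vocab_size, n_plans=8, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.para_enc = nn.GRU(hidden, hidden, batch_first=True)
        self.state_rnn = nn.GRUCell(hidden, hidden)  # carries z_<t, y_<t
        self.plan_emb = nn.Embedding(n_plans, hidden)
        self.prior = nn.Linear(hidden, n_plans)          # p(z_t | z_<t, y_<t)
        self.post = nn.Linear(2 * hidden, n_plans)       # q(z_t | z_<t, y_<=t)
        self.dec = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, paragraphs):
        """paragraphs: list of (batch, len>=2) long tensors, one per paragraph."""
        batch = paragraphs[0].size(0)
        h = paragraphs[0].new_zeros(
            batch, self.state_rnn.hidden_size, dtype=torch.float)
        nll, kl = 0.0, 0.0
        for y_t in paragraphs:
            emb = self.embed(y_t)
            _, para = self.para_enc(emb)        # encode the gold paragraph
            para = para.squeeze(0)
            prior_logits = self.prior(h)
            post_logits = self.post(torch.cat([h, para], -1))
            q = F.softmax(post_logits, -1)
            # per-step KL(q || p) between categorical plan distributions
            kl = kl + (q * (F.log_softmax(post_logits, -1)
                            - F.log_softmax(prior_logits, -1))).sum(-1).mean()
            # hard plan choice keeps the sketch simple; real training would
            # marginalize over plans or use a relaxed/score-function estimator
            z_emb = self.plan_emb(q.argmax(-1))
            dec_out, _ = self.dec(emb, z_emb.unsqueeze(0))
            logits = self.out(dec_out[:, :-1])  # teacher-forced next-token loss
            nll = nll + F.cross_entropy(
                logits.reshape(-1, logits.size(-1)), y_t[:, 1:].reshape(-1))
            h = self.state_rnn(para + z_emb, h)  # fold z_t, y_t into the state
        return nll + kl                          # negative ELBO

# Usage on dummy data: three paragraphs of 12 tokens for a batch of 2.
toy = InterleavedPlanner(vocab_size=100)
loss = toy([torch.randint(0, 100, (2, 12)) for _ in range(3)])
```

The key structural point the sketch preserves is the asymmetry between prior and posterior: the prior only sees the running state (previous plans and text), so at test time plans can be sampled sequentially before each paragraph is generated.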

Task                     Dataset                            Model    Metric     Value  Global Rank
Data-to-Text Generation  MLB Dataset                        SeqPlan  BLEU       14.29  #1
Data-to-Text Generation  MLB Dataset (Content Ordering)     SeqPlan  DLD        22.7   #1
Data-to-Text Generation  MLB Dataset (Content Selection)    SeqPlan  Precision  43.3   #2
Data-to-Text Generation  MLB Dataset (Content Selection)    SeqPlan  Recall     53.5   #2
Data-to-Text Generation  MLB Dataset (Relation Generation)  SeqPlan  Precision  95.9   #1
Data-to-Text Generation  MLB Dataset (Relation Generation)  SeqPlan  Count      28.9   #2
Data-to-Text Generation  RotoWire (Relation Generation)     SeqPlan  Count      46.7   #1
Data-to-Text Generation  RotoWire (Relation Generation)     SeqPlan  Precision  97.6   #1
