Data-to-Text Generation with Content Selection and Planning

3 Sep 2018 · Ratish Puduppully, Li Dong, Mirella Lapata

Recent advances in data-to-text generation have led to the use of large-scale datasets and neural network models which are trained end-to-end, without explicitly modeling what to say and in what order. In this work, we present a neural network architecture which incorporates content selection and planning without sacrificing end-to-end training.
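
The architecture named here decomposes generation into two stages: first select and order the input records worth mentioning (a content plan), then generate the document conditioned on that plan. The sketch below is a minimal, hedged illustration of that two-stage idea in PyTorch, not the authors' implementation: the content-selection gate, the pointer-style planner, and the attention-only decoder (the conditional copy mechanism is omitted) are simplified stand-ins, and all module names and dimensions are assumptions.

```python
# Minimal sketch of the content-selection-and-planning idea (illustrative only).
import torch
import torch.nn as nn

class ContentPlanner(nn.Module):
    """Stage 1: score the input records and emit an ordered content plan."""
    def __init__(self, record_dim, hidden_dim):
        super().__init__()
        self.encode = nn.Linear(record_dim, hidden_dim)
        self.gate = nn.Linear(hidden_dim, hidden_dim)   # content-selection gate (simplified)
        self.pointer = nn.Linear(hidden_dim, 1)         # scores each record

    def forward(self, records, plan_length):
        # records: (num_records, record_dim)
        h = torch.tanh(self.encode(records))
        h = h * torch.sigmoid(self.gate(h))             # damp records unlikely to be mentioned
        scores = self.pointer(h).squeeze(-1)            # (num_records,)
        # Greedy top-k selection stands in for the paper's pointer-network plan decoder.
        plan_idx = scores.topk(plan_length).indices
        return h[plan_idx]                              # (plan_length, hidden_dim)

class TextDecoder(nn.Module):
    """Stage 2: generate text while attending over the content plan
    (plain attention; no conditional copy here)."""
    def __init__(self, vocab_size, hidden_dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_dim)
        self.rnn = nn.LSTM(hidden_dim * 2, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, plan, tokens):
        # plan: (plan_length, hidden_dim); tokens: (batch, seq_len)
        emb = self.embed(tokens)                        # (batch, seq_len, hidden_dim)
        attn = torch.softmax(emb @ plan.T, dim=-1)      # attention over plan entries
        context = attn @ plan                           # (batch, seq_len, hidden_dim)
        output, _ = self.rnn(torch.cat([emb, context], dim=-1))
        return self.out(output)                         # next-token logits

# Toy usage: 50 records of dimension 8, a 5-record plan, a 10-token prefix.
records = torch.randn(50, 8)
planner, decoder = ContentPlanner(8, 32), TextDecoder(vocab_size=100, hidden_dim=32)
plan = planner(records, plan_length=5)
logits = decoder(plan, torch.randint(0, 100, (1, 10)))
print(logits.shape)  # torch.Size([1, 10, 100])
```

In the paper itself the plan is produced by a pointer-network decoder trained on content plans extracted from the reference summaries, and the text decoder uses conditional copy over record values; the greedy selection and plain attention above only stand in for those components.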

TASK | DATASET | MODEL | METRIC | VALUE | GLOBAL RANK
Data-to-Text Generation | RotoWire | Neural Content Planning + conditional copy | BLEU | 16.50 | #2
Data-to-Text Generation | RotoWire (Content Ordering) | Neural Content Planning + conditional copy | DLD | 18.58% | #2
Data-to-Text Generation | RotoWire (Content Selection) | Neural Content Planning + conditional copy | Precision | 34.18% | #2
Data-to-Text Generation | RotoWire (Content Selection) | Neural Content Planning + conditional copy | Recall | 51.22% | #2
Data-to-Text Generation | RotoWire (Relation Generation) | Neural Content Planning + conditional copy | count | 34.28 | #1
Data-to-Text Generation | RotoWire (Relation Generation) | Neural Content Planning + conditional copy | Precision | 87.47% | #2
