May the Force Be with Your Copy Mechanism: Enhanced Supervised-Copy Method for Natural Language Generation

Recent neural sequence-to-sequence models with a copy mechanism have achieved remarkable progress on various text generation tasks. These models address out-of-vocabulary problems and facilitate the generation of rare words. However, identifying which words should be copied is difficult, and prior copy models consequently suffer from incorrect generation and a lack of abstractness. In this paper, we propose a novel supervised approach to training a copy network that helps the model decide which words need to be copied and which need to be generated. Specifically, we redefine the objective function to leverage source sequences and target vocabularies as guidance for copying. Experimental results on data-to-text generation and abstractive summarization tasks verify that our approach enhances copying quality and improves the degree of abstractness.
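The paper's code is not attached to this page, but the core idea, supervising the copy decision with labels derived from source/target overlap, can be sketched roughly as follows. This is a minimal PyTorch illustration, not the authors' implementation: the names (SupervisedCopyGate, copy_labels, supervised_copy_loss) are hypothetical, and the labeling rule and loss form are our assumptions about how such supervision could be realized.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SupervisedCopyGate(nn.Module):
    """A pointer-generator style copy gate: at each decoder step it emits
    p_copy, the probability of copying from the source rather than
    generating from the target vocabulary."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.gate = nn.Linear(hidden_size, 1)

    def forward(self, decoder_states: torch.Tensor) -> torch.Tensor:
        # decoder_states: (batch, tgt_len, hidden) -> p_copy: (batch, tgt_len)
        return torch.sigmoid(self.gate(decoder_states)).squeeze(-1)

def copy_labels(source_ids: torch.Tensor, target_ids: torch.Tensor) -> torch.Tensor:
    """Binary supervision for the gate: label a target token 1 if it also
    occurs in the source sequence (a copy candidate), else 0. Deriving
    labels from source/target overlap is our reading of the paper's
    "guidance"; the authors' exact rule may differ."""
    labels = torch.zeros_like(target_ids, dtype=torch.float)
    for b in range(target_ids.size(0)):
        source_set = set(source_ids[b].tolist())
        for t in range(target_ids.size(1)):
            if target_ids[b, t].item() in source_set:
                labels[b, t] = 1.0
    return labels

def supervised_copy_loss(p_copy, source_ids, target_ids, pad_id=0):
    """Auxiliary binary cross-entropy pushing p_copy toward the labels,
    masked so padding positions do not contribute."""
    labels = copy_labels(source_ids, target_ids)
    mask = (target_ids != pad_id).float()
    loss = F.binary_cross_entropy(p_copy, labels, reduction="none")
    return (loss * mask).sum() / mask.sum().clamp(min=1.0)
```

In training, this auxiliary term would be added to the usual generation loss, e.g. `loss = nll + lambda_copy * supervised_copy_loss(...)`; the weighting scheme shown here is an assumption, not the paper's published objective.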

PDF | Abstract | arXiv 2021
Task | Dataset | Model | Metric | Value | Global Rank
Data-to-Text Generation | MLB Dataset | Force-Copy | BLEU | 10.5 | #4
Data-to-Text Generation | MLB Dataset (Content Ordering) | Force-Copy | DLD | 21.16 | #3
Data-to-Text Generation | MLB Dataset (Content Selection) | Force-Copy | Precision | 49.39 | #1
Data-to-Text Generation | MLB Dataset (Content Selection) | Force-Copy | Recall | 50.89 | #3
Data-to-Text Generation | MLB Dataset (Relation Generation) | Force-Copy | Precision | 84.50 | #3
Data-to-Text Generation | MLB Dataset (Relation Generation) | Force-Copy | Count | 21.05 | #4
Data-to-Text Generation | RotoWire | Force-Copy | BLEU | 17.26 | #3
Data-to-Text Generation | RotoWire (Content Ordering) | Force-Copy | DLD | 17.26% | #4
Data-to-Text Generation | RotoWire (Content Ordering) | Force-Copy | BLEU | 15.8 | #3
Data-to-Text Generation | RotoWire (Content Selection) | Force-Copy | Precision | 34.34% | #2
Data-to-Text Generation | RotoWire (Content Selection) | Force-Copy | Recall | 48.85% | #4
Data-to-Text Generation | RotoWire (Relation Generation) | Force-Copy | Count | 27.37 | #4
Data-to-Text Generation | RotoWire (Relation Generation) | Force-Copy | Precision | 95.40% | #3
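For readers unfamiliar with the content-ordering metric above: DLD is the normalized Damerau-Levenshtein distance between the sequence of records mentioned in the generated text and the sequence in the reference, following the RotoWire evaluation protocol of Wiseman et al. (2017). A minimal Python sketch of the distance is below; it assumes the record sequences have already been extracted, and the normalization convention shown is one common choice, not necessarily the exact one used by the benchmark scripts.

```python
def damerau_levenshtein(a, b):
    """Restricted Damerau-Levenshtein (optimal string alignment) distance:
    the minimum number of insertions, deletions, substitutions, and
    adjacent transpositions turning sequence a into sequence b."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,         # deletion
                d[i][j - 1] + 1,         # insertion
                d[i - 1][j - 1] + cost,  # substitution (or match)
            )
            if i > 1 and j > 1 and a[i - 1] == b[j - 2] and a[i - 2] == b[j - 1]:
                d[i][j] = min(d[i][j], d[i - 2][j - 2] + 1)  # transposition
    return d[m][n]

def normalized_dld(pred_order, ref_order):
    """Complement of the distance divided by the longer sequence length,
    so that higher means closer agreement in record ordering."""
    if not pred_order and not ref_order:
        return 1.0
    dist = damerau_levenshtein(pred_order, ref_order)
    return 1.0 - dist / max(len(pred_order), len(ref_order))
```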
