Cutting-off Redundant Repeating Generations for Neural Abstractive Summarization

EACL 2017  ·  Jun Suzuki, Masaaki Nagata

This paper tackles the reduction of redundant repeating generation that is often observed in RNN-based encoder-decoder models. Our basic idea is to jointly estimate the upper-bound frequency of each target vocabulary word in the encoder and to control the output words in the decoder based on that estimate. Our method shows significant improvement over a strong RNN-based encoder-decoder baseline and achieves strong results on abstractive summarization benchmarks.
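The core idea, decoding under per-word frequency upper bounds, can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the authors' implementation): assume the encoder has produced an estimated maximum frequency for each vocabulary word, and a greedy decoder masks out any word whose emitted count has reached its budget before picking the next token. The function name, score format, and budgets are all illustrative assumptions.

```python
def constrained_greedy_decode(step_scores, freq_upper_bound, eos="<eos>"):
    """Greedy decoding with per-word frequency upper bounds (illustrative sketch).

    step_scores: list of dicts {token: score}, one dict per decoding step.
    freq_upper_bound: dict {token: max allowed occurrences}, assumed to be
        estimated by the encoder (as in the paper's basic idea).
    Returns the decoded token sequence.
    """
    counts = {}   # how many times each token has been emitted so far
    output = []
    for scores in step_scores:
        # Mask out tokens whose frequency budget is exhausted.
        allowed = {t: s for t, s in scores.items()
                   if counts.get(t, 0) < freq_upper_bound.get(t, 1)}
        if not allowed:
            break
        tok = max(allowed, key=allowed.get)  # greedy pick among allowed tokens
        if tok == eos:
            break
        output.append(tok)
        counts[tok] = counts.get(tok, 0) + 1
    return output
```

For example, if "the" has a budget of 1 and would otherwise be the top choice at two consecutive steps, the second step falls back to the next-best allowed word, which is exactly the repetition cut-off behavior the abstract describes.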


Datasets


| Task               | Dataset         | Model      | Metric  | Value | Global Rank |
|--------------------|-----------------|------------|---------|-------|-------------|
| Text Summarization | DUC 2004 Task 1 | EndDec+WFE | ROUGE-1 | 32.28 | #4          |
| Text Summarization | DUC 2004 Task 1 | EndDec+WFE | ROUGE-2 | 10.54 | #7          |
| Text Summarization | DUC 2004 Task 1 | EndDec+WFE | ROUGE-L | 27.80 | #4          |
| Text Summarization | GigaWord        | EndDec+WFE | ROUGE-1 | 36.30 | #29         |
| Text Summarization | GigaWord        | EndDec+WFE | ROUGE-2 | 17.31 | #31         |
| Text Summarization | GigaWord        | EndDec+WFE | ROUGE-L | 33.88 | #28         |

Methods


No methods listed for this paper.