Get To The Point: Summarization with Pointer-Generator Networks

Neural sequence-to-sequence models have provided a viable new approach for abstractive text summarization (meaning they are not restricted to simply selecting and rearranging passages from the original text). However, these models have two shortcomings: they are liable to reproduce factual details inaccurately, and they tend to repeat themselves. In this work we propose a novel architecture that augments the standard sequence-to-sequence attentional model in two orthogonal ways. First, we use a hybrid pointer-generator network that can copy words from the source text via pointing, which aids accurate reproduction of information, while retaining the ability to produce novel words through the generator. Second, we use coverage to keep track of what has been summarized, which discourages repetition. We apply our model to the CNN / Daily Mail summarization task, outperforming the current abstractive state-of-the-art by at least 2 ROUGE points.
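The two mechanisms the abstract describes can be sketched numerically. The pointer-generator computes a final distribution P(w) = p_gen · P_vocab(w) + (1 − p_gen) · Σ_{i: w_i = w} a_i, mixing the generator's vocabulary distribution with the attention (copy) distribution over source tokens; coverage sums past attention and penalizes re-attending via Σ_i min(a_i, c_i). Below is a minimal illustrative sketch with toy values — the function names and numbers are our own, not from the paper's code.

```python
import numpy as np

def final_distribution(p_gen, p_vocab, attention, src_ids):
    """Pointer-generator mixture:
    P(w) = p_gen * P_vocab(w) + (1 - p_gen) * sum of attention on source
    positions i whose token w_i equals w."""
    dist = p_gen * p_vocab
    # Scatter-add copy probability onto the source tokens' vocab ids;
    # np.add.at accumulates correctly over repeated ids.
    np.add.at(dist, src_ids, (1.0 - p_gen) * attention)
    return dist

def coverage_loss(attention, coverage):
    """covloss_t = sum_i min(a_i^t, c_i^t): attending again to an
    already-covered source position is penalized."""
    return np.minimum(attention, coverage).sum()

# Toy example: vocabulary of 6 words, source of 3 tokens.
p_vocab = np.array([0.1, 0.2, 0.3, 0.2, 0.1, 0.1])  # generator output
attention = np.array([0.5, 0.3, 0.2])                # over source positions
src_ids = np.array([2, 4, 2])                        # source tokens' vocab ids
dist = final_distribution(0.8, p_vocab, attention, src_ids)

coverage = np.array([0.4, 0.6, 0.1])  # sum of attention at earlier steps
loss = coverage_loss(attention, coverage)
```

Because the mixture is a convex combination of two probability distributions, `dist` still sums to 1; a source word repeated in the input (vocab id 2 above) accumulates copy mass from every position where it occurs.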

ACL 2017
Task                            Dataset           Model                         Metric   Value  Global Rank
Text Summarization              arXiv             Pntr-Gen-Seq2Seq              ROUGE-1  32.06  # 20
Document Summarization          CNN / Daily Mail  Lead-3                        ROUGE-1  40.34  # 15
                                                                                ROUGE-2  17.70  # 16
                                                                                ROUGE-L  36.57  # 17
Extractive Text Summarization   CNN / Daily Mail  Lead-3 baseline               ROUGE-1  40.34  # 11
                                                                                ROUGE-2  17.70  # 12
                                                                                ROUGE-L  36.57  # 12
Abstractive Text Summarization  CNN / Daily Mail  Pointer-Generator + Coverage  ROUGE-1  39.53  # 37
                                                                                ROUGE-2  17.28  # 36
Text Summarization              PubMed            Pntr-Gen-Seq2Seq              ROUGE-1  35.86  # 21

Results from Other Papers


Task                            Dataset           Model             Metric   Value  Rank
Abstractive Text Summarization  CNN / Daily Mail  PTGEN + Coverage  ROUGE-1  39.53  # 37
                                                                    ROUGE-2  17.28  # 36
                                                                    ROUGE-L  36.38  # 38
