A Neural Attention Model for Abstractive Sentence Summarization

EMNLP 2015 · Alexander M. Rush, Sumit Chopra, Jason Weston

Summarization based on text extraction is inherently limited, but generation-style abstractive methods have proven challenging to build. In this work, we propose a fully data-driven approach to abstractive sentence summarization. Our method utilizes a local attention-based model that generates each word of the summary conditioned on the input sentence. While the model is structurally simple, it can easily be trained end-to-end and scales to a large amount of training data. The model shows significant performance gains on the DUC-2004 shared task compared with several strong baselines.
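The abstract's description maps onto a small amount of code: an attention-weighted average of input word embeddings is combined with a feed-forward language model over the last few summary words to score the next word. The sketch below is a hedged PyTorch reconstruction; the layer names, dimensions, and the omission of the paper's local smoothing of input embeddings are simplifying assumptions, not the authors' exact configuration.

```python
# Minimal PyTorch sketch of the attention-based summarizer ("ABS").
# All names and sizes are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ABSSketch(nn.Module):
    def __init__(self, vocab_size, emb_dim=64, hid_dim=64, ctx_size=5):
        super().__init__()
        self.src_emb = nn.Embedding(vocab_size, emb_dim)  # input-side embeddings
        self.att_emb = nn.Embedding(vocab_size, emb_dim)  # context embeddings for attention
        self.dec_emb = nn.Embedding(vocab_size, emb_dim)  # context embeddings for the NNLM
        self.P = nn.Linear(ctx_size * emb_dim, emb_dim, bias=False)  # alignment map
        self.U = nn.Linear(ctx_size * emb_dim, hid_dim)              # NNLM hidden layer
        self.V = nn.Linear(hid_dim, vocab_size, bias=False)          # language-model scores
        self.W = nn.Linear(emb_dim, vocab_size, bias=False)          # encoder scores

    def forward(self, src, ctx):
        # src: (B, M) input sentence; ctx: (B, C) previous C summary words.
        x = self.src_emb(src)                                       # (B, M, D)
        g = self.att_emb(ctx).flatten(1)                            # (B, C*D)
        # Soft alignment: score each input word against the summary context,
        # then take the attention-weighted average of input embeddings.
        scores = torch.bmm(x, self.P(g).unsqueeze(2)).squeeze(2)    # (B, M)
        p = F.softmax(scores, dim=1)
        enc = torch.bmm(p.unsqueeze(1), x).squeeze(1)               # (B, D)
        # Feed-forward NNLM over the fixed context window, combined with enc.
        h = torch.tanh(self.U(self.dec_emb(ctx).flatten(1)))        # (B, H)
        return F.log_softmax(self.V(h) + self.W(enc), dim=1)        # next-word log-probs
```

Decoding repeats this next-word distribution one position at a time, sliding the ctx_size window over the words generated so far; the paper uses beam search rather than greedy generation.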

Task                           Dataset          Model  Metric Name  Metric Value  Global Rank
Extractive Text Summarization  DUC 2004 Task 1  Abs    ROUGE-1      26.55         # 1
Extractive Text Summarization  DUC 2004 Task 1  Abs    ROUGE-2       7.06         # 1
Extractive Text Summarization  DUC 2004 Task 1  Abs    ROUGE-L      22.05         # 1
Text Summarization             DUC 2004 Task 1  Abs+   ROUGE-1      28.18         # 11
Text Summarization             DUC 2004 Task 1  Abs+   ROUGE-2       8.49         # 11
Text Summarization             DUC 2004 Task 1  Abs+   ROUGE-L      23.81         # 11
Text Summarization             DUC 2004 Task 1  ABS    ROUGE-L      22.05         # 12
Text Summarization             GigaWord         Abs+   ROUGE-1      31.00         # 37
Text Summarization             GigaWord         Abs    ROUGE-1      30.88         # 38
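For context on the metric columns: ROUGE-N measures n-gram overlap between the system summary and a reference, while ROUGE-L is based on the longest common subsequence and is not shown here. Below is a minimal sketch of ROUGE-N recall, the flavor reported for DUC-2004; the function names and whitespace tokenization are my assumptions, and the official evaluation additionally truncates system output to 75 bytes and supports stemming, which this version omits.

```python
# Minimal sketch of ROUGE-N recall (illustrative, not the official toolkit).
from collections import Counter

def ngrams(tokens, n):
    """Multiset of n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def rouge_n_recall(candidate, reference, n):
    """Clipped n-gram overlap divided by the reference n-gram count."""
    cand, ref = ngrams(candidate, n), ngrams(reference, n)
    overlap = sum(min(count, cand[g]) for g, count in ref.items())
    total = sum(ref.values())
    return overlap / total if total else 0.0

# Example: 3 of the 4 reference unigrams appear in the candidate -> 0.75.
print(rouge_n_recall("the cat sat".split(), "the cat sat down".split(), n=1))
```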

Methods


No methods listed for this paper.