Selective Encoding for Abstractive Sentence Summarization

ACL 2017 · Qingyu Zhou, Nan Yang, Furu Wei, Ming Zhou

We propose a selective encoding model that extends the sequence-to-sequence framework for abstractive sentence summarization. It consists of a sentence encoder, a selective gate network, and an attention-equipped decoder. The sentence encoder and decoder are built with recurrent neural networks. The selective gate network constructs a second-level sentence representation by controlling the information flow from encoder to decoder. This second-level representation is tailored to the sentence summarization task, which leads to better performance. We evaluate our model on the English Gigaword, DUC 2004, and MSR abstractive sentence summarization datasets. The experimental results show that the proposed selective encoding model outperforms the state-of-the-art baseline models.
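The gating idea lends itself to a short sketch. Below is a minimal PyTorch illustration of a selective gate of this kind, assuming a bidirectional RNN encoder whose final forward and backward states are concatenated into a sentence vector; the layer names (W_h, W_s), tensor shapes, and wiring are assumptions for illustration, not the paper's exact configuration.

```python
# Minimal sketch of a selective gate, assuming a BiRNN encoder with
# hidden size `hidden_size` per direction. Names and shapes are assumptions.
import torch
import torch.nn as nn

class SelectiveGate(nn.Module):
    def __init__(self, hidden_size: int):
        super().__init__()
        # Project each first-level encoder state and the whole-sentence
        # vector into a per-dimension gate in (0, 1).
        self.W_h = nn.Linear(2 * hidden_size, 2 * hidden_size, bias=False)
        self.W_s = nn.Linear(2 * hidden_size, 2 * hidden_size, bias=True)

    def forward(self, enc_states: torch.Tensor, sent_vec: torch.Tensor) -> torch.Tensor:
        # enc_states: (batch, seq_len, 2*hidden) -- first-level encoder outputs
        # sent_vec:   (batch, 2*hidden)          -- concatenated final fwd/bwd states
        gate = torch.sigmoid(self.W_h(enc_states) + self.W_s(sent_vec).unsqueeze(1))
        # Second-level representation: the gate controls, element-wise,
        # how much of each encoder state flows on to the decoder.
        return enc_states * gate
```

In such a setup, the attention-equipped decoder would attend over the gated second-level states rather than the raw encoder outputs, so that words irrelevant to the summary are suppressed before decoding begins.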


Datasets

English Gigaword · DUC 2004 · MSR abstractive sentence summarization dataset
Task                 Dataset           Model   Metric    Value   Global Rank
Text Summarization   DUC 2004 Task 1   SEASS   ROUGE-1   29.21   # 8
Text Summarization   DUC 2004 Task 1   SEASS   ROUGE-2    9.56   # 9
Text Summarization   DUC 2004 Task 1   SEASS   ROUGE-L   25.51   # 7
Text Summarization   GigaWord          SEASS   ROUGE-1   36.15   # 32
Text Summarization   GigaWord          SEASS   ROUGE-2   17.54   # 30
Text Summarization   GigaWord          SEASS   ROUGE-L   33.63   # 31

Methods


Sequence-to-sequence framework · Recurrent neural network encoder and decoder · Selective gate network · Attention mechanism