Improving Neural Abstractive Document Summarization with Structural Regularization

EMNLP 2018  ·  Wei Li, Xinyan Xiao, Yajuan Lyu, Yuanzhuo Wang

Recent neural sequence-to-sequence models have shown significant progress on short text summarization. For document summarization, however, they fail to capture the long-term structure of both documents and multi-sentence summaries, resulting in information loss and repetition. In this paper, we propose to leverage the structural information of both documents and multi-sentence summaries to improve document summarization performance. Specifically, we incorporate both structural-compression and structural-coverage regularization into the summarization process to capture the information compression and information coverage properties, the two most important structural properties of document summarization. Experimental results demonstrate that structural regularization enables our model to generate more informative and concise summaries, significantly outperforming state-of-the-art neural abstractive methods.
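This page carries no implementation, but the two regularizers can be made concrete. Below is a minimal sketch, assuming a sentence-level attention matrix (one row per generated summary sentence, one column per document sentence) as the structural signal; the function name `structural_regularizers` and the exact formulas (entropy for compression, min-overlap for coverage) are illustrative assumptions in the spirit of the abstract, not the authors' objectives.

```python
import torch

def structural_regularizers(attn: torch.Tensor,
                            comp_weight: float = 1.0,
                            cov_weight: float = 1.0) -> torch.Tensor:
    """Hypothetical structural regularization terms.

    attn: (num_summary_sents, num_doc_sents) sentence-level attention;
          each row is a distribution over document sentences (sums to 1).
    """
    eps = 1e-8

    # Structural compression: each summary sentence should be compressed
    # from only a few document sentences, so penalize high-entropy
    # (spread-out) attention rows.
    row_entropy = -(attn * (attn + eps).log()).sum(dim=1)
    compression_loss = row_entropy.mean()

    # Structural coverage: different summary sentences should cover
    # different document sentences, so penalize overlap between each
    # attention row and the attention accumulated by earlier rows
    # (in the spirit of the coverage loss of See et al., 2017).
    coverage = torch.zeros_like(attn[0])
    overlap = attn.new_zeros(())
    for row in attn:
        overlap = overlap + torch.minimum(row, coverage).sum()
        coverage = coverage + row
    coverage_loss = overlap / attn.size(0)

    return comp_weight * compression_loss + cov_weight * coverage_loss


# Usage: a toy attention matrix for 3 summary sentences over 6 document
# sentences; the regularizer would be added to the usual NLL training loss.
attn = torch.softmax(torch.randn(3, 6), dim=1)
print(structural_regularizers(attn).item())
```

Low row entropy pushes each summary sentence toward a sparse set of source sentences (compression), while the min-overlap term discourages two summary sentences from drawing on the same source sentence (coverage).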


Datasets

CNN / Daily Mail

Results

Task: Abstractive Text Summarization
Dataset: CNN / Daily Mail
Model: Li et al.

Metric    Value   Global Rank
ROUGE-1   40.30   # 43
ROUGE-2   18.02   # 38
ROUGE-L   37.36   # 39
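For reference, scores in the metric variants reported above can be computed with Google's rouge-score package; the snippet below is a minimal usage sketch (the example strings are placeholders, not data from the paper).

```python
from rouge_score import rouge_scorer

# Score one generated summary against its reference using the same
# metric variants as the table above.
scorer = rouge_scorer.RougeScorer(["rouge1", "rouge2", "rougeL"],
                                  use_stemmer=True)

reference = "police arrest suspect after downtown robbery ."
generated = "a suspect was arrested after a robbery downtown ."

scores = scorer.score(reference, generated)
for name, score in scores.items():
    # Each entry holds precision, recall, and F1; papers usually report F1.
    print(f"{name}: F1 = {score.fmeasure:.4f}")
```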
