The Summary Loop: Learning to Write Abstractive Summaries Without Examples

This work presents a new approach to unsupervised abstractive summarization based on maximizing a combination of coverage and fluency under a given length constraint. It introduces a method that encourages the inclusion of key terms from the original document in the summary: key terms are masked out of the original document and must be filled in by a coverage model using the current generated summary. An unsupervised training procedure leverages this coverage model along with a fluency model to generate and score summaries. When tested on popular news summarization datasets, the method outperforms previous unsupervised methods by more than 2 R-1 points and approaches the results of competitive supervised methods. The model attains higher levels of abstraction, with copied passages roughly half as long as in prior work, and learns to compress and merge sentences without supervision.
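To make the coverage idea concrete, below is a minimal sketch of a coverage score under stated assumptions: an off-the-shelf masked language model (bert-base-uncased via Hugging Face transformers) stands in for the paper's fine-tuned coverage model, and the keyword set is supplied by the caller rather than computed as in the paper. The function name coverage_score and the input conditioning format are illustrative, not the authors' implementation.

```python
# Sketch: mask keywords in the document, let a masked LM fill them in while
# conditioning on the candidate summary, and score the summary by the
# fraction of keywords recovered. Assumes keywords are lowercase wordpiece
# tokens matching the tokenizer's vocabulary.
import torch
from transformers import BertForMaskedLM, BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

def coverage_score(document: str, summary: str, keywords: set[str]) -> float:
    """Fraction of masked keywords the fill-in model recovers when conditioned
    on the candidate summary (higher = the summary covers more key content)."""
    doc_tokens = tokenizer.tokenize(document)
    masked = [tokenizer.mask_token if t in keywords else t for t in doc_tokens]
    # Condition the fill-in model on the summary: "<summary> [SEP] <masked doc>".
    text = summary + f" {tokenizer.sep_token} " + tokenizer.convert_tokens_to_string(masked)
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
    with torch.no_grad():
        logits = model(**inputs).logits
    # Predict a token at each [MASK] position.
    mask_positions = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
    preds = logits[0, mask_positions].argmax(dim=-1)
    predicted = tokenizer.convert_ids_to_tokens(preds.tolist())
    # Compare predictions to the original keyword tokens, in document order
    # (truncation may drop masks near the end; zip handles the mismatch).
    originals = [t for t in doc_tokens if t in keywords]
    hits = sum(p == o for p, o in zip(predicted, originals))
    return hits / max(len(originals), 1)
```

In the paper's training loop this score is combined with a fluency score to rank generated summaries; a summary that lets the coverage model recover more of the masked key terms is rewarded.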

Published at ACL 2020.
Results:
  Task: Abstractive Text Summarization
  Dataset: CNN / Daily Mail
  Model: Summary Loop Unsup
  Metric: ROUGE-1 = 37.7
  Global Rank: #51
