Neural Extractive Text Summarization with Syntactic Compression

IJCNLP 2019 · Jiacheng Xu, Greg Durrett

Recent neural network approaches to summarization are largely either selection-based extraction or generation-based abstraction. In this work, we present a neural model for single-document summarization based on joint extraction and syntactic compression. Our model chooses sentences from the document, identifies possible compressions based on constituency parses, and scores those compressions with a neural model to produce the final summary. For learning, we construct oracle extractive-compressive summaries, then learn both of our components jointly with this supervision. Experimental results on the CNN/Daily Mail and New York Times datasets show that our model achieves strong performance (comparable to state-of-the-art systems) as evaluated by ROUGE. Moreover, our approach outperforms an off-the-shelf compression module, and human and manual evaluation shows that our model's output generally remains grammatical.
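The pipeline in the abstract has two stages: extract sentences, then pick a compression of each extracted sentence from candidates derived from its constituency parse. The sketch below is a hypothetical illustration of that control flow, not the authors' implementation: sentence scores, deletable spans, and the compression scorer are all stand-ins (a real system would obtain spans from a parser and scores from a trained neural model).

```python
# Hypothetical sketch of an extract-then-compress pipeline.
# All names and scoring functions are illustrative assumptions,
# not the model described in the paper.

def extract_sentences(doc_sentences, scores, k=2):
    """Extraction step: keep the k highest-scoring sentences,
    preserving their original document order."""
    ranked = sorted(range(len(doc_sentences)), key=lambda i: -scores[i])
    keep = sorted(ranked[:k])
    return [doc_sentences[i] for i in keep]

def candidate_compressions(sentence, deletable_spans):
    """Enumerate candidates: each deletes one token span, standing in
    for an optional constituent (e.g. a PP or relative clause) that a
    constituency parse would license for deletion. Keeping the
    sentence uncompressed is always a candidate."""
    tokens = sentence.split()
    cands = [tokens]
    for start, end in deletable_spans:
        cands.append(tokens[:start] + tokens[end:])
    return [" ".join(c) for c in cands]

def compress(sentence, deletable_spans, score_fn):
    """Compression step: return the candidate the scorer prefers
    (here score_fn is a toy stand-in for the neural scorer)."""
    return max(candidate_compressions(sentence, deletable_spans), key=score_fn)
```

For example, with a toy scorer that simply prefers shorter outputs, `compress("The cat , which was old , slept .", [(2, 7)], lambda s: -len(s.split()))` deletes the relative clause and returns `"The cat slept ."`.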

| Task | Dataset | Model | Metric | Value | Global Rank |
| --- | --- | --- | --- | --- | --- |
| Extractive Text Summarization | CNN / Daily Mail | HAHSum | ROUGE-2 | 21.30 | # 1 |
| Extractive Text Summarization | CNN / Daily Mail | HAHSum | ROUGE-1 | 44.68 | # 1 |
| Extractive Text Summarization | CNN / Daily Mail | HAHSum | ROUGE-L | 40.75 | # 1 |
