Deep Recurrent Generative Decoder for Abstractive Text Summarization

EMNLP 2017 · Piji Li, Wai Lam, Lidong Bing, Zihao Wang

We propose a new framework for abstractive text summarization based on a sequence-to-sequence encoder-decoder model equipped with a deep recurrent generative decoder (DRGD). Latent structure information implied in the target summaries is learned with a recurrent latent random model to improve summarization quality. Neural variational inference is employed to address the intractable posterior inference over the recurrent latent variables. Abstractive summaries are generated from both the generative latent variables and the discriminative deterministic states. Extensive experiments on benchmark datasets in different languages show that DRGD achieves improvements over state-of-the-art methods.
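As an illustration of the core idea, here is a minimal PyTorch sketch (not the authors' code) of one decoding step with a recurrent latent variable: the approximate posterior is parameterized from the deterministic GRU state, a latent sample z_t is drawn via the reparameterization trick, and the output distribution conditions on both the deterministic state and the latent sample, with a per-step KL term for the variational lower bound. All names (`LatentGRUDecoder`, `hidden_size`, `latent_size`, etc.) are hypothetical, and attention over the encoder states is omitted for brevity.

```python
import torch
import torch.nn as nn

class LatentGRUDecoder(nn.Module):
    """Simplified recurrent generative decoder step (illustrative sketch)."""

    def __init__(self, vocab_size, embed_size=128, hidden_size=256, latent_size=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_size)
        # Deterministic recurrence consumes the previous token and latent sample.
        self.gru = nn.GRUCell(embed_size + latent_size, hidden_size)
        # Inference network: approximate posterior q(z_t | h_t).
        self.post_mu = nn.Linear(hidden_size, latent_size)
        self.post_logvar = nn.Linear(hidden_size, latent_size)
        # Output distribution conditions on both h_t and z_t.
        self.out = nn.Linear(hidden_size + latent_size, vocab_size)

    def step(self, y_prev, h, z_prev):
        h = self.gru(torch.cat([self.embed(y_prev), z_prev], dim=-1), h)
        mu, logvar = self.post_mu(h), self.post_logvar(h)
        # Reparameterization trick: z = mu + sigma * eps, eps ~ N(0, I).
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        logits = self.out(torch.cat([h, z], dim=-1))
        # KL(q(z_t | h_t) || N(0, I)), the per-step regularizer in the ELBO.
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=-1)
        return logits, h, z, kl

# Example usage with hypothetical sizes: one step for a batch of 4.
dec = LatentGRUDecoder(vocab_size=10000)
y_prev = torch.randint(0, 10000, (4,))
h, z = torch.zeros(4, 256), torch.zeros(4, 64)
logits, h, z, kl = dec.step(y_prev, h, z)
```

Training would sum the token cross-entropy from `logits` and the `kl` terms over all steps, which is the standard variational lower bound objective the abstract refers to.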


Datasets

DUC 2004 Task 1, GigaWord

Results

| Task | Dataset | Model | Metric | Value | Global Rank |
|------|---------|-------|--------|-------|-------------|
| Text Summarization | DUC 2004 Task 1 | DRGD | ROUGE-1 | 31.79 | #5 |
| Text Summarization | DUC 2004 Task 1 | DRGD | ROUGE-2 | 10.75 | #6 |
| Text Summarization | DUC 2004 Task 1 | DRGD | ROUGE-L | 27.48 | #6 |
| Text Summarization | GigaWord | DRGD | ROUGE-1 | 36.27 | #31 |
| Text Summarization | GigaWord | DRGD | ROUGE-2 | 17.57 | #29 |
| Text Summarization | GigaWord | DRGD | ROUGE-L | 33.62 | #33 |

Methods


No methods listed for this paper.