Summarization based on text extraction is inherently limited, but generation-style abstractive methods have proven challenging to build.
#9 best model for Text Summarization on DUC 2004 Task 1
In this work, we model abstractive text summarization using Attentional Encoder-Decoder Recurrent Neural Networks and show that they achieve state-of-the-art performance on two different corpora (a sketch of the attention component appears below).
#8 best model for Text Summarization on GigaWord
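Concretely, the attention component of such an encoder-decoder scores every source position against the current decoder state and mixes the encoder outputs into a context vector for the next prediction. Below is a minimal PyTorch sketch of Bahdanau-style additive attention; the class name, shapes, and dimensions are illustrative assumptions, not taken from the paper.

    import torch
    import torch.nn as nn

    class AdditiveAttention(nn.Module):
        """Bahdanau-style additive attention (illustrative sketch)."""
        def __init__(self, enc_dim, dec_dim, attn_dim):
            super().__init__()
            self.W_enc = nn.Linear(enc_dim, attn_dim, bias=False)
            self.W_dec = nn.Linear(dec_dim, attn_dim, bias=False)
            self.v = nn.Linear(attn_dim, 1, bias=False)

        def forward(self, dec_state, enc_outputs):
            # dec_state: (batch, dec_dim); enc_outputs: (batch, src_len, enc_dim)
            energies = self.v(torch.tanh(
                self.W_enc(enc_outputs) + self.W_dec(dec_state).unsqueeze(1)
            )).squeeze(-1)                                  # (batch, src_len)
            weights = torch.softmax(energies, dim=-1)       # attention distribution
            context = torch.bmm(weights.unsqueeze(1), enc_outputs).squeeze(1)
            return context, weights                         # context feeds the decoder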
Recurrent neural network models with an attention mechanism have proven to be extremely effective on a wide variety of sequence-to-sequence problems.
#14 best model for Speech Recognition on TIMIT
We propose a selective encoding model to extend the sequence-to-sequence framework for abstractive sentence summarization.
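One way to picture the selective encoding extension: a sigmoid gate, conditioned on a whole-sentence vector, filters each encoder hidden state before the decoder attends, so information irrelevant to the summary is suppressed early. A hedged PyTorch sketch; the name sent_vec and all shapes are illustrative assumptions:

    import torch
    import torch.nn as nn

    class SelectiveGate(nn.Module):
        """Gate encoder states on a sentence-level vector (illustrative sketch)."""
        def __init__(self, hidden_dim):
            super().__init__()
            self.W_h = nn.Linear(hidden_dim, hidden_dim, bias=False)
            self.W_s = nn.Linear(hidden_dim, hidden_dim)

        def forward(self, enc_outputs, sent_vec):
            # enc_outputs: (batch, src_len, hidden); sent_vec: (batch, hidden)
            gate = torch.sigmoid(self.W_h(enc_outputs) + self.W_s(sent_vec).unsqueeze(1))
            return gate * enc_outputs  # gated, "selected" encoder representation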
We propose a contrastive attention mechanism to extend the sequence-to-sequence framework for the abstractive sentence summarization task, which aims to generate a brief summary of a given source sentence.
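A hedged reading of the contrastive idea: alongside the usual softmax attention over relevance scores, form an opponent distribution from the same scores that emphasizes the least relevant tokens, so the two attentions can be trained to oppose each other. The sketch below is an illustrative reading, not the paper's exact formulation:

    import torch

    def contrastive_attention(scores):
        # scores: (batch, src_len) relevance scores from any attention scorer
        attn = torch.softmax(scores, dim=-1)       # weights on relevant tokens
        opponent = torch.softmax(-scores, dim=-1)  # weights on irrelevant tokens
        return attn, opponent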
In this work, we present an unsupervised approach to summarizing sentences in an abstractive way using a Variational Autoencoder (VAE).
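The VAE ingredient here is standard: encode the sentence into a Gaussian posterior, sample a latent code with the reparameterization trick, and add the KL term to the reconstruction loss. A minimal sketch under those assumptions; encoder and decoder internals are omitted and all names are illustrative:

    import torch
    import torch.nn as nn

    class GaussianLatent(nn.Module):
        """Map an encoded sentence vector to a sampled latent code plus its KL term."""
        def __init__(self, enc_dim, latent_dim):
            super().__init__()
            self.to_mu = nn.Linear(enc_dim, latent_dim)
            self.to_logvar = nn.Linear(enc_dim, latent_dim)

        def forward(self, enc_vec):
            mu, logvar = self.to_mu(enc_vec), self.to_logvar(enc_vec)
            z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterize
            kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=-1)
            return z, kl  # z feeds the decoder; kl joins the training loss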