18 papers with code • 0 benchmarks • 0 datasets
Generating a summary of a given sentence.
These leaderboards are used to track progress in Sentence Summarization.
Summarization based on text extraction is inherently limited, but generation-style abstractive methods have proven challenging to build.
In this work, we model abstractive text summarization using Attentional Encoder-Decoder Recurrent Neural Networks, and show that they achieve state-of-the-art performance on two different corpora.
Modern neural networks are often augmented with an attention mechanism, which tells the network where to focus within the input.
Recurrent neural network models with an attention mechanism have proven to be extremely effective on a wide variety of sequence-to-sequence problems.
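To make the attention mechanism described above concrete, here is a minimal sketch of dot-product attention in NumPy: each encoder hidden state is scored against the current decoder state, the scores are normalized with a softmax, and the result is a weighted "context" vector that tells the decoder where to focus. This is an illustrative sketch, not any specific paper's implementation; the function and variable names are assumptions for the example.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def dot_product_attention(decoder_state, encoder_states):
    """Score each encoder state against the decoder state, normalize
    the scores, and return the weighted context vector.

    decoder_state:  shape (d,)   -- current decoder hidden state
    encoder_states: shape (T, d) -- one hidden state per source token
    """
    scores = encoder_states @ decoder_state   # (T,) one score per position
    weights = softmax(scores)                 # attention distribution over input
    context = weights @ encoder_states        # (d,) weighted sum of encoder states
    return context, weights

# Toy example: 4 source positions, hidden size 3.
rng = np.random.default_rng(0)
enc = rng.normal(size=(4, 3))
dec = rng.normal(size=(3,))
context, weights = dot_product_attention(dec, enc)
```

In a full sequence-to-sequence summarizer, the context vector is concatenated with the decoder state at each output step, so the network can attend to different source words while generating each summary word.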
Automatic sentence summarization produces a shorter version of a sentence, while preserving its most important information.
In this work, we present an unsupervised approach to summarizing sentences in an abstractive way using a Variational Autoencoder (VAE).
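The core machinery a VAE relies on is the reparameterization trick: the encoder predicts a mean and log-variance for a latent code, a sample is drawn as a deterministic function of those parameters plus noise, and a KL term regularizes the latent distribution toward a standard normal. The sketch below, in NumPy, shows only that generic VAE core under assumed names; it is not the specific summarization model's implementation.

```python
import numpy as np

def reparameterize(mu, log_var, rng):
    # Sample z ~ N(mu, sigma^2) as z = mu + sigma * eps with eps ~ N(0, I),
    # so the sampling step stays differentiable w.r.t. mu and log_var.
    sigma = np.exp(0.5 * log_var)
    eps = rng.normal(size=mu.shape)
    return mu + sigma * eps

def kl_to_standard_normal(mu, log_var):
    # KL( N(mu, sigma^2) || N(0, I) ), summed over latent dimensions.
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var)

# Toy latent code of size 8 with a standard-normal posterior.
rng = np.random.default_rng(0)
mu = np.zeros(8)
log_var = np.zeros(8)
z = reparameterize(mu, log_var, rng)
kl = kl_to_standard_normal(mu, log_var)
```

In a sentence summarizer, `z` would condition a decoder that generates the shorter sentence, and the KL term is added to the reconstruction loss during training.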