Neural sequence-to-sequence models have provided a viable new approach for abstractive text summarization (meaning they are not restricted to simply selecting and rearranging passages from the original text).
#2 best model for Abstractive Text Summarization on CNN / Daily Mail
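The sentence above contrasts abstractive seq2seq models with extractive approaches. Below is a minimal sketch of the general encoder-decoder idea, not the listed paper's pointer-generator architecture: the decoder scores the full vocabulary at every step, so the generated summary is not limited to words copied from the source article. The vocabulary size, GRU layers, and hidden sizes are illustrative assumptions.

```python
# Minimal encoder-decoder summarizer sketch (illustrative, not the paper's model).
import torch
import torch.nn as nn

VOCAB_SIZE, EMB, HIDDEN = 10_000, 128, 256  # assumed sizes for the sketch

class Seq2SeqSummarizer(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, EMB)
        self.encoder = nn.GRU(EMB, HIDDEN, batch_first=True)
        self.decoder = nn.GRU(EMB, HIDDEN, batch_first=True)
        self.out = nn.Linear(HIDDEN, VOCAB_SIZE)  # scores over the full vocabulary

    def forward(self, article_ids, summary_ids):
        # Encode the source article into a context state.
        _, context = self.encoder(self.embed(article_ids))
        # Decode the (teacher-forced) summary conditioned on that context.
        dec_out, _ = self.decoder(self.embed(summary_ids), context)
        return self.out(dec_out)  # (batch, summary_len, vocab)

model = Seq2SeqSummarizer()
article = torch.randint(0, VOCAB_SIZE, (2, 50))  # dummy batch of articles
summary = torch.randint(0, VOCAB_SIZE, (2, 12))  # dummy target summaries
logits = model(article, summary)
loss = nn.functional.cross_entropy(logits.reshape(-1, VOCAB_SIZE), summary.reshape(-1))
```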
In this survey, we consider seq2seq problems from the RL point of view and provide a formulation that combines the decision-making power of RL methods with sequence-to-sequence models that can retain long-term memories.
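One common way to realize this combination is sequence-level policy-gradient (REINFORCE-style) training: the decoder is treated as a policy, a whole output sequence is sampled, scored with a sequence-level reward, and the log-probabilities of the sampled tokens are scaled by that reward. The sketch below shows only this general pattern, not the survey's exact formulation; the reward here is a placeholder standing in for a metric such as ROUGE.

```python
# Sequence-level policy-gradient (REINFORCE) sketch for a seq2seq decoder.
import torch

def reinforce_loss(token_logits, sampled_ids, reward):
    """token_logits: (seq_len, vocab); sampled_ids: (seq_len,); reward: scalar."""
    log_probs = torch.log_softmax(token_logits, dim=-1)
    picked = log_probs.gather(1, sampled_ids.unsqueeze(1)).squeeze(1)  # log p(y_t)
    # Maximize expected reward  <=>  minimize  -reward * sum_t log p(y_t)
    return -reward * picked.sum()

# Dummy example: 5 decoding steps over a 20-word vocabulary.
logits = torch.randn(5, 20, requires_grad=True)
sampled = torch.multinomial(torch.softmax(logits, dim=-1), 1).squeeze(1)
reward = 0.7  # placeholder for a sequence-level metric such as ROUGE
loss = reinforce_loss(logits, sampled, reward)
loss.backward()
```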
Pre-training and fine-tuning, e.g., BERT, have achieved great success in language understanding by transferring knowledge from a rich-resource pre-training task to low/zero-resource downstream tasks.
SOTA for Text Summarization on GigaWord
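The pre-train/fine-tune pattern described above can be illustrated with the Hugging Face `transformers` library, used here as one common implementation rather than the listed paper's method. The checkpoint name and the downstream task (binary classification) are assumptions made for the sketch.

```python
# Fine-tuning a pre-trained encoder on a downstream task (illustrative sketch).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)  # pre-trained encoder + fresh task head

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

# One fine-tuning step on a tiny dummy batch: knowledge from pre-training is
# carried over in the encoder weights; only the downstream loss is optimized.
batch = tokenizer(["a short example document"], return_tensors="pt", padding=True)
labels = torch.tensor([1])
loss = model(**batch, labels=labels).loss
loss.backward()
optimizer.step()
```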
Modern neural networks are often augmented with an attention mechanism, which tells the network where to focus within the input.
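"Where to focus" can be made concrete with a small example: attention computes a weight for every input position and returns a weighted sum of the input representations. The sketch below is plain scaled dot-product attention in NumPy; the shapes are illustrative and not tied to any particular paper.

```python
# Scaled dot-product attention over a set of input positions (illustrative).
import numpy as np

def attention(query, keys, values):
    """query: (d,); keys, values: (n_inputs, d). Returns (context, weights)."""
    scores = keys @ query / np.sqrt(query.shape[0])  # one score per input position
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                          # softmax: where to focus
    context = weights @ values                        # weighted sum of the inputs
    return context, weights

rng = np.random.default_rng(0)
keys = values = rng.normal(size=(6, 8))  # 6 input positions, 8-dim features
query = rng.normal(size=8)
context, weights = attention(query, keys, values)
print(weights.round(2))  # the network's "focus" over the 6 input positions
```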
As part of this survey, we also develop an open-source library, namely the Neural Abstractive Text Summarizer (NATS) toolkit, for abstractive text summarization.
Recurrent neural network models with an attention mechanism have proven to be extremely effective on a wide variety of sequence-to-sequence problems.
#14 best model for Speech Recognition on TIMIT
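Building on the attention computation shown earlier, the sketch below shows how such a mechanism is typically wired into one step of a sequence-to-sequence RNN decoder: the decoder state queries all encoder states, and the attended context is mixed back into the state. Layer sizes and the combination layer are illustrative assumptions, not any specific paper's architecture.

```python
# One attention-augmented decoder step for a seq2seq RNN (illustrative sketch).
import torch
import torch.nn as nn

HIDDEN = 64
enc_outputs = torch.randn(1, 20, HIDDEN)  # encoder states for 20 input steps
dec_cell = nn.GRUCell(HIDDEN, HIDDEN)
combine = nn.Linear(2 * HIDDEN, HIDDEN)

dec_state = torch.zeros(1, HIDDEN)
dec_input = torch.randn(1, HIDDEN)        # embedding of the previous output token

# Update the RNN state, attend over all encoder states, then mix the attended
# context back into the decoder state before predicting the next token.
dec_state = dec_cell(dec_input, dec_state)
scores = torch.bmm(enc_outputs, dec_state.unsqueeze(2)).squeeze(2)  # (1, 20)
weights = torch.softmax(scores, dim=1)                              # focus per step
context = torch.bmm(weights.unsqueeze(1), enc_outputs).squeeze(1)   # (1, HIDDEN)
attentional_state = torch.tanh(combine(torch.cat([dec_state, context], dim=1)))
```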