About

Abstractive Text Summarization is the task of generating a short, concise summary that captures the salient ideas of the source text. The generated summaries may contain new phrases and sentences that do not appear in the source text.

Source: Generative Adversarial Network for Abstractive Text Summarization

Benchmarks


Subtasks

Datasets

Greatest papers with code

Attention Is All You Need

NeurIPS 2017 tensorflow/models

The dominant sequence transduction models are based on complex recurrent or convolutional neural networks in an encoder-decoder configuration.

ABSTRACTIVE TEXT SUMMARIZATION CONSTITUENCY PARSING MACHINE TRANSLATION
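
At the heart of the Transformer is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. A minimal PyTorch sketch of that single operation (shapes and the toy call are illustrative; the paper wraps this in multi-head projections):

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5    # (..., len_q, len_k)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)              # attention distribution
    return weights @ v                               # weighted sum of values

# Toy usage: batch of 2, sequence length 5, dimension 64.
q = k = v = torch.randn(2, 5, 64)
print(scaled_dot_product_attention(q, k, v).shape)   # torch.Size([2, 5, 64])
```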

ProphetNet: Predicting Future N-gram for Sequence-to-Sequence Pre-training

13 Jan 2020 huggingface/transformers

This paper presents ProphetNet, a new sequence-to-sequence pre-training model that introduces a novel self-supervised objective, future n-gram prediction, together with an n-stream self-attention mechanism.

Ranked #3 on Abstractive Text Summarization on CNN / Daily Mail (using extra training data)

ABSTRACTIVE TEXT SUMMARIZATION QUESTION GENERATION
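
ProphetNet is available in huggingface/transformers; a minimal inference sketch for summarization (the CNN/DailyMail checkpoint name below is an assumption, verify it on the model hub):

```python
from transformers import ProphetNetForConditionalGeneration, ProphetNetTokenizer

# Checkpoint name is an assumption; check the huggingface model hub.
name = "microsoft/prophetnet-large-uncased-cnndm"
tokenizer = ProphetNetTokenizer.from_pretrained(name)
model = ProphetNetForConditionalGeneration.from_pretrained(name)

article = "Long source article ..."  # document to summarize
inputs = tokenizer(article, return_tensors="pt", truncation=True, max_length=512)
ids = model.generate(**inputs, num_beams=4, max_length=120, early_stopping=True)
print(tokenizer.decode(ids[0], skip_special_tokens=True))
```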

PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization

ICML 2020 huggingface/transformers

Recent work pre-training Transformers with self-supervised objectives on large text corpora has shown great success when fine-tuned on downstream NLP tasks including text summarization.

ABSTRACTIVE TEXT SUMMARIZATION
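
PEGASUS checkpoints are likewise exposed through huggingface/transformers; a minimal inference sketch (checkpoints are released per dataset, so swap the name for the corpus of interest):

```python
from transformers import PegasusForConditionalGeneration, PegasusTokenizer

name = "google/pegasus-xsum"  # one released checkpoint; others target CNN/DailyMail etc.
tokenizer = PegasusTokenizer.from_pretrained(name)
model = PegasusForConditionalGeneration.from_pretrained(name)

batch = tokenizer(["Long source document ..."], truncation=True,
                  padding=True, return_tensors="pt")
ids = model.generate(**batch, num_beams=4)
print(tokenizer.batch_decode(ids, skip_special_tokens=True))
```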

BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension

ACL 2020 huggingface/transformers

We evaluate a number of noising approaches, finding the best performance by both randomly shuffling the order of the original sentences and using a novel in-filling scheme, where spans of text are replaced with a single mask token.

ABSTRACTIVE TEXT SUMMARIZATION DENOISING MACHINE TRANSLATION NATURAL LANGUAGE INFERENCE QUESTION ANSWERING TEXT GENERATION
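
The two noising operations highlighted above, sentence permutation and text infilling, are compact enough to sketch. The following is an illustrative stand-in, not the fairseq implementation: whitespace tokenization, and a uniform span length in place of the paper's Poisson(3):

```python
import random
import re

def bart_style_noise(text, mask_token="<mask>", mask_ratio=0.3, mean_span=3):
    """Illustrative BART-style noising: shuffle sentence order, then replace
    random token spans with a single mask token (text infilling)."""
    # 1) Sentence permutation.
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    random.shuffle(sentences)
    tokens = " ".join(sentences).split()

    # 2) Text infilling: mask spans until roughly mask_ratio of tokens are gone.
    budget = int(len(tokens) * mask_ratio)
    while budget > 0:
        span = random.randint(1, 2 * mean_span - 1)   # stand-in for Poisson(mean_span)
        start = random.randrange(len(tokens))
        removed = len(tokens[start:start + span])
        tokens[start:start + span] = [mask_token]     # whole span -> one mask token
        budget -= removed
    return " ".join(tokens)

print(bart_style_noise("The cat sat on the mat. It was warm. The dog barked."))
```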

SummAE: Zero-Shot Abstractive Text Summarization using Length-Agnostic Auto-Encoders

2 Oct 2019 google-research/google-research

We show results for extractive and human baselines to demonstrate a large abstractive gap in performance.

ABSTRACTIVE TEXT SUMMARIZATION DENOISING

Pre-trained Language Model Representations for Language Generation

NAACL 2019 pytorch/fairseq

Pre-trained language model representations have been successful in a wide range of language understanding tasks.

ABSTRACTIVE TEXT SUMMARIZATION LANGUAGE MODELLING MACHINE TRANSLATION TEXT GENERATION

Pay Less Attention with Lightweight and Dynamic Convolutions

ICLR 2019 pytorch/fairseq

We predict separate convolution kernels based solely on the current time-step in order to determine the importance of context elements.

ABSTRACTIVE TEXT SUMMARIZATION LANGUAGE MODELLING MACHINE TRANSLATION
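
The core idea is simple enough to sketch: a linear layer predicts a softmax-normalized kernel from each time-step's representation alone, and each output position is a weighted sum over its local window. This single-head version is a simplification of the paper's multi-head, depthwise module:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicConv1d(nn.Module):
    """Single-head dynamic convolution sketch: the kernel at each position is
    predicted from that time-step alone and shared across channels."""

    def __init__(self, dim, kernel_size=3):          # kernel_size assumed odd
        super().__init__()
        self.kernel_size = kernel_size
        self.kernel_proj = nn.Linear(dim, kernel_size)   # x_t -> kernel weights

    def forward(self, x):                            # x: (batch, time, dim)
        k, pad = self.kernel_size, self.kernel_size // 2
        weights = F.softmax(self.kernel_proj(x), dim=-1)   # (batch, time, k)
        x_pad = F.pad(x, (0, 0, pad, pad))                 # pad the time axis
        windows = x_pad.unfold(1, k, 1).transpose(2, 3)    # (batch, time, k, dim)
        return torch.einsum("btk,btkd->btd", weights, windows)

x = torch.randn(2, 10, 64)
print(DynamicConv1d(64)(x).shape)                    # torch.Size([2, 10, 64])
```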

Classical Structured Prediction Losses for Sequence to Sequence Learning

NAACL 2018 pytorch/fairseq

There has been much recent work on training neural attention models at the sequence-level using either reinforcement learning-style methods or by optimizing the beam.

ABSTRACTIVE TEXT SUMMARIZATION MACHINE TRANSLATION STRUCTURED PREDICTION
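
One of the classical objectives the paper revisits is expected risk minimization over an n-best candidate list; a toy sketch with illustrative shapes and costs (e.g. 1 - ROUGE per candidate):

```python
import torch
import torch.nn.functional as F

def expected_risk_loss(candidate_scores, candidate_costs):
    """Sequence-level risk: normalize model scores over the n-best list and
    minimize the expected task cost. Both inputs: (batch, n_candidates)."""
    probs = F.softmax(candidate_scores, dim=-1)   # distribution over candidates
    return (probs * candidate_costs).sum(dim=-1).mean()

# Toy usage: 2 source sequences, 4 beam candidates each.
scores = torch.randn(2, 4, requires_grad=True)    # model scores per candidate
costs = torch.rand(2, 4)                          # task costs, e.g. 1 - ROUGE
expected_risk_loss(scores, costs).backward()
```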

Better Fine-Tuning by Reducing Representational Collapse

ICLR 2021 pytorch/fairseq

Although widely adopted, existing approaches for fine-tuning pre-trained language models have been shown to be unstable across hyper-parameter settings, motivating recent work on trust region methods.

ABSTRACTIVE TEXT SUMMARIZATION CROSS-LINGUAL NATURAL LANGUAGE INFERENCE
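
The trust-region idea can be sketched as a noise-consistency term: perturb the input embeddings and penalize divergence between clean and perturbed predictions. A minimal sketch under assumed names, with a plain linear classifier standing in for the fine-tuned language model:

```python
import torch
import torch.nn.functional as F

def r3f_style_loss(model, embeds, labels, eps=1e-5, lam=1.0):
    """R3F-style objective sketch: task loss plus symmetric KL between the
    model's outputs on clean and noise-perturbed input embeddings."""
    logits = model(embeds)
    noise = torch.empty_like(embeds).uniform_(-eps, eps)   # small uniform noise
    noisy_logits = model(embeds + noise)

    p = F.log_softmax(logits, dim=-1)
    q = F.log_softmax(noisy_logits, dim=-1)
    sym_kl = (F.kl_div(p, q, log_target=True, reduction="batchmean")
              + F.kl_div(q, p, log_target=True, reduction="batchmean"))
    return F.cross_entropy(logits, labels) + lam * sym_kl

# Toy usage: a linear "model" over 16-dim embeddings, 4 classes.
toy = torch.nn.Linear(16, 4)
emb, y = torch.randn(8, 16), torch.randint(0, 4, (8,))
print(r3f_style_loss(toy, emb, y))
```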

Sample Efficient Text Summarization Using a Single Pre-Trained Transformer

21 May 2019 tensorflow/tensor2tensor

Language model (LM) pre-training has resulted in impressive performance and sample efficiency on a variety of language understanding tasks.

ABSTRACTIVE TEXT SUMMARIZATION LANGUAGE MODELLING
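
The setup, a single decoder-only Transformer trained on the concatenated source and summary, can be approximated off the shelf with GPT-2 and "TL;DR:" prompting; this is a stand-in for illustration, not the paper's tensor2tensor configuration:

```python
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tok = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Source and summary live in one sequence; the LM continues past the separator.
prompt = "Long article text ...\nTL;DR:"
ids = tok(prompt, return_tensors="pt").input_ids
out = model.generate(ids, max_new_tokens=60, do_sample=False,
                     pad_token_id=tok.eos_token_id)
print(tok.decode(out[0][ids.shape[1]:], skip_special_tokens=True))
```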