BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension

We present BART, a denoising autoencoder for pretraining sequence-to-sequence models. BART is trained by (1) corrupting text with an arbitrary noising function, and (2) learning a model to reconstruct the original text...
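The pretraining objective described above can be sketched in a few lines of code. The snippet below is a minimal illustration only, assuming the Hugging Face transformers library and the facebook/bart-base checkpoint (neither appears on this page; the paper's reference implementation is in fairseq). It replaces one random word span with a single <mask> token as a crude stand-in for the Poisson-length text infilling the paper finds most effective, omits sentence permutation, and trains the model to reconstruct the uncorrupted text with a token-level cross-entropy loss.

```python
# Minimal sketch of BART's denoising objective (assumes Hugging Face transformers;
# the corruption here is a simplified stand-in for the paper's text infilling).
import random
import torch
from transformers import BartTokenizer, BartForConditionalGeneration

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

def corrupt(text: str, max_span: int = 3) -> str:
    """Replace one random word span with the tokenizer's mask token."""
    words = text.split()
    start = random.randrange(len(words))
    length = random.randint(0, max_span)   # a length-0 span just inserts a mask
    words[start:start + length] = [tokenizer.mask_token]
    return " ".join(words)

original = "BART is trained by corrupting text and learning to reconstruct it."
corrupted = corrupt(original)

# The encoder sees the corrupted text; the decoder is trained to emit the original.
inputs = tokenizer(corrupted, return_tensors="pt")
labels = tokenizer(original, return_tensors="pt").input_ids

loss = model(**inputs, labels=labels).loss   # token-level cross-entropy
loss.backward()                              # one pretraining-style gradient step
print(f"reconstruction loss: {loss.item():.3f}")
```

In full pretraining the same reconstruction loss is computed over large corpora with freshly sampled corruptions each step; the sketch shows a single example purely to make the objective concrete.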

ACL 2020

Results from the Paper


TASK                            DATASET           MODEL                             METRIC NAME   METRIC VALUE   GLOBAL RANK
Abstractive Text Summarization  CNN / Daily Mail  BART                              ROUGE-1       44.16          # 5
Abstractive Text Summarization  CNN / Daily Mail  BART                              ROUGE-2       21.28          # 5
Abstractive Text Summarization  CNN / Daily Mail  BART                              ROUGE-L       40.90          # 6
Question Answering              SQuAD1.1 dev      BART Base (with text infilling)   F1            90.8           # 11
Text Summarization              X-Sum             BART                              ROUGE-1       45.14          # 2
Text Summarization              X-Sum             BART                              ROUGE-2       22.27          # 2
Text Summarization              X-Sum             BART                              ROUGE-L       37.25          # 2

Methods used in the Paper


METHOD                 TYPE
Denoising Autoencoder  Generative Models
AutoEncoder            Generative Models
Residual Connection    Skip Connections
BART                   Transformers