BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension

29 Oct 2019 · Mike Lewis, Yinhan Liu, Naman Goyal, Marjan Ghazvininejad, Abdelrahman Mohamed, Omer Levy, Ves Stoyanov, Luke Zettlemoyer

We present BART, a denoising autoencoder for pretraining sequence-to-sequence models. BART is trained by (1) corrupting text with an arbitrary noising function, and (2) learning a model to reconstruct the original text.
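To make the corruption step concrete, below is a minimal sketch of a text-infilling noising function in the spirit of BART (spans with lengths drawn from a Poisson distribution are each replaced by a single mask token). It is not the authors' implementation; the tokenisation, the `<mask>` token string, and the `mask_ratio` and `poisson_lam` defaults are illustrative assumptions.

```python
# Illustrative sketch of BART-style text infilling (not the paper's actual code).
# Random spans (lengths ~ Poisson(lam)) are replaced by a single mask token;
# the original token sequence is the reconstruction target.
import numpy as np

MASK = "<mask>"  # assumed mask-token string, for illustration only

def text_infill(tokens, mask_ratio=0.3, poisson_lam=3.0, rng=None):
    """Corrupt a token list by replacing sampled spans with one mask token each."""
    rng = rng or np.random.default_rng()
    tokens = list(tokens)
    budget = int(round(len(tokens) * mask_ratio))  # how many tokens to mask in total
    while budget > 0 and tokens:
        span = int(rng.poisson(poisson_lam))
        span = min(span, budget, len(tokens))
        start = int(rng.integers(0, max(len(tokens) - span, 0) + 1))
        # A zero-length span inserts a mask token without removing anything.
        tokens[start:start + span] = [MASK]
        budget -= max(span, 1)
    return tokens

source = "BART is trained by corrupting text and reconstructing the original".split()
corrupted = text_infill(source, rng=np.random.default_rng(0))
print(corrupted)  # noised input fed to the encoder
print(source)     # reconstruction target for the decoder
```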


Evaluation Results from the Paper


Task               | Dataset      | Model                           | Metric | Value | Global Rank
Question Answering | SQuAD1.1 dev | BART Base (with text infilling) | F1     | 90.8  | #8