Fine-tune BERT for Extractive Summarization

Yang Liu. arXiv, 2019.

BERT, a pre-trained Transformer model, has achieved ground-breaking performance on multiple NLP tasks. In this paper, we describe BERTSUM, a simple variant of BERT, for extractive summarization.
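Below is a minimal, hypothetical sketch of the general idea of scoring sentences with a BERT encoder for extractive summarization. It is not the paper's exact architecture (BERTSUM inserts a [CLS] token before every sentence, uses interval segment embeddings, and stacks a summarization-specific layer on top); the model names and the untrained linear scorer here are illustrative assumptions only.

import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased")
# Untrained linear head, purely for illustration of per-sentence scoring.
scorer = torch.nn.Linear(bert.config.hidden_size, 1)

document = [
    "BERT has achieved strong results on many NLP tasks.",
    "We describe a simple variant of BERT for extractive summarization.",
    "Experiments are run on the CNN/DailyMail dataset.",
]

with torch.no_grad():
    # Encode each sentence separately and take its [CLS] representation.
    inputs = tokenizer(document, padding=True, truncation=True, return_tensors="pt")
    cls_vectors = bert(**inputs).last_hidden_state[:, 0, :]  # (num_sentences, hidden)
    scores = scorer(cls_vectors).squeeze(-1)                 # one score per sentence

# Pick the top-k sentences (in document order) as the extractive summary.
top_k = scores.topk(2).indices.sort().values
summary = [document[i] for i in top_k]
print(summary)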


Evaluation Results from the Paper


Task: Extractive Document Summarization
Dataset: CNN / Daily Mail
Model: BERTSUM

Metric    Value    Global Rank
ROUGE-1   43.25    #2
ROUGE-2   20.24    #2
ROUGE-L   39.63    #2
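As a side note, scores like the ROUGE-1/2/L values above are F-measures of n-gram and longest-common-subsequence overlap between system and reference summaries. The snippet below shows one common way to compute them with the rouge-score Python package; the paper's own evaluation presumably relies on the standard ROUGE toolkit, so treat this only as an illustrative sketch with made-up example strings.

from rouge_score import rouge_scorer

# ROUGE-1 (unigram overlap), ROUGE-2 (bigram overlap), ROUGE-L (LCS overlap).
scorer = rouge_scorer.RougeScorer(["rouge1", "rouge2", "rougeL"], use_stemmer=True)

reference = "the cat sat on the mat"
prediction = "the cat is on the mat"

scores = scorer.score(reference, prediction)
for name, s in scores.items():
    print(name, round(s.fmeasure, 4))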