REBEL: Relation Extraction By End-to-end Language generation
Extracting relation triplets from raw text is a crucial task in Information Extraction, enabling multiple applications such as populating or validating knowledge bases, fact-checking, and other downstream tasks. However, it usually involves multi-step pipelines that propagate errors or are limited to a small number of relation types. To overcome these issues, we propose the use of autoregressive seq2seq models. Such models have previously been shown to perform well not only in language generation, but also in NLU tasks such as Entity Linking, thanks to their framing as seq2seq tasks. In this paper, we show how Relation Extraction can be simplified by expressing triplets as a sequence of text, and we present REBEL, a seq2seq model based on BART that performs end-to-end relation extraction for more than 200 different relation types. We show our model's flexibility by fine-tuning it on an array of Relation Extraction and Relation Classification benchmarks, attaining state-of-the-art performance on most of them.
Published in Findings of EMNLP 2021.
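As a concrete illustration of the generate-then-parse approach described in the abstract, the sketch below runs a seq2seq checkpoint and decodes its linearized output back into (head, relation, tail) triplets. It assumes the publicly released Babelscape/rebel-large checkpoint on Hugging Face and its <triplet>/<subj>/<obj> output markers; neither is specified on this page, and the parser is a simplified illustration rather than the authors' exact code.

```python
# Minimal sketch of REBEL-style relation extraction: generate a linearized
# triplet sequence with a seq2seq model, then parse it back into triplets.
# Assumption: the "Babelscape/rebel-large" checkpoint and the
# <triplet> head <subj> tail <obj> relation format from its public model card.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Babelscape/rebel-large")
model = AutoModelForSeq2SeqLM.from_pretrained("Babelscape/rebel-large")

def extract_triplets(decoded: str):
    """Parse a generated sequence into (head, relation, tail) triplets."""
    triplets, head, rel, tail, current = [], "", "", "", None
    tokens = (
        decoded.replace("<s>", "").replace("</s>", "").replace("<pad>", "").split()
    )
    for token in tokens:
        if token == "<triplet>":
            # A new head entity starts; flush any completed triplet.
            if rel:
                triplets.append((head.strip(), rel.strip(), tail.strip()))
            head, tail, rel, current = "", "", "", "head"
        elif token == "<subj>":
            # A (new) tail entity follows; flush a completed triplet first.
            if rel:
                triplets.append((head.strip(), rel.strip(), tail.strip()))
            tail, rel, current = "", "", "tail"
        elif token == "<obj>":
            # The relation label follows.
            current = "rel"
        elif current == "head":
            head += " " + token
        elif current == "tail":
            tail += " " + token
        elif current == "rel":
            rel += " " + token
    if rel:
        triplets.append((head.strip(), rel.strip(), tail.strip()))
    return triplets

text = "Punta Cana is a resort town in the Dominican Republic."
inputs = tokenizer(text, return_tensors="pt")
generated = model.generate(**inputs, max_length=256, num_beams=3)
decoded = tokenizer.batch_decode(generated, skip_special_tokens=False)[0]
print(extract_triplets(decoded))
# Expected shape of output: [("Punta Cana", "country", "Dominican Republic"), ...]
```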
Datasets
Introduced in the Paper: REBEL
Used in the Paper: Adverse Drug Events (ADE) Corpus, CoNLL04, DocRED, NYT, Re-TACRED
Results from the Paper
Ranked #1 on Joint Entity and Relation Extraction on DocRED (using extra training data)
| Task | Dataset | Model | Metric Name | Metric Value | Global Rank | Uses Extra Training Data |
|---|---|---|---|---|---|---|
| Relation Extraction | Adverse Drug Events (ADE) Corpus | REBEL (including overlapping entities) | RE+ Macro F1 | 82.2 | # 6 | |
| Relation Extraction | CoNLL04 | REBEL | RE+ Micro F1 | 75.4 | # 3 | |
| Relation Extraction | CoNLL04 | REBEL | RE+ Macro F1 | 76.65 | # 1 | |
| Joint Entity and Relation Extraction | DocRED | REBEL+pretraining | Relation F1 | 47.1 | # 1 | ✓ |
| Joint Entity and Relation Extraction | DocRED | REBEL | Relation F1 | 41.8 | # 2 | |
| Relation Extraction | NYT | REBEL (no pre-training) | F1 | 93.1 | # 5 | |
| Relation Extraction | Re-TACRED | REBEL (no entity type marker) | F1 | 90.4 | # 4 | |