Span-based Joint Entity and Relation Extraction with Transformer Pre-training

17 Sep 2019 · Markus Eberts, Adrian Ulges

We introduce SpERT, an attention model for span-based joint entity and relation extraction. Our key contribution is a light-weight reasoning on BERT embeddings, which features entity recognition and filtering, as well as relation classification with a localized, marker-free context representation. The model is trained using strong within-sentence negative samples, which are efficiently extracted in a single BERT pass. These aspects facilitate a search over all spans in the sentence. In ablation studies, we demonstrate the benefits of pre-training, strong negative sampling and localized context. Our model outperforms prior work by up to 2.6% F1 score on several datasets for joint entity and relation extraction.
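
To make the span reasoning concrete, below is a minimal PyTorch sketch of the kind of classifier the abstract describes: span representations are max-pooled from a single BERT pass, an entity classifier scores each span together with a width embedding and the sentence ([CLS]) embedding, and a relation classifier scores span pairs using a max-pooled, marker-free context between them. The hidden size, type counts, and all names here (SpERTHead, classify_span, classify_relation) are illustrative assumptions, not the authors' released code; a random tensor stands in for the BERT output.

```python
# Sketch of SpERT-style span reasoning on top of frozen "BERT" embeddings.
import torch
import torch.nn as nn


class SpERTHead(nn.Module):
    def __init__(self, hidden=768, width_dim=25, max_width=10,
                 n_entity_types=5, n_relation_types=6):
        super().__init__()
        # learned embedding of the span width (number of tokens)
        self.width_emb = nn.Embedding(max_width + 1, width_dim)
        # entity classifier input: [max-pooled span; width; CLS]
        self.entity_clf = nn.Linear(2 * hidden + width_dim, n_entity_types)
        # relation classifier input: [head span; localized context;
        # tail span; head width; tail width]
        self.rel_clf = nn.Linear(3 * hidden + 2 * width_dim, n_relation_types)

    @staticmethod
    def pool(h, start, end):
        # fuse the token embeddings of span [start, end) by max-pooling
        return h[start:end].max(dim=0).values

    def classify_span(self, h, cls, start, end):
        span = self.pool(h, start, end)
        width = self.width_emb(torch.tensor(end - start))
        return self.entity_clf(torch.cat([span, width, cls]))

    def classify_relation(self, h, head, tail):
        # localized, marker-free context: max-pool the tokens strictly
        # between the two spans; a zero vector if the spans are adjacent
        lo, hi = min(head[1], tail[1]), max(head[0], tail[0])
        ctx = self.pool(h, lo, hi) if lo < hi else torch.zeros(h.size(1))
        s1, s2 = self.pool(h, *head), self.pool(h, *tail)
        w1 = self.width_emb(torch.tensor(head[1] - head[0]))
        w2 = self.width_emb(torch.tensor(tail[1] - tail[0]))
        return self.rel_clf(torch.cat([s1, ctx, s2, w1, w2]))


# Toy usage: 12 "tokens" from a single (simulated) BERT pass.
h = torch.randn(12, 768)   # token embeddings
cls = torch.randn(768)     # [CLS] embedding as sentence context
model = SpERTHead()
entity_logits = model.classify_span(h, cls, 2, 4)            # span = tokens 2..3
relation_logits = model.classify_relation(h, (2, 4), (7, 9))
print(entity_logits.shape, relation_logits.shape)
```

Max-pooling as the span fusion function and the pooled, marker-free context follow what the abstract and ablations highlight; the surrounding scaffolding (filtering thresholds, negative sampling, training loop) is omitted for brevity.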


Results from the Paper


Ranked #1 on Named Entity Recognition on SciERC (using extra training data).

| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Relation Extraction | ADE Corpus | SpERT (without overlap) | RE+ Macro F1 | 79.24 | #4 |
| Relation Extraction | ADE Corpus | SpERT (without overlap) | NER Macro F1 | 89.25 | #5 |
| Relation Extraction | ADE Corpus | SpERT (with overlap) | RE+ Macro F1 | 78.84 | #5 |
| Relation Extraction | ADE Corpus | SpERT (with overlap) | NER Macro F1 | 89.28 | #4 |
| Relation Extraction | CoNLL04 | SpERT | NER Macro F1 | 86.25 | #3 |
| Relation Extraction | CoNLL04 | SpERT | RE+ Micro F1 | 71.47 | #2 |
| Relation Extraction | CoNLL04 | SpERT | RE+ Macro F1 | 72.87 | #2 |
| Relation Extraction | CoNLL04 | SpERT | NER Micro F1 | 88.94 | #3 |
| Named Entity Recognition | SciERC | SpERT | F1 | 70.33 | #1 |
| Joint Entity and Relation Extraction | SciERC | SpERT | Entity F1 | 70.33 | #1 |
| Joint Entity and Relation Extraction | SciERC | SpERT (with overlap) | Relation F1 | 50.84 | #1 |
