Deeper Task-Specificity Improves Joint Entity and Relation Extraction

15 Feb 2020 • Phil Crone

Multi-task learning (MTL) is an effective method for learning related tasks, but designing MTL models necessitates deciding which and how many parameters should be task-specific, as opposed to shared between tasks. We investigate this issue for the problem of jointly learning named entity recognition (NER) and relation extraction (RE) and propose a novel neural architecture that allows for deeper task-specificity than does prior work...
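The idea described in the abstract can be sketched in a few lines: a stack of shared layers feeds two task-specific "towers" of additional layers, one for NER and one for RE, so that each task gets deeper task-specific capacity on top of the shared representation. The sketch below is illustrative only; the layer counts, sizes, and names are assumptions, not the authors' exact architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense(in_dim, out_dim):
    # One fully connected layer with tanh activation (illustrative).
    W = rng.standard_normal((in_dim, out_dim)) * 0.1
    b = np.zeros(out_dim)
    return lambda x: np.tanh(x @ W + b)

class JointNerReModel:
    """Hypothetical sketch: shared encoder + deeper task-specific towers."""

    def __init__(self, emb_dim=32, hidden=32, ner_labels=9, rel_labels=5,
                 shared_depth=1, task_depth=2):
        # Shared layers: parameters used by both tasks.
        self.shared = [dense(emb_dim if i == 0 else hidden, hidden)
                       for i in range(shared_depth)]
        # Task-specific stacks: parameters NOT shared between tasks.
        # "Deeper task-specificity" here just means task_depth > 1.
        self.ner_tower = [dense(hidden, hidden) for _ in range(task_depth)]
        self.re_tower = [dense(hidden, hidden) for _ in range(task_depth)]
        self.ner_out = dense(hidden, ner_labels)
        self.re_out = dense(hidden, rel_labels)

    def forward(self, token_embs):
        h = token_embs
        for layer in self.shared:          # shared computation
            h = layer(h)
        ner_h, re_h = h, h
        for layer in self.ner_tower:       # NER-specific computation
            ner_h = layer(ner_h)
        for layer in self.re_tower:        # RE-specific computation
            re_h = layer(re_h)
        return self.ner_out(ner_h), self.re_out(re_h)

model = JointNerReModel()
tokens = rng.standard_normal((7, 32))      # 7 tokens, 32-dim embeddings
ner_scores, re_scores = model.forward(tokens)
print(ner_scores.shape, re_scores.shape)   # (7, 9) (7, 5)
```

Moving layers from the shared stack into the per-task towers trades parameter sharing for task-specific capacity, which is the design axis the paper investigates.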


Datasets


ADE Corpus, CoNLL04

Results from the Paper


TASK                 DATASET     MODEL   METRIC        VALUE  GLOBAL RANK
Relation Extraction  ADE Corpus  Deeper  RE+ Macro F1  83.74  #1
Relation Extraction  ADE Corpus  Deeper  NER Macro F1  89.48  #2
Relation Extraction  CoNLL04     Deeper  NER Macro F1  87     #1
Relation Extraction  CoNLL04     Deeper  RE+ Micro F1  71.08  #3
Relation Extraction  CoNLL04     Deeper  RE+ Macro F1  72.63  #3
Relation Extraction  CoNLL04     Deeper  NER Micro F1  89.78  #2
