Scores reported for systems that jointly extract entities and relations.
We propose CoType, a novel domain-independent framework that runs a data-driven text segmentation algorithm to extract entity mentions and jointly embeds entity mentions, relation mentions, text features, and type labels into two low-dimensional spaces (one for entity mentions, one for relation mentions), where, in each space, objects whose types are close also have similar representations.
Adversarial training (AT) is a regularization method that can improve the robustness of neural network methods by adding small perturbations to the training data.
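The idea above can be sketched as an FGSM-style perturbation on input embeddings: step by a small amount along the normalized gradient of the loss. This is a minimal illustrative sketch; the function name, `epsilon` value, and toy vectors are assumptions, not details from the paper.

```python
import numpy as np

def adversarial_perturbation(grad, epsilon=0.02):
    """Perturbation of size `epsilon` along the normalized gradient of
    the loss w.r.t. the input embedding (epsilon is a hypothetical
    hyperparameter chosen for illustration)."""
    norm = np.linalg.norm(grad)
    if norm == 0.0:
        return np.zeros_like(grad)
    return epsilon * grad / norm

# Toy usage: nudge a word embedding in the loss-increasing direction.
embedding = np.array([0.5, -0.3, 0.8])
grad_wrt_embedding = np.array([0.1, -0.2, 0.2])  # would come from backprop
adv_embedding = embedding + adversarial_perturbation(grad_wrt_embedding)
```

Training then adds a loss term on the perturbed embeddings, which regularizes the model against small input shifts.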
#3 best model for Relation Extraction on ADE Corpus
In contrast to previous baselines, we consider the interaction between named entities and relations via a 2nd-phase relation-weighted GCN to better extract relations.
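A relation-weighted GCN of the kind described can be sketched as a propagation step whose soft edge weights come from predicted relation scores rather than a fixed adjacency matrix. Shapes, normalization, and function names below are illustrative assumptions, not the paper's exact layer.

```python
import numpy as np

def relation_weighted_gcn_layer(H, S, W):
    """One graph-convolution step over a relation-weighted graph.
    H: (n, d) entity-node features; S: (n, n) predicted relation scores
    used as soft edge weights; W: (d, d) trainable layer weights.
    (Illustrative sketch, not the paper's exact formulation.)"""
    A = S + np.eye(S.shape[0])             # soft adjacency with self-loops
    D_inv = np.diag(1.0 / A.sum(axis=1))   # row-normalize by degree
    return np.maximum(D_inv @ A @ H @ W, 0.0)  # ReLU activation
```

Weighting edges by first-phase relation scores lets the second phase propagate entity information preferentially along likely relations.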
#6 best model for Relation Extraction on WebNLG
The model is trained using strong within-sentence negative samples, which are efficiently extracted in a single BERT pass.
SOTA for Relation Extraction on CoNLL04
We examine the capabilities of a unified, multi-task framework for three information extraction tasks: named entity recognition, relation extraction, and event extraction.
#2 best model for Joint Entity and Relation Extraction on SciERC
We introduce a general framework for several information extraction tasks that share span representations using dynamically constructed span graphs.
SOTA for Relation Extraction on ACE 2005
We introduce a multi-task setup of identifying and classifying entities, relations, and coreference clusters in scientific articles.
#4 best model for Joint Entity and Relation Extraction on SciERC
A relation tuple consists of two entities and the relation between them, and often such tuples are found in unstructured text.
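A relation tuple as described can be represented by a small immutable record: two entity spans plus the relation holding between them. Field names and the example sentence are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class RelationTuple:
    """A relation tuple: two entities and the relation between them
    (field names chosen here for illustration)."""
    head: str
    relation: str
    tail: str

# From the sentence "Barack Obama was born in Honolulu.", a joint
# extractor would emit a tuple like:
t = RelationTuple(head="Barack Obama", relation="born_in", tail="Honolulu")
```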
#4 best model for Relation Extraction on NYT
This paper proposes a novel context-aware joint entity and word-level relation extraction approach through semantic composition of words. It introduces a Table Filling Multi-Task Recurrent Neural Network (TF-MTRNN) model that reduces the entity recognition and relation classification tasks to a table-filling problem and models their interdependencies.
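The table-filling reduction can be illustrated with a word-by-word grid: diagonal cells hold entity tags and off-diagonal cells hold relation labels between token pairs. The label set ("N" for no relation, BIO entity tags) and the example sentence are assumptions for illustration, not TF-MTRNN's exact scheme.

```python
def make_table(tokens):
    """Empty n-by-n table for joint tagging: cell (i, i) will hold
    token i's entity tag, cell (i, j) the relation label between
    tokens i and j ("N" meaning no relation)."""
    n = len(tokens)
    return [["N"] * n for _ in range(n)]

tokens = ["Obama", "was", "born", "in", "Honolulu"]
table = make_table(tokens)
table[0][0] = "B-PER"       # entity tag on the diagonal
table[4][4] = "B-LOC"
table[0][4] = "born_in"     # relation label in an off-diagonal cell
```

Filling both kinds of cells with one model is what lets the network share context between the two tasks and capture their interdependencies.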