Joint Entity and Relation Extraction
37 papers with code • 6 benchmarks • 6 datasets
Scores are reported for systems that jointly extract entities and relations.
Most implemented papers
Multi-Task Identification of Entities, Relations, and Coreference for Scientific Knowledge Graph Construction
We introduce a multi-task setup of identifying and classifying entities, relations, and coreference clusters in scientific articles.
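The entry above describes a shared-encoder, multi-head setup. Below is a generic sketch of what such a multi-task model can look like, with made-up dimensions, mean-pooled span representations, and simple linear heads; it illustrates the structure only, not the paper's actual architecture.

```python
import torch
import torch.nn as nn

class MultiTaskIE(nn.Module):
    """Toy multi-task model: one shared encoder, three task-specific heads."""
    def __init__(self, dim=128, num_entity_types=7, num_relation_types=8):
        super().__init__()
        self.encoder = nn.LSTM(dim, dim // 2, bidirectional=True, batch_first=True)
        self.entity_head = nn.Linear(dim, num_entity_types)        # entity typing per span
        self.relation_head = nn.Linear(2 * dim, num_relation_types)  # relation label per span pair
        self.coref_head = nn.Linear(2 * dim, 1)                     # coreference score per span pair

    def forward(self, token_embs, spans):
        shared, _ = self.encoder(token_embs)                        # shared contextual encoder
        span_reps = torch.stack([shared[0, s:e].mean(0) for s, e in spans])
        pair_reps = torch.cat([span_reps.unsqueeze(1).expand(-1, len(spans), -1),
                               span_reps.unsqueeze(0).expand(len(spans), -1, -1)], dim=-1)
        return (self.entity_head(span_reps),
                self.relation_head(pair_reps),
                self.coref_head(pair_reps))

ent, rel, coref = MultiTaskIE()(torch.randn(1, 10, 128), spans=[(0, 2), (4, 6), (7, 9)])
print(ent.shape, rel.shape, coref.shape)  # (3, 7), (3, 3, 8), (3, 3, 1)
```

During training, the three task losses are typically summed so that all heads update the shared encoder.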
A General Framework for Information Extraction using Dynamic Span Graphs
We introduce a general framework for several information extraction tasks that share span representations using dynamically constructed span graphs.
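As a rough illustration of sharing span representations through a dynamically constructed span graph, here is a small sketch in which a stand-in dot-product scorer selects each span's top-k neighbors and mixes their vectors back in. The real framework uses learned relation and coreference scorers and gated updates; everything here is an assumption for illustration.

```python
import torch

def propagate(span_reps, top_k=3):
    """One round of span-graph propagation. span_reps: (num_spans, dim)."""
    scores = span_reps @ span_reps.T                 # pairwise relatedness (stand-in scorer)
    scores.fill_diagonal_(float("-inf"))             # no self-edges
    weights, neighbors = scores.topk(top_k, dim=-1)  # dynamic graph: top-k edges per span
    weights = torch.softmax(weights, dim=-1)
    neighbor_reps = span_reps[neighbors]             # (num_spans, top_k, dim)
    update = (weights.unsqueeze(-1) * neighbor_reps).sum(dim=1)
    return span_reps + update                        # the real model uses a gated update

print(propagate(torch.randn(6, 16)).shape)  # torch.Size([6, 16])
```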
CoType: Joint Extraction of Typed Entities and Relations with Knowledge Bases
We propose CoType, a domain-independent framework that runs a data-driven text segmentation algorithm to extract entity mentions and jointly embeds entity mentions, relation mentions, text features, and type labels into two low-dimensional spaces (one for entity mentions, one for relation mentions). In each space, objects whose types are close also receive similar representations.
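A minimal numpy sketch of the general joint-embedding idea, not CoType's actual training objective: mentions and type labels share a low-dimensional space, and type inference becomes a nearest-label lookup by cosine similarity. The vectors below are random stand-ins for learned embeddings.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 50

# Hypothetical learned embeddings (in practice these are trained jointly).
entity_mentions = {"Barack Obama": rng.normal(size=dim)}
entity_types = {"PERSON": rng.normal(size=dim), "LOCATION": rng.normal(size=dim)}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def predict_type(mention_vec, type_vecs):
    # Pick the type label whose embedding is closest in the shared space.
    return max(type_vecs, key=lambda t: cosine(mention_vec, type_vecs[t]))

print(predict_type(entity_mentions["Barack Obama"], entity_types))
```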
Joint Extraction of Entities and Relations Based on a Novel Tagging Scheme
Joint extraction of entities and relations is an important task in information extraction.
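One way to picture the unified tagging scheme named in the title, as I read it: each tag combines the word's position in an entity (B/I/E/S), a relation type, and the entity's role in that relation (1 or 2), so a single sequence labeler yields both entities and their relation. The "CP" label and the decoder below are toy assumptions for illustration.

```python
sentence = ["United", "States", "president", "Trump", "visited", "Apple"]
tags     = ["B-CP-1", "E-CP-1", "O",         "S-CP-2", "O",       "O"]

def decode(tokens, tags):
    """Very simplified decoder: pair the role-1 and role-2 entities of each relation type."""
    entities = {}            # relation type -> {role: entity text}
    current = []
    for tok, tag in zip(tokens, tags):
        if tag == "O":
            continue
        pos, rel, role = tag.split("-")
        current.append(tok)
        if pos in ("E", "S"):  # entity ends here
            entities.setdefault(rel, {})[role] = " ".join(current)
            current = []
    return [(ents.get("1"), rel, ents.get("2")) for rel, ents in entities.items()]

print(decode(sentence, tags))  # [('United States', 'CP', 'Trump')]
```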
Entity, Relation, and Event Extraction with Contextualized Span Representations
We examine the capabilities of a unified, multi-task framework for three information extraction tasks: named entity recognition, relation extraction, and event extraction.
Span-based Joint Entity and Relation Extraction with Transformer Pre-training
The model, SpERT, is trained using strong within-sentence negative samples, which are extracted efficiently in a single BERT pass.
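A hedged sketch of the within-sentence negative sampling described above, not the official SpERT code: the sentence is encoded once, and spans that are not gold entities are drawn from the same sentence as negative candidates, so no extra encoder passes are needed.

```python
import random

def sample_negative_spans(num_tokens, gold_spans, max_span_len=10, num_negatives=100):
    """Return up to `num_negatives` spans (start, end) that are not gold entities."""
    candidates = [
        (start, end)
        for start in range(num_tokens)
        for end in range(start + 1, min(start + 1 + max_span_len, num_tokens + 1))
        if (start, end) not in gold_spans
    ]
    random.shuffle(candidates)
    return candidates[:num_negatives]

# Example: a 12-token sentence with two gold entity spans.
negatives = sample_negative_spans(12, gold_spans={(0, 2), (5, 6)})
print(len(negatives), negatives[:5])
```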
Two are Better than One: Joint Entity and Relation Extraction with Table-Sequence Encoders
In this work, we propose novel table-sequence encoders, in which two different encoders, a table encoder and a sequence encoder, are designed to help each other in the representation learning process.
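A rough PyTorch sketch of that two-encoder interaction, with assumed layer types and sizes: token vectors from a sequence encoder fill a 2D table, and a pooled view of the table flows back to refine the sequence. It illustrates the loop only, not the paper's model.

```python
import torch
import torch.nn as nn

class TableSequenceSketch(nn.Module):
    def __init__(self, dim=64):
        super().__init__()
        self.seq_rnn = nn.GRU(dim, dim, batch_first=True)
        self.table_mlp = nn.Linear(2 * dim, dim)

    def forward(self, token_embs):             # (batch, n, dim)
        seq, _ = self.seq_rnn(token_embs)       # sequence encoder
        n = seq.size(1)
        # Table encoder: cell (i, j) combines token i and token j.
        table = self.table_mlp(torch.cat(
            [seq.unsqueeze(2).expand(-1, n, n, -1),
             seq.unsqueeze(1).expand(-1, n, n, -1)], dim=-1))
        # Feed a row-pooled view of the table back into the sequence side.
        seq = seq + table.mean(dim=2)
        return seq, table

seq, table = TableSequenceSketch()(torch.randn(1, 5, 64))
print(seq.shape, table.shape)  # torch.Size([1, 5, 64]) torch.Size([1, 5, 5, 64])
```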
A Frustratingly Easy Approach for Entity and Relation Extraction
Our approach essentially builds on two independent encoders and merely uses the entity model to construct the input for the relation model.
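A simplified sketch of how the entity model's output can feed the relation model, not the authors' released code: predicted spans and types become typed marker tokens that wrap the subject and object before the relation encoder reads the sentence. The marker strings below are assumptions.

```python
def insert_typed_markers(tokens, subj, obj):
    """subj/obj are (start, end, type) spans predicted by the entity model (inclusive ends)."""
    out = []
    for i, tok in enumerate(tokens):
        if i == subj[0]:
            out.append(f"<S:{subj[2]}>")
        if i == obj[0]:
            out.append(f"<O:{obj[2]}>")
        out.append(tok)
        if i == subj[1]:
            out.append(f"</S:{subj[2]}>")
        if i == obj[1]:
            out.append(f"</O:{obj[2]}>")
    return out

tokens = ["Steve", "Jobs", "founded", "Apple", "."]
print(" ".join(insert_typed_markers(tokens, subj=(0, 1, "PER"), obj=(3, 3, "ORG"))))
# <S:PER> Steve Jobs </S:PER> founded <O:ORG> Apple </O:ORG> .
```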
Table Filling Multi-Task Recurrent Neural Network for Joint Entity and Relation Extraction
This paper proposes a context-aware approach to joint entity and word-level relation extraction based on semantic composition of words, introducing a Table Filling Multi-Task Recurrent Neural Network (TF-MTRNN) model that reduces entity recognition and relation classification to a table-filling problem and models their interdependencies.
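A toy illustration of the table-filling formulation, with made-up label names: a word-by-word table whose diagonal holds entity tags and whose off-diagonal cells hold the relation label between the corresponding word pair.

```python
tokens = ["Steve", "Jobs", "founded", "Apple"]
n = len(tokens)
table = [["_" for _ in range(n)] for _ in range(n)]

# Diagonal: entity recognition as per-token tags.
for i, tag in enumerate(["B-PER", "I-PER", "O", "B-ORG"]):
    table[i][i] = tag

# Off-diagonal: relation classification between token pairs.
table[1][3] = "Founder"   # (Jobs, Apple)

for row in table:
    print("\t".join(f"{cell:8}" for cell in row))
```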
Adversarial training for multi-context joint entity and relation extraction
Adversarial training (AT) is a regularization method that can improve the robustness of neural network methods by adding small perturbations to the training data.
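A hedged PyTorch sketch of the adversarial training idea, FGM-style and applied to word embeddings as is common in NLP, not necessarily the authors' exact setup: perturb the embeddings along the loss gradient, bounded by epsilon, and train on the perturbed input as well. The `model` here is assumed to consume embeddings directly.

```python
import torch

def adversarial_step(model, embeddings, labels, loss_fn, epsilon=1e-2):
    """Return the combined clean + adversarial loss for one batch."""
    embeddings = embeddings.clone().detach().requires_grad_(True)
    loss = loss_fn(model(embeddings), labels)
    grad, = torch.autograd.grad(loss, embeddings)
    # Small worst-case perturbation along the gradient direction.
    perturbation = epsilon * grad / (grad.norm() + 1e-12)
    adv_loss = loss_fn(model(embeddings + perturbation), labels)
    return loss + adv_loss  # caller backpropagates this combined objective
```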