Relation Extraction
591 papers with code • 49 benchmarks • 68 datasets
Relation Extraction is the task of predicting attributes and relations for entities in a sentence. For example, given the sentence “Barack Obama was born in Honolulu, Hawaii.”, a relation classifier should predict the relation “bornInCity” between the entities “Barack Obama” and “Honolulu”. Relation Extraction is a key component for building relation knowledge graphs, and it is crucial for natural language processing applications such as structured search, sentiment analysis, question answering, and summarization.
Source: Deep Residual Learning for Weakly-Supervised Relation Extraction
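To make the task concrete, here is a minimal sketch of sentence-level relation classification with a pretrained encoder. The checkpoint name, the [E1]/[E2] entity-marker tokens, and the label set are illustrative assumptions rather than a specific published model; any classifier fine-tuned on a relation extraction benchmark such as TACRED or SemEval-2010 Task 8 could stand in.

```python
# Minimal relation classification sketch using Hugging Face Transformers.
# "my-org/bert-relation-classifier" is a hypothetical checkpoint, assumed to
# have been fine-tuned with [E1]/[E2] entity markers in its tokenizer.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_NAME = "my-org/bert-relation-classifier"  # hypothetical checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME)

# Mark the entity pair so the model knows which relation to classify.
sentence = "[E1] Barack Obama [/E1] was born in [E2] Honolulu [/E2], Hawaii."

inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring class index back to a relation label.
relation = model.config.id2label[logits.argmax(dim=-1).item()]
print(relation)  # e.g. "bornInCity" under the assumed label set
```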
Libraries
Use these libraries to find Relation Extraction models and implementations.
Subtasks
- Relation Classification
- Joint Entity and Relation Extraction
- Document-level Relation Extraction
- Temporal Relation Extraction
- Dialog Relation Extraction
- Relationship Extraction (Distant Supervised)
- Continual Relation Extraction
- Binary Relation Extraction
- Zero-shot Relation Triplet Extraction
- 4-ary Relation Extraction
- DrugProt
- Relation Explanation
- Hyper-Relational Extraction
- Multi-Labeled Relation Extraction
- Relation Mention Extraction
Most implemented papers
BioBERT: a pre-trained biomedical language representation model for biomedical text mining
Biomedical text mining is becoming increasingly important as the number of biomedical documents rapidly grows.
LayoutLM: Pre-training of Text and Layout for Document Image Understanding
In this paper, we propose the LayoutLM to jointly model interactions between text and layout information across scanned document images, which is beneficial for a great number of real-world document image understanding tasks such as information extraction from scanned documents.
Matching the Blanks: Distributional Similarity for Relation Learning
General purpose relation extractors, which can model arbitrary relations, are a core aspiration in information extraction.
LUKE: Deep Contextualized Entity Representations with Entity-aware Self-attention
In this paper, we propose new pretrained contextualized representations of words and entities based on the bidirectional transformer.
Simplifying Graph Convolutional Networks
Graph Convolutional Networks (GCNs) and their variants have experienced significant attention and have become the de facto methods for learning graph representations.
Joint entity recognition and relation extraction as a multi-head selection problem
State-of-the-art models for joint entity recognition and relation extraction strongly rely on external natural language processing (NLP) tools such as POS (part-of-speech) taggers and dependency parsers.
Enriching Pre-trained Language Model with Entity Information for Relation Classification
In this paper, we propose a model that both leverages the pre-trained BERT language model and incorporates information from the target entities to tackle the relation classification task.
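The entity-aware idea can be sketched as follows: pool the encoder's hidden states over each entity span, concatenate the two span vectors with the sentence-level [CLS] vector, and classify. This is a simplified approximation, assuming mean pooling, a single linear head, and the 19 relation classes of SemEval-2010 Task 8; the paper's exact architecture adds further layers.

```python
# Simplified sketch of a BERT relation classifier that combines the [CLS]
# representation with pooled entity-span representations (in the spirit of
# this paper); layer sizes and label count are illustrative assumptions.
import torch
import torch.nn as nn
from transformers import AutoModel

class EntityAwareRelationClassifier(nn.Module):
    def __init__(self, encoder_name="bert-base-uncased", num_relations=19):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        # [CLS] vector + mean-pooled entity-1 span + mean-pooled entity-2 span
        self.classifier = nn.Linear(3 * hidden, num_relations)

    def forward(self, input_ids, attention_mask, e1_mask, e2_mask):
        # e1_mask / e2_mask: (batch, seq_len) 0/1 masks over each entity span
        h = self.encoder(input_ids=input_ids,
                         attention_mask=attention_mask).last_hidden_state
        cls_vec = h[:, 0]  # sentence-level [CLS] representation
        e1_vec = (h * e1_mask.unsqueeze(-1)).sum(1) / e1_mask.sum(1, keepdim=True)
        e2_vec = (h * e2_mask.unsqueeze(-1)).sum(1) / e2_mask.sum(1, keepdim=True)
        return self.classifier(torch.cat([cls_vec, e1_vec, e2_vec], dim=-1))
```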
SpanBERT: Improving Pre-training by Representing and Predicting Spans
We present SpanBERT, a pre-training method that is designed to better represent and predict spans of text.
Improving Distantly Supervised Relation Extraction using Word and Entity Based Attention
Relation extraction is the problem of classifying the relationship between two entities in a given sentence.
The Natural Language Decathlon: Multitask Learning as Question Answering
Though designed for decaNLP, MQAN also achieves state-of-the-art results on the WikiSQL semantic parsing task in the single-task setting.