Relation Extraction
463 papers with code • 40 benchmarks • 55 datasets
Relation Extraction is the task of predicting attributes and relations for entities in a sentence. For example, given the sentence "Barack Obama was born in Honolulu, Hawaii.", a relation classifier aims to predict the relation "bornInCity" between the entities "Barack Obama" and "Honolulu". Relation Extraction is a key component for building relation knowledge graphs, and it is crucial to natural language processing applications such as structured search, sentiment analysis, question answering, and summarization.
Source: Deep Residual Learning for Weakly-Supervised Relation Extraction
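As an illustration, here is a minimal sketch that frames relation classification as sequence classification with a BERT-style encoder. The label set, the inline [E1]/[E2] entity markers, and the checkpoint are placeholders, and the classification head is untrained: it would need fine-tuning on labeled relation data before its predictions mean anything.

```python
# Minimal sketch: relation classification as sequence classification.
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

labels = ["no_relation", "bornInCity", "employedBy"]  # hypothetical label set

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-cased", num_labels=len(labels)
)  # randomly initialized head: fine-tune on relation data before use

# Mark the entity pair inline so the classifier knows which pair to label.
text = "[E1] Barack Obama [/E1] was born in [E2] Honolulu [/E2], Hawaii."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(labels[logits.argmax(-1).item()])
```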
Libraries
Use these libraries to find Relation Extraction models and implementations.
Most implemented papers
BioBERT: a pre-trained biomedical language representation model for biomedical text mining
Biomedical text mining is becoming increasingly important as the number of biomedical documents rapidly grows.
Matching the Blanks: Distributional Similarity for Relation Learning
General purpose relation extractors, which can model arbitrary relations, are a core aspiration in information extraction.
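A toy sketch of the blank-replacement step described in the paper: each entity mention is replaced with a [BLANK] token with some probability, so the encoder must represent the relation from context rather than entity surface forms. The paper's entity-marker tokens and its pairwise matching loss are omitted here.

```python
import random

BLANK = "[BLANK]"

def blank_entities(tokens, e1_span, e2_span, alpha=0.7):
    """Replace each entity mention with [BLANK] with probability alpha,
    forcing the model to rely on context (the 'matching the blanks'
    training signal). Spans are processed right-to-left so earlier
    indices stay valid after replacement."""
    out = list(tokens)
    for start, end in sorted((e1_span, e2_span), reverse=True):
        if random.random() < alpha:
            out[start:end] = [BLANK]
    return out

tokens = "Barack Obama was born in Honolulu , Hawaii .".split()
print(blank_entities(tokens, (0, 2), (5, 6)))
```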
Simplifying Graph Convolutional Networks
Graph Convolutional Networks (GCNs) and their variants have experienced significant attention and have become the de facto methods for learning graph representations.
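The paper's core observation is that a GCN without nonlinearities collapses into K-step feature propagation followed by a single linear classifier. A NumPy sketch of that propagation step:

```python
import numpy as np

def sgc_features(adj, features, k=2):
    """Simplified Graph Convolution: propagate features K steps with the
    symmetrically normalized adjacency (with self-loops), S = A_hat^K X.
    The result feeds any linear classifier, e.g. logistic regression."""
    a = adj + np.eye(adj.shape[0])          # add self-loops
    d_inv_sqrt = np.diag(a.sum(1) ** -0.5)  # symmetric normalization
    a_hat = d_inv_sqrt @ a @ d_inv_sqrt
    x = features
    for _ in range(k):
        x = a_hat @ x
    return x

adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
x = np.eye(3)  # one-hot node features
print(sgc_features(adj, x, k=2))
```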
LUKE: Deep Contextualized Entity Representations with Entity-aware Self-attention
In this paper, we propose new pretrained contextualized representations of words and entities based on the bidirectional transformer.
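LUKE ships with the Hugging Face transformers library, including an entity-pair classification head. A short usage sketch, assuming the TACRED-fine-tuned checkpoint published by the authors on the Hugging Face hub:

```python
from transformers import LukeTokenizer, LukeForEntityPairClassification

name = "studio-ousia/luke-large-finetuned-tacred"
tokenizer = LukeTokenizer.from_pretrained(name)
model = LukeForEntityPairClassification.from_pretrained(name)

text = "Beyoncé lives in Los Angeles."
entity_spans = [(0, 7), (17, 28)]  # character spans of head/tail entities
inputs = tokenizer(text, entity_spans=entity_spans, return_tensors="pt")
logits = model(**inputs).logits
print(model.config.id2label[int(logits.argmax(-1))])
```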
Joint entity recognition and relation extraction as a multi-head selection problem
State-of-the-art models for joint entity recognition and relation extraction strongly rely on external natural language processing (NLP) tools such as POS (part-of-speech) taggers and dependency parsers.
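A minimal PyTorch sketch of the multi-head selection idea, following the additive scoring form described in the paper: for every token pair (i, j) and relation r, score whether token j is a head of token i under r, with sigmoid scores so a token can have multiple heads. Layer sizes here are illustrative.

```python
import torch
import torch.nn as nn

class MultiHeadSelection(nn.Module):
    """Score every (token i, token j, relation r) triple from encoder
    states; independent sigmoids allow multiple relations per token."""
    def __init__(self, hidden, n_relations, dim=64):
        super().__init__()
        self.u = nn.Linear(hidden, dim)
        self.w = nn.Linear(hidden, dim)
        self.v = nn.Linear(dim, n_relations)

    def forward(self, h):                      # h: (seq_len, hidden)
        scores = self.v(torch.relu(
            self.u(h).unsqueeze(1) + self.w(h).unsqueeze(0)
        ))                                     # (seq_len, seq_len, n_relations)
        return torch.sigmoid(scores)

h = torch.randn(6, 128)                        # encoder states for 6 tokens
probs = MultiHeadSelection(128, n_relations=4)(h)
print(probs.shape)  # torch.Size([6, 6, 4])
```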
Improving Distantly Supervised Relation Extraction using Word and Entity Based Attention
Relation extraction is the problem of classifying the relationship between two entities in a given sentence.
The Natural Language Decathlon: Multitask Learning as Question Answering
Though designed for decaNLP, MQAN also achieves state-of-the-art results on the WikiSQL semantic parsing task in the single-task setting.
Semantic Relation Classification via Bidirectional LSTM Networks with Entity-aware Attention using Latent Entity Typing
Our model not only effectively utilizes entities and their latent types as features, but is also more interpretable, as shown by visualizing its attention mechanisms and the results of latent entity typing (LET).
Enriching Pre-trained Language Model with Entity Information for Relation Classification
In this paper, we propose a model that both leverages the pre-trained BERT language model and incorporates information from the target entities to tackle the relation classification task.
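A sketch of the input preprocessing described in the paper (R-BERT): the two target entities are wrapped with marker characters ('$' around the first, '#' around the second) before the sentence is fed to BERT, so the model can locate both entities. The full model additionally pools the hidden states of each marked entity and concatenates them with the [CLS] representation, which is omitted here.

```python
def mark_entities(text, e1, e2):
    """Wrap the two target entities with the paper's marker characters
    ('$' for entity 1, '#' for entity 2) before tokenization."""
    text = text.replace(e1, f"$ {e1} $", 1)
    text = text.replace(e2, f"# {e2} #", 1)
    return text

print(mark_entities(
    "Barack Obama was born in Honolulu, Hawaii.", "Barack Obama", "Honolulu"
))
# -> "$ Barack Obama $ was born in # Honolulu #, Hawaii."
```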
SpanBERT: Improving Pre-training by Representing and Predicting Spans
We present SpanBERT, a pre-training method that is designed to better represent and predict spans of text.
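A toy sketch of SpanBERT's masking scheme: contiguous span lengths are sampled from a geometric distribution (clipped at a maximum length) until roughly 15% of tokens are masked. The span-boundary objective (SBO), which trains the model to predict each span from its boundary tokens, is not shown.

```python
import numpy as np

def sample_span_mask(n_tokens, budget=0.15, p=0.2, max_len=10, rng=np.random):
    """Sample contiguous masked spans with geometrically distributed
    lengths (clipped at max_len) until ~budget of tokens are masked."""
    mask = np.zeros(n_tokens, dtype=bool)
    while mask.sum() < budget * n_tokens:
        length = min(rng.geometric(p), max_len)
        start = rng.randint(0, max(1, n_tokens - length))
        mask[start:start + length] = True
    return mask

print(sample_span_mask(50))
```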