Relation Extraction

309 papers with code • 27 benchmarks • 37 datasets

Relation Extraction is the task of predicting attributes and relations for entities in a sentence. For example, given the sentence “Barack Obama was born in Honolulu, Hawaii.”, a relation classifier aims to predict the relation “bornInCity” between the entity mentions “Barack Obama” and “Honolulu”. Relation Extraction is a key component for building relational knowledge graphs, and it is crucial for natural language processing applications such as structured search, sentiment analysis, question answering, and summarization.

Source: Deep Residual Learning for Weakly-Supervised Relation Extraction
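
To make the input/output concrete, a sentence-level relation extraction instance pairs a sentence with the character spans of a head and a tail entity and asks for a relation label. A minimal sketch (the span format and label set below are illustrative, not tied to any particular dataset):

```python
# Illustrative relation extraction instance; the span format and label set
# are made up for this example, not taken from any specific dataset.
sentence = "Barack Obama was born in Honolulu, Hawaii."

instance = {
    "text": sentence,
    "head": {"mention": "Barack Obama", "span": (0, 12)},  # subject entity
    "tail": {"mention": "Honolulu", "span": (25, 33)},     # object entity
}

# A relation classifier maps (text, head, tail) to a label from a fixed schema,
# e.g. {"bornInCity", "employedBy", "spouseOf", "no_relation", ...}.
predicted_relation = "bornInCity"

print(instance["head"]["mention"], predicted_relation, instance["tail"]["mention"])
# Barack Obama bornInCity Honolulu
```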

Greatest papers with code

Stanza: A Python Natural Language Processing Toolkit for Many Human Languages

stanfordnlp/stanza ACL 2020

We introduce Stanza, an open-source Python natural language processing toolkit supporting 66 human languages.

Coreference Resolution, Dependency Parsing, +4
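
Stanza does not ship a relation classifier itself, but its NER and dependency parsing output is a typical pre-processing step for relation extraction. A minimal usage sketch (assumes the English models can be downloaded):

```python
import stanza

# One-time download of the English models (requires network access).
stanza.download("en")

# Pipeline with the processors most useful as relation extraction features.
nlp = stanza.Pipeline(lang="en", processors="tokenize,pos,lemma,depparse,ner")

doc = nlp("Barack Obama was born in Honolulu, Hawaii.")

# Named entities: candidate relation arguments.
for ent in doc.ents:
    print(ent.text, ent.type)

# Dependency edges, often used as features by relation classifiers.
for sent in doc.sentences:
    for word in sent.words:
        head = sent.words[word.head - 1].text if word.head > 0 else "ROOT"
        print(f"{word.text} --{word.deprel}--> {head}")
```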

Manual Evaluation Matters: Reviewing Test Protocols of Distantly Supervised Relation Extraction

thunlp/OpenNRE 20 May 2021

Distantly supervised (DS) relation extraction (RE) has attracted much attention in the past few years as it can utilize large-scale auto-labeled data.

Relation Extraction

OpenNRE: An Open and Extensible Toolkit for Neural Relation Extraction

thunlp/OpenNRE IJCNLP 2019

OpenNRE is an open-source and extensible toolkit that provides a unified framework to implement neural models for relation extraction (RE).

Information Retrieval, Question Answering, +1
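
A minimal inference sketch following the usage pattern shown in the OpenNRE README (the pretrained model name and the character-offset format for the head and tail entities are taken from that README and may change between releases):

```python
import opennre

# Load a pretrained sentence-level model (weights are downloaded on first use).
model = opennre.get_model("wiki80_cnn_softmax")

# 'h' and 't' give the character offsets of the head and tail entity mentions.
result = model.infer({
    "text": "Barack Obama was born in Honolulu, Hawaii.",
    "h": {"pos": (0, 12)},   # "Barack Obama"
    "t": {"pos": (25, 33)},  # "Honolulu"
})

print(result)  # (predicted relation name, confidence score)
```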

BioMegatron: Larger Biomedical Domain Language Model

NVIDIA/NeMo EMNLP 2020

There has been an influx of biomedical domain-specific language models, showing that language models pre-trained on biomedical text perform better on biomedical benchmarks than those trained on general-domain text corpora such as Wikipedia and Books.

Language Modelling, Named Entity Recognition, +2

Knowledge Representation Learning: A Quantitative Review

thunlp/OpenKE 28 Dec 2018

Knowledge representation learning (KRL) aims to represent the entities and relations of a knowledge graph in a low-dimensional semantic space; these representations have been widely used in knowledge-driven tasks.

General Classification, Information Retrieval, +7
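
A common way to see what KRL does is through TransE, one of the translation-based models covered by the review: a triple (h, r, t) is scored by how close h + r is to t in the embedding space. A toy NumPy sketch with random, untrained embeddings, purely to show the scoring function:

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 50

# Toy embedding tables; in practice these vectors are learned from KG triples.
entities = {name: rng.normal(size=dim)
            for name in ("BarackObama", "Honolulu", "Hawaii")}
relations = {name: rng.normal(size=dim)
             for name in ("bornInCity", "locatedIn")}

def transe_score(head: str, relation: str, tail: str) -> float:
    """TransE plausibility: smaller ||h + r - t|| means a more plausible triple."""
    h, r, t = entities[head], relations[relation], entities[tail]
    return float(np.linalg.norm(h + r - t))

# With trained embeddings, true triples would score lower than corrupted ones;
# here the value is meaningless because the embeddings are random.
print(transe_score("BarackObama", "bornInCity", "Honolulu"))
```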

Robustly Pre-trained Neural Model for Direct Temporal Relation Extraction

makcedward/nlpaug 13 Apr 2020

Background: Identifying relationships between clinical events and temporal expressions is a key challenge in meaningfully analyzing clinical text for use in advanced AI applications.

Language Modelling, Relation Extraction

The Natural Language Decathlon: Multitask Learning as Question Answering

salesforce/decaNLP ICLR 2019

Though designed for decaNLP, the proposed multitask question answering network (MQAN) also achieves state-of-the-art results on the WikiSQL semantic parsing task in the single-task setting.

Domain Adaptation, Machine Translation, +9
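
decaNLP casts every task, including relation extraction, as a (question, context, answer) triple so that one QA-style model (MQAN) can handle all ten tasks. A sketch of that data format with illustrative question wordings, not the exact prompts used in the benchmark:

```python
# Every decaNLP example is a (question, context, answer) triple, so one
# QA-style model can serve all ten tasks.  The question templates below are
# illustrative, not the ones shipped with the benchmark.
examples = [
    {   # relation extraction posed as question answering
        "question": "Where was Barack Obama born?",
        "context": "Barack Obama was born in Honolulu, Hawaii.",
        "answer": "Honolulu, Hawaii",
    },
    {   # sentiment analysis in the same format
        "question": "Is this review negative or positive?",
        "context": "The movie was a delight from start to finish.",
        "answer": "positive",
    },
]

for ex in examples:
    print(ex["question"], "->", ex["answer"])
```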

ERNIE: Enhanced Language Representation with Informative Entities

thunlp/ERNIE ACL 2019

Neural language representation models such as BERT, pre-trained on large-scale corpora, can capture rich semantic patterns from plain text and be fine-tuned to consistently improve performance on various NLP tasks.

Entity Linking, Entity Typing, +5

A Hierarchical Multi-task Approach for Learning Embeddings from Semantic Tasks

huggingface/hmtl 14 Nov 2018

The model is trained in a hierarchical fashion to introduce an inductive bias: a set of low-level tasks is supervised at the bottom layers of the model and more complex tasks at its top layers.

Ranked #7 on Relation Extraction on ACE 2005 (using extra training data)

Multi-Task Learning, Named Entity Recognition, +1
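
A toy PyTorch sketch of the hierarchical supervision idea: a low-level task head (e.g. named entity recognition) is attached to the bottom encoder, and a more complex task head (e.g. relation classification) is attached to a second encoder stacked on top. Module names and sizes are illustrative, not HMTL's actual architecture:

```python
import torch
import torch.nn as nn

class HierarchicalMultiTask(nn.Module):
    """Toy hierarchical multi-task model: a simple task head sits on the
    bottom encoder, a more complex task head on an encoder stacked above it."""

    def __init__(self, vocab_size=10000, emb=128, hidden=128,
                 n_ner_tags=9, n_relations=7):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb)
        self.low_encoder = nn.LSTM(emb, hidden, batch_first=True, bidirectional=True)
        self.ner_head = nn.Linear(2 * hidden, n_ner_tags)        # low-level task
        self.high_encoder = nn.LSTM(2 * hidden, hidden, batch_first=True,
                                    bidirectional=True)
        self.rel_head = nn.Linear(2 * hidden, n_relations)       # complex task

    def forward(self, token_ids):
        x = self.embed(token_ids)
        low, _ = self.low_encoder(x)                  # supervised with the NER loss
        ner_logits = self.ner_head(low)               # per-token tag scores
        high, _ = self.high_encoder(low)              # builds on low-level states
        rel_logits = self.rel_head(high.mean(dim=1))  # sentence-level relation scores
        return ner_logits, rel_logits

model = HierarchicalMultiTask()
tokens = torch.randint(0, 10000, (2, 12))             # batch of 2 dummy sentences
ner_logits, rel_logits = model(tokens)
print(ner_logits.shape, rel_logits.shape)              # (2, 12, 9) and (2, 7)
```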