Relation Classification

140 papers with code • 8 benchmarks • 23 datasets

Relation Classification is the task of identifying the semantic relation that holds between two nominal entities in text.

Source: Structure Regularized Neural Network for Entity Relation Classification for Chinese Literature Text
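A minimal sketch of the task formulation, assuming the common "entity marker" setup used with pretrained encoders (not the method of any specific paper listed below). The model name, marker tokens, and label set are illustrative placeholders; in practice the classification head would be fine-tuned on a relation dataset such as SemEval-2010 Task 8.

```python
# Hedged sketch: classify the relation between two marked nominals in a sentence.
# "bert-base-uncased" and the LABELS list are illustrative assumptions, not from the papers above.
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

LABELS = ["Cause-Effect", "Component-Whole", "Instrument-Agency", "Other"]  # example subset

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=len(LABELS)
)  # classification head is untrained here; fine-tune before real use

def classify_relation(sentence: str, e1: str, e2: str) -> str:
    # Mark the two nominal entities so the encoder can attend to them explicitly.
    marked = sentence.replace(e1, f"<e1> {e1} </e1>").replace(e2, f"<e2> {e2} </e2>")
    inputs = tokenizer(marked, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    return LABELS[logits.argmax(dim=-1).item()]

print(classify_relation("The burst was caused by the pressure.", "burst", "pressure"))
```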

Most implemented papers

Improved Relation Extraction with Feature-Rich Compositional Embedding Models

mgormley/pacaya EMNLP 2015

We propose a Feature-rich Compositional Embedding Model (FCM) for relation extraction that is expressive, generalizes to new domains, and is easy to implement.

Relation Classification via Recurrent Neural Network

DavidMortensen/Bachelor-readinglist 5 Aug 2015

Deep learning has gained much success in sentence-level relation classification.

A Latent Variable Recurrent Neural Network for Discourse Relation Language Models

jiyfeng/drlm 7 Mar 2016

This paper presents a novel latent variable recurrent neural network architecture for jointly modeling sequences of words and (possibly latent) discourse relations between adjacent sentences.

Learning Semantically and Additively Compositional Distributional Representations

tianran/vecdcs ACL 2016

This paper connects a vector-based composition model to a formal semantics, the Dependency-based Compositional Semantics (DCS).

CogALex-V Shared Task: LexNET - Integrated Path-based and Distributional Method for the Identification of Semantic Relations

vered1986/LexNET WS 2016

The reported results in the shared task bring this submission to third place on subtask 1 (word relatedness) and first place on subtask 2 (semantic relation classification), demonstrating the utility of integrating complementary path-based and distributional information sources when recognizing concrete semantic relations.
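A simplified sketch (not the authors' code) of the path-based plus distributional integration behind LexNET: encode the dependency paths linking a term pair with an LSTM, average the path vectors, and concatenate them with the distributional embeddings of the two terms before classifying the relation. Dimensions and vocabulary sizes are illustrative assumptions.

```python
# Hedged sketch of integrating path-based and distributional features for relation classification.
import torch
import torch.nn as nn

class PathDistClassifier(nn.Module):
    def __init__(self, path_vocab, word_vocab, emb_dim=50, hidden=60, n_relations=5):
        super().__init__()
        self.path_emb = nn.Embedding(path_vocab, emb_dim)   # embeddings of dependency-path edges
        self.word_emb = nn.Embedding(word_vocab, emb_dim)   # distributional word vectors
        self.path_lstm = nn.LSTM(emb_dim, hidden, batch_first=True)
        self.classifier = nn.Linear(hidden + 2 * emb_dim, n_relations)

    def forward(self, paths, x, y):
        # paths: (num_paths, path_len) edge ids for one term pair; x, y: term ids.
        _, (h, _) = self.path_lstm(self.path_emb(paths))     # encode each path
        path_vec = h[-1].mean(dim=0)                         # average over the paths
        pair_vec = torch.cat([path_vec, self.word_emb(x), self.word_emb(y)])
        return self.classifier(pair_vec)

model = PathDistClassifier(path_vocab=1000, word_vocab=5000)
paths = torch.randint(0, 1000, (3, 4))                       # 3 dummy paths of length 4
logits = model(paths, torch.tensor(10), torch.tensor(42))
print(logits.shape)                                          # scores over relation labels
```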

CATENA: CAusal and TEmporal relation extraction from NAtural language texts

paramitamirza/CATENA COLING 2016

The effects of the interaction between the temporal and the causal components, although limited, yield promising results and confirm the tight connection between the temporal and causal dimensions of texts.

Table Filling Multi-Task Recurrent Neural Network for Joint Entity and Relation Extraction

pgcool/TF-MTRNN COLING 2016

This paper proposes a novel context-aware approach to joint entity and word-level relation extraction through semantic composition of words. It introduces the Table Filling Multi-Task Recurrent Neural Network (TF-MTRNN) model, which reduces entity recognition and relation classification to a table-filling problem and models their interdependencies.
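An illustrative sketch of the table-filling formulation that TF-MTRNN builds on: for an n-word sentence, an n x n table whose diagonal cells carry entity tags and whose off-diagonal cells carry word-level relation labels. The sentence, tags, and labels below are hypothetical; the actual model predicts every cell jointly with a multi-task recurrent network.

```python
# Hedged sketch of the table-filling view of joint entity and relation extraction.
words = ["John", "works", "for", "Acme"]
n = len(words)

# Initialise the table: diagonal = entity tag, off-diagonal = relation label.
table = [["O" if i == j else "NONE" for j in range(n)] for i in range(n)]

# Gold annotations for the toy sentence (hypothetical labels).
table[0][0] = "PER"          # "John" is a person entity
table[3][3] = "ORG"          # "Acme" is an organisation entity
table[0][3] = "Works_For"    # relation between word 0 and word 3

for i in range(n):
    for j in range(n):
        print(f"({words[i]:>5}, {words[j]:<5}) -> {table[i][j]}")
```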