132 papers with code • 8 benchmarks • 22 datasets
Relation Classification is the task of identifying the semantic relation holding between two nominal entities in text.
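To make the task format concrete, here is a minimal sketch of the input/output contract: a sentence plus a marked entity pair goes in, a relation label comes out. The pattern rules and label names below (SemEval-style labels such as `Cause-Effect(e2,e1)`) are invented for illustration only, not a real system.

```python
# Toy illustration of relation classification: given a sentence and two
# marked entities, predict the semantic relation holding between them.
# The surface patterns and labels here are hypothetical examples.

def classify_relation(sentence: str, head: str, tail: str) -> str:
    """Assign a relation label to the (head, tail) entity pair."""
    text = sentence.lower()
    # A few illustrative surface patterns (invented for this sketch).
    if f"{head.lower()} was caused by" in text:
        return "Cause-Effect(e2,e1)"
    if f"{head.lower()} is part of" in text:
        return "Component-Whole(e1,e2)"
    return "Other"

label = classify_relation(
    "The burst was caused by a pressure spike.", "burst", "pressure spike"
)
print(label)  # → Cause-Effect(e2,e1)
```

Real systems replace these hand-written patterns with learned features or neural encoders, as the papers listed below describe.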
General purpose relation extractors, which can model arbitrary relations, are a core aspiration in information extraction.
In this paper, we propose new pretrained contextualized representations of words and entities based on the bidirectional transformer.
In this paper, we propose a model that both leverages the pre-trained BERT language model and incorporates information from the target entities to tackle the relation classification task.
Semantic Relation Classification via Bidirectional LSTM Networks with Entity-aware Attention using Latent Entity Typing
Our model not only uses entities and their latent types as effective features but is also more interpretable, as shown by visualizing the attention mechanisms applied to our model and the results of latent entity typing (LET).
This work presents our contribution in the context of the 6th task of SemEval-2020: Extracting Definitions from Free Text in Textbooks (DeftEval).
Relation classification is an important semantic processing task for which state-of-the-art systems still rely on costly handcrafted features.
Experimental performance on the task of relation classification has generally improved with the use of deep neural network architectures.