
Multi-Level Structured Self-Attentions for Distantly Supervised Relation Extraction

Attention mechanisms are often used in deep neural networks for distantly supervised relation extraction (DS-RE) to distinguish valid from noisy instances. However, traditional 1-D vector attention is insufficient for learning the different contexts needed to select valid instances when predicting the relation for an entity pair...
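The contrast the abstract draws between 1-D vector attention and richer 2-D attention can be illustrated with a structured self-attention sketch in the style of Lin et al. (2017): instead of a single weight vector yielding one attention distribution over instances, a weight matrix yields several attention "hops", giving a 2-D attention matrix. The sketch below is illustrative only; all dimensions and variable names (`H`, `Ws1`, `Ws2`, `r`) are assumptions, not the paper's actual architecture:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Hypothetical dimensions: n instances in a bag, hidden size d,
# attention hidden size da, and r attention hops.
rng = np.random.default_rng(0)
n, d, da, r = 5, 8, 6, 3

H = rng.normal(size=(n, d))      # instance representations (assumed given)
Ws1 = rng.normal(size=(da, d))   # first attention projection
Ws2 = rng.normal(size=(r, da))   # second projection: r hops instead of one

# A = softmax(Ws2 @ tanh(Ws1 @ H^T)) has shape (r, n): a 2-D attention
# matrix whose rows are r distinct distributions over the n instances,
# versus a single (1, n) distribution in 1-D vector attention.
A = softmax(Ws2 @ np.tanh(Ws1 @ H.T), axis=1)

# The bag is summarized by r context vectors rather than one.
M = A @ H                        # shape (r, d)
print(A.shape, M.shape)
```

Each row of `A` can focus on a different context when weighing instances, which is the kind of multi-view selection the abstract argues 1-D attention cannot express.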
