Improving Distantly Supervised Relation Extraction using Word and Entity Based Attention

19 Apr 2018  ·  Sharmistha Jat, Siddhesh Khandelwal, Partha Talukdar

Relation extraction is the problem of classifying the relationship between two entities in a given sentence. Distant Supervision (DS) is a popular technique for developing relation extractors starting with limited supervision. We note that most sentences in the distant supervision relation extraction setting are very long and may benefit from word-level attention for better sentence representation. Our contributions in this paper are threefold. Firstly, we propose two novel word attention models for distantly supervised relation extraction, (1) a Bi-directional Gated Recurrent Unit (Bi-GRU) based word attention model (BGWA) and (2) an entity-centric attention model (EA), as well as (3) a combination model that combines multiple complementary models using a weighted voting method for improved relation extraction. Secondly, we introduce GDS, a new distant supervision dataset for relation extraction. GDS removes the test-data noise present in all previous distant supervision benchmark datasets, making credible automatic evaluation possible. Thirdly, through extensive experiments on multiple real-world datasets, we demonstrate the effectiveness of the proposed methods.
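To make the BGWA idea concrete, the sketch below shows a minimal Bi-GRU encoder with word-level attention that pools contextual word representations into a single sentence vector before relation classification. This is an illustrative reconstruction, not the authors' released code: the framework (PyTorch), the class name BiGRUWordAttention, the scoring layer, and all dimensions are assumptions for the example; the paper's exact architecture and hyper-parameters may differ.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BiGRUWordAttention(nn.Module):
    """Minimal BGWA-style sketch: Bi-GRU encoder + word attention pooling.

    Hypothetical hyper-parameters; not the paper's exact configuration.
    """

    def __init__(self, vocab_size, embed_dim=100, hidden_dim=100, num_relations=53):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        # Scores each word's contextual representation to produce attention weights.
        self.word_score = nn.Linear(2 * hidden_dim, 1)
        self.classifier = nn.Linear(2 * hidden_dim, num_relations)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len)
        h, _ = self.gru(self.embed(token_ids))        # (batch, seq_len, 2 * hidden_dim)
        attn = F.softmax(self.word_score(h), dim=1)   # attention weights over words
        sentence_repr = (attn * h).sum(dim=1)         # attention-weighted sentence vector
        return self.classifier(sentence_repr)         # relation logits


# Example usage with toy dimensions: two sentences of 40 tokens each.
model = BiGRUWordAttention(vocab_size=5000)
logits = model(torch.randint(0, 5000, (2, 40)))
```

The entity-centric attention model (EA) follows the same pooling pattern but would score words by their relevance to the two entity mentions rather than with a single learned scoring layer, and the combination model would aggregate the per-model predictions with weighted voting.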

Task: Relationship Extraction (Distant Supervised)
Dataset: New York Times Corpus
Model: BGWA

Metric   Value   Global Rank
P@10%    70.9    #4
P@30%    52.4    #4
