KGPool: Dynamic Knowledge Graph Context Selection for Relation Extraction

We present a novel method for relation extraction (RE) from a single sentence, mapping the sentence and two given entities to a canonical fact in a knowledge graph (KG). In this sentential RE setting, the context of a single sentence is often sparse. This paper introduces the KGPool method to address this sparsity by dynamically expanding the context with additional facts from the KG. It learns representations of these facts (entity aliases, entity descriptions, etc.) with neural methods, supplementing the sentential context. Unlike existing methods that statically use all expanded facts, KGPool conditions this expansion on the sentence. We study the efficacy of KGPool by evaluating it with different neural models and KGs (Wikidata and NYT Freebase). Our experimental evaluation on standard datasets shows that feeding the KGPool representation into a Graph Neural Network yields significantly higher accuracy than state-of-the-art methods.
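The core idea above, keeping only the KG facts that are relevant to the given sentence rather than statically using all of them, can be illustrated with a minimal sketch. This is not the paper's actual model (KGPool learns its pooling scores with neural layers); it is a simplified stand-in that scores each fact embedding by cosine similarity with a sentence embedding and keeps the top-k. All names (`kg_context_pool`, `sentence_vec`, `fact_vecs`) are hypothetical.

```python
import numpy as np

def kg_context_pool(sentence_vec, fact_vecs, k=2):
    """Keep the k KG-fact embeddings most relevant to the sentence.

    Simplified sketch of sentence-conditioned context pooling: each
    expanded fact (e.g. an entity alias or description embedding) is
    scored against the sentence embedding, and only the top-k facts
    survive to be passed on (e.g. to a GNN). The paper's method learns
    these scores; here we use cosine similarity as a stand-in.
    """
    sent = sentence_vec / np.linalg.norm(sentence_vec)
    facts = fact_vecs / np.linalg.norm(fact_vecs, axis=1, keepdims=True)
    scores = facts @ sent                      # relevance of each fact
    keep = np.argsort(scores)[::-1][:k]       # indices of top-k facts
    keep = np.sort(keep)                      # preserve original order
    return keep, fact_vecs[keep]

# Toy usage: a sentence vector aligned with facts 0 and 2, not fact 1.
sentence = np.array([1.0, 0.0])
facts = np.array([[1.0, 0.0],   # fact 0: highly relevant
                  [0.0, 1.0],   # fact 1: irrelevant
                  [0.9, 0.1]])  # fact 2: relevant
idx, pooled = kg_context_pool(sentence, facts, k=2)
```

In the full method, the pooled fact representations would then join the sentence and entity nodes in a graph fed to a Graph Neural Network for relation classification.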

Published in Findings of the Association for Computational Linguistics (ACL) 2021.
Task | Dataset | Model | Metric | Value | Global Rank
Relationship Extraction (Distant Supervised) | New York Times Corpus | KGPool | P@10% | 92.3 | #1
Relationship Extraction (Distant Supervised) | New York Times Corpus | KGPool | P@30% | 86.7 | #1
Relation Extraction | NYT Corpus | KGPool | P@10% | 92.3 | #1
Relation Extraction | NYT Corpus | KGPool | P@30% | 86.7 | #1
