Learning Hierarchy-Aware Knowledge Graph Embeddings for Link Prediction

21 Nov 2019  ·  Zhanqiu Zhang, Jianyu Cai, Yongdong Zhang, Jie Wang ·

Knowledge graph embedding, which aims to represent entities and relations as low-dimensional vectors (or matrices, tensors, etc.), has been shown to be a powerful technique for predicting missing links in knowledge graphs. Existing knowledge graph embedding models mainly focus on modeling relation patterns such as symmetry/antisymmetry, inversion, and composition. However, many existing approaches fail to model semantic hierarchies, which are common in real-world applications. To address this challenge, we propose a novel knowledge graph embedding model -- namely, Hierarchy-Aware Knowledge Graph Embedding (HAKE) -- which maps entities into the polar coordinate system. HAKE is inspired by the fact that concentric circles in the polar coordinate system naturally reflect hierarchy. Specifically, the radial coordinate models entities at different levels of the hierarchy, with entities at higher levels expected to have smaller radii; the angular coordinate distinguishes entities at the same level of the hierarchy, which are expected to have roughly the same radii but different angles. Experiments demonstrate that HAKE effectively models the semantic hierarchies in knowledge graphs and significantly outperforms existing state-of-the-art methods on benchmark datasets for the link prediction task.
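The polar-coordinate idea above can be made concrete with a small sketch of a HAKE-style scoring function: each entity gets a modulus (radial) part and a phase (angular) part, a relation scales moduli and translates phases, and a triple scores higher the closer the transformed head lands to the tail. The weight `lam` on the phase term is a hypothetical constant here; in the paper it is a learned parameter, and this sketch omits the learned mixture bias on the modulus part.

```python
import numpy as np

def hake_score(h_mod, h_phase, r_mod, r_phase, t_mod, t_phase, lam=0.5):
    """Score a triple (h, r, t) in a polar-coordinate embedding.

    Modulus (radial) parts model hierarchy level; phase (angular)
    parts distinguish entities at the same level. Higher scores mean
    a more plausible triple. `lam` weights the phase term and is an
    illustrative default, not the paper's learned value.
    """
    # Radial distance: the relation scales the head modulus toward the tail.
    d_mod = np.linalg.norm(h_mod * r_mod - t_mod, ord=2)
    # Angular distance: the relation translates the head phase toward the
    # tail; sin((.)/2) makes the distance periodic with period 2*pi.
    d_phase = np.linalg.norm(np.sin((h_phase + r_phase - t_phase) / 2.0), ord=1)
    return -(d_mod + lam * d_phase)

# A triple whose tail matches the relation-transformed head exactly scores 0;
# any mismatch in modulus or phase pushes the score below 0.
h_m, h_p = np.array([2.0, 1.0]), np.array([0.3, 1.2])
r_m, r_p = np.array([0.5, 2.0]), np.array([0.1, -0.2])
print(hake_score(h_m, h_p, r_m, r_p, h_m * r_m, h_p + r_p))
```

In this formulation, two entities at the same hierarchy level (similar moduli) are separated only by the phase term, while entities at different levels are separated primarily by the modulus term, mirroring the radial/angular split described in the abstract.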


Results from the Paper

| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Knowledge Graph Completion | FB15k-237 | HAKE | Hits@10 | 54.2 | #2 |
| Link Prediction | FB15k-237 | HAKE | MRR | 0.346 | #36 |
| Link Prediction | FB15k-237 | HAKE | Hits@3 | 0.381 | #24 |
| Link Prediction | FB15k-237 | HAKE | Hits@1 | 0.25 | #32 |
| Link Prediction | WN18RR | HAKE | MRR | 0.497 | #6 |
| Link Prediction | WN18RR | HAKE | Hits@10 | 0.582 | #11 |
| Link Prediction | WN18RR | HAKE | Hits@3 | 0.516 | #5 |
| Link Prediction | WN18RR | HAKE | Hits@1 | 0.452 | #8 |
| Link Prediction | YAGO3-10 | HAKE | MRR | 0.545 | #6 |
| Link Prediction | YAGO3-10 | HAKE | Hits@10 | 0.694 | #7 |
| Link Prediction | YAGO3-10 | HAKE | Hits@1 | 0.462 | #6 |
| Link Prediction | YAGO3-10 | HAKE | Hits@3 | 0.596 | #2 |
