Low-Dimensional Hyperbolic Knowledge Graph Embeddings

Knowledge graph (KG) embeddings learn low-dimensional representations of entities and relations to predict missing facts. KGs often exhibit hierarchical and logical patterns which must be preserved in the embedding space. For hierarchical data, hyperbolic embedding methods have shown promise for high-fidelity and parsimonious representations. However, existing hyperbolic embedding methods do not account for the rich logical patterns in KGs. In this work, we introduce a class of hyperbolic KG embedding models that simultaneously capture hierarchical and logical patterns. Our approach combines hyperbolic reflections and rotations with attention to model complex relational patterns. Experimental results on standard KG benchmarks show that our method improves over previous Euclidean- and hyperbolic-based efforts by up to 6.1% in mean reciprocal rank (MRR) in low dimensions. Furthermore, we observe that different geometric transformations capture different types of relations while attention-based transformations generalize to multiple relations. In high dimensions, our approach yields new state-of-the-art MRRs of 49.6% on WN18RR and 57.7% on YAGO3-10.
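To make the geometric ingredients concrete, the sketch below shows a RotH-style scoring function on the Poincaré ball: the relation acts on the head embedding by a block-diagonal 2D rotation followed by a Möbius translation, and the score is the negative squared hyperbolic distance to the tail. This is an illustrative NumPy sketch with assumed names and shapes, not the authors' implementation (which additionally learns curvatures, entity biases, and attention over reflections and rotations).

```python
# Minimal NumPy sketch of a rotation-based hyperbolic scoring function
# (Poincare ball, curvature c = 1). All names/shapes are illustrative
# assumptions, not the paper's released code.
import numpy as np

def mobius_add(x, y, eps=1e-5):
    """Mobius addition of two points inside the unit ball."""
    xy = np.dot(x, y)
    x2 = np.dot(x, x)
    y2 = np.dot(y, y)
    num = (1 + 2 * xy + y2) * x + (1 - x2) * y
    den = 1 + 2 * xy + x2 * y2
    return num / max(den, eps)

def poincare_distance(x, y, eps=1e-5):
    """Geodesic distance between two points in the Poincare ball."""
    diff2 = np.dot(x - y, x - y)
    denom = (1 - np.dot(x, x)) * (1 - np.dot(y, y))
    return np.arccosh(1 + 2 * diff2 / max(denom, eps))

def rotate_2d_blocks(x, angles):
    """Apply block-diagonal 2D (Givens) rotations; a rotation about the
    origin is also an isometry of the hyperbolic ball."""
    out = x.copy()
    for i, theta in enumerate(angles):
        a, b = x[2 * i], x[2 * i + 1]
        out[2 * i] = np.cos(theta) * a - np.sin(theta) * b
        out[2 * i + 1] = np.sin(theta) * a + np.cos(theta) * b
    return out

def score(head, rel_angles, rel_trans, tail):
    """Higher is better: negative squared hyperbolic distance between the
    rotated-and-translated head embedding and the tail embedding."""
    h = rotate_2d_blocks(head, rel_angles)
    h = mobius_add(h, rel_trans)
    return -poincare_distance(h, tail) ** 2

# Toy usage with random low-dimensional embeddings (dim = 4).
rng = np.random.default_rng(0)
head = rng.uniform(-0.1, 0.1, 4)
tail = rng.uniform(-0.1, 0.1, 4)
angles = rng.uniform(-np.pi, np.pi, 2)
trans = rng.uniform(-0.1, 0.1, 4)
print(score(head, angles, trans, tail))
```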

PDF · Abstract (ACL 2020)

Results from the Paper


| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Link Prediction | FB15k-237 | RefE | MRR | 0.351 | #30 |
| | | | Hits@10 | 0.541 | #23 |
| | | | Hits@3 | 0.390 | #20 |
| | | | Hits@1 | 0.256 | #29 |
| Link Prediction | WN18RR | RotH | MRR | 0.496 | #14 |
| | | | Hits@10 | 0.586 | #15 |
| | | | Hits@3 | 0.514 | #14 |
| | | | Hits@1 | 0.449 | #18 |
| Link Prediction | YAGO3-10 | RefE | MRR | 0.577 | #6 |
| | | | Hits@10 | 0.712 | #3 |
| | | | Hits@3 | 0.621 | #3 |
| | | | Hits@1 | 0.503 | #5 |
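
For reference, the ranking metrics reported above are computed from the rank the model assigns to the correct entity for each test triple. The short Python sketch below shows how MRR and Hits@k are typically aggregated from such ranks; the `ranks` list is toy data, not results from the paper.

```python
# Toy sketch of link-prediction ranking metrics: MRR and Hits@k
# aggregated over the ranks of the correct entities.
def mrr(ranks):
    return sum(1.0 / r for r in ranks) / len(ranks)

def hits_at_k(ranks, k):
    return sum(1 for r in ranks if r <= k) / len(ranks)

ranks = [1, 3, 2, 10, 1, 57, 4, 2]  # toy ranks of the correct entity
print(f"MRR     : {mrr(ranks):.3f}")
print(f"Hits@1  : {hits_at_k(ranks, 1):.3f}")
print(f"Hits@3  : {hits_at_k(ranks, 3):.3f}")
print(f"Hits@10 : {hits_at_k(ranks, 10):.3f}")
```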
