Orthogonal Relation Transforms with Graph Context Modeling for Knowledge Graph Embedding

Translational distance-based knowledge graph embedding has shown steady improvement on the link prediction task, from TransE to the state-of-the-art RotatE. However, N-1, 1-N and N-N predictions remain challenging. In this work, we propose a novel translational distance-based approach to knowledge graph link prediction. The proposed method is two-fold: first, we extend RotatE from the 2D complex domain to a high-dimensional space with orthogonal relation transforms, increasing modeling capacity; second, graph context is explicitly modeled via two directed context representations, which are used in the distance scoring function to measure the plausibility of triples during training and inference. The proposed approach improves prediction accuracy on the difficult N-1, 1-N and N-N cases of the knowledge graph link prediction task. Experimental results show that it outperforms the RotatE baseline on two benchmark data sets, especially on FB15k-237, which contains many nodes with high in-degree.
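To make the first idea concrete, the sketch below scores a triple (h, r, t) by applying block-wise orthogonal transforms to the head embedding and measuring the distance to the tail, generalizing RotatE's 2D rotations to higher-dimensional orthogonal blocks. This is an illustrative approximation, not the paper's exact parameterization: the block size, the QR-based orthogonalization (a convenient stand-in for Gram-Schmidt), and all function names are assumptions.

```python
import numpy as np

def orthogonalize(m):
    # Project an arbitrary square matrix onto the orthogonal group via QR
    # decomposition (a stand-in for the Gram-Schmidt step; illustrative only).
    q, r = np.linalg.qr(m)
    # Fix column signs so the factorization is deterministic.
    q = q * np.sign(np.diag(r))
    return q

def ote_score(head, rel_blocks, tail, block_size=4):
    """Distance score for a triple (h, r, t) under block-wise orthogonal
    relation transforms: each sub-block of the head embedding is rotated
    by its own orthogonal matrix, then compared to the tail sub-block.
    Higher (closer to 0) means a more plausible triple."""
    d = head.shape[0]
    s = 0.0
    for i in range(0, d, block_size):
        R = orthogonalize(rel_blocks[i // block_size])
        h_i, t_i = head[i:i + block_size], tail[i:i + block_size]
        s += np.sum((h_i @ R - t_i) ** 2)
    return -np.sqrt(s)
```

Because each block is orthogonal, the transform preserves the norm of every head sub-vector, the property that makes RotatE-style rotations well behaved; a tail that exactly equals the transformed head receives the maximal score of 0.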

ACL 2020

Results from the Paper


Task             Dataset    Model   Metric   Value  Global Rank
Link Prediction  FB15k-237  GC-OTE  MRR      0.361  # 21
Link Prediction  FB15k-237  GC-OTE  Hits@10  0.550  # 11
Link Prediction  FB15k-237  GC-OTE  Hits@3   0.396  # 16
Link Prediction  FB15k-237  GC-OTE  Hits@1   0.267  # 18
Link Prediction  FB15k-237  GC-OTE  MR       154    # 9
Link Prediction  WN18RR     GC-OTE  MRR      0.491  # 18
Link Prediction  WN18RR     GC-OTE  Hits@10  0.583  # 19
Link Prediction  WN18RR     GC-OTE  Hits@3   0.511  # 16
Link Prediction  WN18RR     GC-OTE  Hits@1   0.442  # 30
Link Prediction  WN18RR     GC-OTE  MR       2715   # 18
