MEIM: Multi-partition Embedding Interaction Beyond Block Term Format for Efficient and Expressive Link Prediction

30 Sep 2022 · Hung Nghiep Tran, Atsuhiro Takasu

Knowledge graph embedding aims to predict the missing relations between entities in knowledge graphs. Tensor-decomposition-based models, such as ComplEx, provide a good trade-off between efficiency and expressiveness, which is crucial given the large size of real-world knowledge graphs. The recent multi-partition embedding interaction (MEI) model subsumes these models by using the block term tensor format and provides a systematic solution for this trade-off. However, MEI has several drawbacks, some of which are inherited from the tensor-decomposition-based models it subsumes. In this paper, we address these drawbacks and introduce the Multi-partition Embedding Interaction iMproved beyond block term format (MEIM) model, with independent core tensors for ensemble effects and soft orthogonality for max-rank mapping, in addition to multi-partition embedding. MEIM improves expressiveness while remaining highly efficient, helping it outperform strong baselines and achieve state-of-the-art results on difficult link prediction benchmarks using fairly small embedding sizes. The source code is released at https://github.com/tranhungnghiep/MEIM-KGE.
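
To make the interaction concrete, below is a minimal PyTorch sketch of a multi-partition, Tucker-style embedding interaction with one independent core tensor per partition and a soft-orthogonality penalty on the relation-specific mapping matrices. The class name, shapes, hyperparameters, and the exact form of the penalty are illustrative assumptions, not the authors' implementation; see the released code for the real model.

```python
# Illustrative sketch only: multi-partition trilinear interaction with
# independent per-partition core tensors and a soft-orthogonality penalty.
import torch
import torch.nn as nn


class MultiPartitionInteraction(nn.Module):
    def __init__(self, n_entities, n_relations, n_partitions=3, partition_dim=100):
        super().__init__()
        self.K, self.C = n_partitions, partition_dim
        # Entity and relation embeddings are split into K partitions of size C.
        self.ent = nn.Embedding(n_entities, self.K * self.C)
        self.rel = nn.Embedding(n_relations, self.K * self.C)
        # One independent core tensor per partition (ensemble-like effect),
        # instead of a single shared core.
        self.core = nn.Parameter(torch.randn(self.K, self.C, self.C, self.C) * 1e-2)

    def score(self, h_idx, r_idx, t_idx):
        B = h_idx.size(0)
        h = self.ent(h_idx).view(B, self.K, self.C)
        r = self.rel(r_idx).view(B, self.K, self.C)
        t = self.ent(t_idx).view(B, self.K, self.C)
        # Tucker-style trilinear interaction within each partition,
        # then sum the per-partition scores.
        part_scores = torch.einsum('kijl,bki,bkj,bkl->bk', self.core, h, r, t)
        return part_scores.sum(dim=-1)

    def soft_orthogonality(self, r_idx):
        # Penalize deviation of the relation-specific mapping matrices from
        # orthogonality, pushing them toward max rank (one plausible form).
        B = r_idx.size(0)
        r = self.rel(r_idx).view(B, self.K, self.C)
        # Mapping matrix per partition: core contracted with the relation
        # vector, shape (B, K, C, C).
        M = torch.einsum('kijl,bkj->bkil', self.core, r)
        gram = torch.einsum('bkil,bkml->bkim', M, M)  # M @ M^T per partition
        eye = torch.eye(self.C, device=gram.device)
        return ((gram - eye) ** 2).sum(dim=(-1, -2)).mean()


# Toy usage: score a small batch of (head, relation, tail) triples and
# compute the regularizer to be added to the training loss.
model = MultiPartitionInteraction(n_entities=1000, n_relations=50)
h, r, t = torch.tensor([0, 5]), torch.tensor([1, 3]), torch.tensor([2, 7])
scores = model.score(h, r, t)            # shape (2,)
penalty = model.soft_orthogonality(r)    # scalar regularization term
```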


Results from the Paper


Task             Dataset     Model   Metric    Metric Value   Global Rank
Link Prediction  FB15k-237   MEIM    MRR       0.369          #11
Link Prediction  FB15k-237   MEIM    Hits@1    0.274          #10
Link Prediction  FB15k-237   MEIM    Hits@3    0.406          #7
Link Prediction  FB15k-237   MEIM    Hits@10   0.557          #8
Link Prediction  WN18RR      MEIM    MRR       0.499          #9
Link Prediction  WN18RR      MEIM    Hits@1    0.458          #9
Link Prediction  WN18RR      MEIM    Hits@3    0.518          #10
Link Prediction  WN18RR      MEIM    Hits@10   0.577          #28
Link Prediction  YAGO3-10    MEIM    MRR       0.585          #1
Link Prediction  YAGO3-10    MEIM    Hits@1    0.514          #1
Link Prediction  YAGO3-10    MEIM    Hits@3    0.625          #1
Link Prediction  YAGO3-10    MEIM    Hits@10   0.716          #1
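
MRR and Hits@k in the table are the standard rank-based link prediction metrics. The small sketch below shows one common way to compute them from the rank of the correct entity among all candidates; it is illustrative only and omits the filtered-ranking protocol typically used in evaluation.

```python
# Toy computation of MRR and Hits@k from ranks of the correct entity.
import numpy as np

def rank_metrics(ranks, ks=(1, 3, 10)):
    ranks = np.asarray(ranks, dtype=float)
    out = {'MRR': float((1.0 / ranks).mean())}   # mean reciprocal rank
    for k in ks:
        out[f'Hits@{k}'] = float((ranks <= k).mean())  # fraction ranked in top k
    return out

print(rank_metrics([1, 3, 2, 12]))
# {'MRR': 0.479..., 'Hits@1': 0.25, 'Hits@3': 0.75, 'Hits@10': 0.75}
```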
