Augmenting and Tuning Knowledge Graph Embeddings

1 Jul 2019  ·  Robert Bamler, Farnood Salehi, Stephan Mandt

Knowledge graph embeddings rank among the most successful methods for link prediction in knowledge graphs, i.e., the task of completing an incomplete collection of relational facts. A downside of these models is their strong sensitivity to model hyperparameters, in particular regularizers, which have to be extensively tuned to reach good performance [Kadlec et al., 2017]. We propose an efficient method for large-scale hyperparameter tuning by interpreting these models in a probabilistic framework. After a model augmentation that introduces per-entity hyperparameters, we use a variational expectation-maximization approach to tune thousands of such hyperparameters with minimal additional cost. Our approach is agnostic to details of the model and results in a new state of the art in link prediction on standard benchmark data.
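
To make the idea concrete, the sketch below is a minimal, illustrative implementation of the general recipe described in the abstract, not the authors' code: DistMult embeddings trained with a logistic loss, a Gaussian prior with a per-entity precision (regularization strength) lambda_i, and an EM-style loop that alternates gradient steps on the embeddings (standing in for the E-step; the paper uses a full variational distribution) with closed-form updates of the per-entity hyperparameters (M-step). Relation embeddings are held fixed for brevity; all names and constants here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
num_entities, num_relations, dim = 100, 10, 16

# Toy set of (head, relation, tail) training triples.
triples = rng.integers(0, [num_entities, num_relations, num_entities], size=(500, 3))

E = 0.1 * rng.standard_normal((num_entities, dim))    # entity embeddings
R = 0.1 * rng.standard_normal((num_relations, dim))   # relation embeddings (fixed in this sketch)
lam = np.ones(num_entities)                           # per-entity precision (the augmented hyperparameters)

def distmult_score(h, r, t):
    """DistMult score: <e_h, w_r, e_t> = sum_k e_h[k] * w_r[k] * e_t[k]."""
    return np.sum(E[h] * R[r] * E[t], axis=-1)

lr = 0.05
for outer in range(20):
    # Approximate E-step: gradient steps on the penalized log-likelihood
    # (logistic loss on observed triples vs. triples with corrupted tails).
    for _ in range(10):
        h, r, t = triples[:, 0], triples[:, 1], triples[:, 2]
        t_neg = rng.integers(0, num_entities, size=len(triples))
        pos, neg = distmult_score(h, r, t), distmult_score(h, r, t_neg)
        g_pos = -1.0 / (1.0 + np.exp(pos))   # d/ds of -log sigmoid(s)
        g_neg = 1.0 / (1.0 + np.exp(-neg))   # d/ds of -log sigmoid(-s)
        grad_E = np.zeros_like(E)
        np.add.at(grad_E, h, g_pos[:, None] * R[r] * E[t])
        np.add.at(grad_E, t, g_pos[:, None] * R[r] * E[h])
        np.add.at(grad_E, h, g_neg[:, None] * R[r] * E[t_neg])
        np.add.at(grad_E, t_neg, g_neg[:, None] * R[r] * E[h])
        grad_E += lam[:, None] * E           # gradient of the per-entity Gaussian prior
        E -= lr * grad_E

    # M-step: closed-form update of each per-entity precision for a Gaussian
    # prior, lambda_i = dim / E[||e_i||^2]; the point estimate stands in for
    # the variational expectation used in the paper.
    lam = dim / (np.sum(E ** 2, axis=1) + 1e-8)
```

The key point the sketch illustrates is that, once each entity has its own regularization hyperparameter, grid search over thousands of values is hopeless, whereas the M-step update above tunes all of them jointly at negligible extra cost per epoch.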


Results from the Paper


| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|------|---------|-------|-------------|--------------|-------------|
| Link Prediction | FB15k | DistMult (after variational EM) | MRR | 0.841 | # 5 |
| Link Prediction | FB15k | DistMult (after variational EM) | Hits@10 | 0.914 | # 3 |
| Link Prediction | FB15k-237 | DistMult (after variational EM) | MRR | 0.357 | # 24 |
| Link Prediction | FB15k-237 | DistMult (after variational EM) | Hits@10 | 0.548 | # 17 |
| Link Prediction | WN18 | DistMult (after variational EM) | MRR | 0.911 | # 26 |
| Link Prediction | WN18RR | DistMult (after variational EM) | MRR | 0.455 | # 55 |
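
For reference, the two metrics in the table are the standard link-prediction ranking metrics: for each test triple the true tail entity is ranked among all candidate entities by the model's score, MRR is the mean reciprocal rank, and Hits@10 is the fraction of test triples whose true entity ranks in the top 10. The snippet below is a minimal illustration of that computation (filtered evaluation, which removes other known true triples from the candidate set, is omitted for brevity; the scores are random placeholders).

```python
import numpy as np

def mrr_and_hits_at_10(scores, true_tails):
    """scores: (num_test, num_entities) model scores for every candidate tail;
    true_tails: (num_test,) index of the correct tail entity."""
    # Rank of the true tail = 1 + number of candidates scored strictly higher.
    true_scores = scores[np.arange(len(true_tails)), true_tails]
    ranks = 1 + np.sum(scores > true_scores[:, None], axis=1)
    return np.mean(1.0 / ranks), np.mean(ranks <= 10)

# Example with random scores for 5 test triples over 50 candidate entities.
rng = np.random.default_rng(0)
scores = rng.standard_normal((5, 50))
mrr, hits10 = mrr_and_hits_at_10(scores, true_tails=np.array([3, 7, 0, 42, 19]))
print(f"MRR = {mrr:.3f}, Hits@10 = {hits10:.3f}")
```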

Methods


No methods listed for this paper.