DRUM: End-To-End Differentiable Rule Mining On Knowledge Graphs

In this paper, we study the problem of learning probabilistic logical rules for inductive and interpretable link prediction. Despite the importance of inductive link prediction, most previous work has focused on transductive link prediction and cannot handle previously unseen entities. Moreover, these models are black boxes that are not easily explainable to humans. We propose DRUM, a scalable and differentiable approach for mining first-order logical rules from knowledge graphs that resolves these problems. We motivate our method by establishing a connection between learning confidence scores for each rule and low-rank tensor approximation. DRUM uses bidirectional RNNs to share useful information across the tasks of learning rules for different relations. We also empirically demonstrate the efficiency of DRUM over existing rule mining methods for inductive link prediction on a variety of benchmark datasets.
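To make the differentiable rule mining idea concrete, the following is a minimal NumPy sketch of the general approach this family of methods builds on (TensorLog-style reasoning, as in Neural-LP and DRUM): a chain rule over relations is scored by multiplying relation adjacency matrices, each step weighted by learned confidence scores. The toy graph, relation names, and confidence values here are illustrative assumptions, not the paper's actual formulation or data.

```python
import numpy as np

# Toy knowledge graph with 4 entities and 2 relations, as adjacency matrices.
# A[r][i, j] = 1 iff relation r holds from entity i to entity j.
A = {
    "brother_of": np.array([[0, 1, 0, 0],
                            [0, 0, 0, 0],
                            [0, 0, 0, 0],
                            [0, 0, 0, 0]], dtype=float),
    "father_of":  np.array([[0, 0, 0, 0],
                            [0, 0, 1, 0],
                            [0, 0, 0, 0],
                            [0, 0, 0, 0]], dtype=float),
}
N_ENTITIES = 4

def rule_scores(confidences):
    """Score all (head, tail) entity pairs under a soft chain rule.

    confidences[t][r] is the (learned) weight of relation r at step t.
    Each step forms a confidence-weighted mixture of the relation
    adjacency matrices, and steps are chained by matrix multiplication,
    so the result is differentiable in the confidence weights.
    """
    M = np.eye(N_ENTITIES)
    for step_conf in confidences:
        step = sum(step_conf[r] * A[r] for r in A)
        M = M @ step
    return M

# A hypothetical learned length-2 rule:
#   uncle_of(x, z) <- brother_of(x, y) AND father_of(y, z)
conf = [{"brother_of": 1.0, "father_of": 0.0},
        {"brother_of": 0.0, "father_of": 1.0}]
scores = rule_scores(conf)
print(scores[0, 2])  # 1.0 — entity 0 predicted as "uncle" of entity 2
```

In DRUM, a rank-L sum of such weighted chains approximates the full confidence tensor, and the per-step weights are produced by bidirectional RNNs shared across head relations rather than learned independently.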

Published at NeurIPS 2019.

Results from the Paper

Task: Link Prediction

Dataset     Model        Metric    Value   Global Rank
FB15k-237   DRUM (T=3)   MRR       0.343   #40
FB15k-237   DRUM (T=3)   Hits@10   0.516   #43
FB15k-237   DRUM (T=3)   Hits@3    0.378   #31
FB15k-237   DRUM (T=3)   Hits@1    0.255   #31
WN18RR      DRUM (T=3)   MRR       0.486   #28
WN18RR      DRUM (T=3)   Hits@10   0.586   #15
WN18RR      DRUM (T=3)   Hits@3    0.513   #15
WN18RR      DRUM (T=3)   Hits@1    0.425   #44
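The MRR and Hits@k figures above are standard link prediction metrics computed from the rank of the correct entity among all candidates. A small self-contained sketch of how they are computed (the example ranks are made up for illustration, not taken from the paper):

```python
def mrr(ranks):
    """Mean reciprocal rank over 1-based ranks of the true entity."""
    return sum(1.0 / r for r in ranks) / len(ranks)

def hits_at_k(ranks, k):
    """Fraction of queries whose true entity is ranked in the top k."""
    return sum(1 for r in ranks if r <= k) / len(ranks)

# Hypothetical ranks of the correct tail entity for five test queries.
ranks = [1, 2, 4, 10, 50]
print(round(mrr(ranks), 3))   # 0.374
print(hits_at_k(ranks, 10))   # 0.8
```

Higher is better for both metrics; MRR weights top ranks heavily, while Hits@k only checks membership in the top k.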