Decompressing Knowledge Graph Representations for Link Prediction

11 Nov 2019 · Xiang Kong, Xianyang Chen, Eduard Hovy

This paper studies the problem of predicting missing relationships between entities in knowledge graphs by learning entity and relation representations. Most existing link prediction models employ simple but intuitive scoring functions and relatively small embedding sizes so that they can be applied to large-scale knowledge graphs; however, these properties also restrict their ability to learn expressive and robust features. Therefore, diverging from most prior work, which focuses on designing new objective functions, we propose DeCom, a simple but effective mechanism that boosts the performance of existing link predictors such as DistMult and ComplEx by extracting more expressive features while adding only a few extra parameters, which guards against overfitting. Specifically, entity and relation embeddings are first decompressed into a more expressive and robust space by decompressing functions, and knowledge graph embedding models are then trained in this new feature space. Experimental results on several benchmark knowledge graphs and advanced link prediction systems demonstrate the generality and effectiveness of our method. In particular, RESCAL + DeCom achieves state-of-the-art performance on the FB15k-237 benchmark across all evaluation metrics. In addition, we show that, compared with DeCom, explicitly increasing the embedding size significantly increases the number of parameters but does not yield a comparable performance improvement.
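To make the mechanism concrete, below is a minimal sketch of the DeCom idea on top of a DistMult scorer in PyTorch. The abstract does not specify the exact form of the decompressing functions, so the linear decompressors, the class name `DeComDistMult`, and the dimensions `d_in`/`d_out` are illustrative assumptions, not the paper's implementation:

```python
import torch
import torch.nn as nn

class DeComDistMult(nn.Module):
    """Hypothetical sketch of DeCom applied to DistMult.

    Entities and relations are stored as compact d_in-dimensional
    embeddings, then "decompressed" into a larger d_out-dimensional
    feature space by shared functions before the usual DistMult score
    is computed. The linear decompressors here are an assumption; the
    abstract does not fix their exact form.
    """

    def __init__(self, n_entities, n_relations, d_in=100, d_out=400):
        super().__init__()
        self.ent = nn.Embedding(n_entities, d_in)
        self.rel = nn.Embedding(n_relations, d_in)
        # Shared decompressing functions: only d_in x d_out extra weights
        # each, far fewer than storing full d_out embeddings per entity.
        self.decompress_ent = nn.Linear(d_in, d_out)
        self.decompress_rel = nn.Linear(d_in, d_out)

    def score(self, head, rel, tail):
        # Decompress compact embeddings into the expressive space,
        # then apply the standard DistMult trilinear score.
        h = self.decompress_ent(self.ent(head))
        r = self.decompress_rel(self.rel(rel))
        t = self.decompress_ent(self.ent(tail))
        return (h * r * t).sum(dim=-1)
```

Under this reading, the parameter accounting matches the abstract's final claim: with, say, 15k entities, storing 400-dimensional embeddings directly costs about 6M parameters, whereas storing 100-dimensional embeddings plus a shared 100x400 decompressor costs about 1.5M + 40k.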


Results from the Paper


Task             Dataset    Model           Metric Name  Metric Value  Global Rank
Link Prediction  FB15k-237  RESCAL + DeCom  MRR          0.354         #27
Link Prediction  FB15k-237  RESCAL + DeCom  Hits@10      0.536         #30
Link Prediction  FB15k-237  RESCAL + DeCom  Hits@3       0.388         #23
Link Prediction  FB15k-237  RESCAL + DeCom  Hits@1       0.261         #26
Link Prediction  WN18RR     RESCAL + DeCom  MRR          0.457         #53
Link Prediction  WN18RR     RESCAL + DeCom  Hits@10      0.515         #64
Link Prediction  WN18RR     RESCAL + DeCom  Hits@3       0.469         #43
Link Prediction  WN18RR     RESCAL + DeCom  Hits@1       0.427         #43
