Joint Language Semantic and Structure Embedding for Knowledge Graph Completion

The task of completing knowledge triplets has broad downstream applications. Both structural and semantic information play an important role in knowledge graph completion. Unlike previous approaches that rely on either the structures or the semantics of knowledge graphs, we propose to jointly embed the semantics in the natural-language descriptions of knowledge triplets together with their structure information. Our method embeds knowledge graphs for the completion task by fine-tuning pre-trained language models with respect to a probabilistic structured loss, where the forward pass of the language model captures semantics and the loss reconstructs structure. Extensive experiments on a variety of knowledge graph benchmarks demonstrate the state-of-the-art performance of our method. We also show that our method significantly improves performance in the low-resource regime, thanks to its better use of semantics. The code and datasets are available at

COLING 2022

Results from the Paper

Task             Dataset     Model   Metric    Value   Global Rank
Link Prediction  FB15k-237   LASS    Hits@10   0.533   #35
Link Prediction  FB15k-237   LASS    MR        108     #3
Link Prediction  UMLS        LASS    Hits@10   0.994   #3
Link Prediction  UMLS        LASS    MR        1.39    #3
Link Prediction  WN18RR      LASS    Hits@10   0.786   #4
Link Prediction  WN18RR      LASS    MR        35      #1
