KEPLER: A Unified Model for Knowledge Embedding and Pre-trained Language Representation

13 Nov 2019 · Xiaozhi Wang, Tianyu Gao, Zhaocheng Zhu, Zhengyan Zhang, Zhiyuan Liu, Juanzi Li, Jian Tang

Pre-trained language representation models (PLMs) do not capture factual knowledge from text well. In contrast, knowledge embedding (KE) methods can effectively represent the relational facts in knowledge graphs (KGs) with informative entity embeddings, but conventional KE models cannot take full advantage of the abundant textual information. In this paper, we propose a unified model for Knowledge Embedding and Pre-trained LanguagE Representation (KEPLER), which can not only better integrate factual knowledge into PLMs but also produce effective text-enhanced KE using strong PLMs. In KEPLER, we encode the textual descriptions of entities with a PLM to obtain their embeddings, and then jointly optimize the KE and language modeling objectives. Experimental results show that KEPLER achieves state-of-the-art performance on various NLP tasks, and also works remarkably well as an inductive KE model on KG link prediction. Furthermore, for pre-training and evaluating KEPLER, we construct Wikidata5M, a large-scale KG dataset with aligned entity descriptions, and benchmark state-of-the-art KE methods on it. It can serve as a new KE benchmark and facilitate research on large KGs, inductive KE, and KGs with text. The source code can be obtained from https://github.com/THU-KEG/KEPLER.
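The joint training described above (entity embeddings come from a text encoder run over entity descriptions; a KE objective and a masked-language-modeling objective are optimized together) can be sketched roughly as follows. This is a minimal illustrative sketch, not the official implementation: the toy encoder, mean pooling, the TransE-style margin loss, and names such as `ToyEncoder` and `hidden_dim` are assumptions for illustration; the released code builds on a pre-trained RoBERTa encoder and differs in its exact losses.

```python
# Minimal sketch of a KEPLER-style joint objective (illustrative only):
# entity embeddings are produced by encoding entity descriptions, a TransE-style
# margin loss scores (head, relation, tail) triples, and an MLM term is added.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyEncoder(nn.Module):
    """Stand-in for a pre-trained LM: embeds tokens and mean-pools them."""
    def __init__(self, vocab_size=1000, hidden_dim=128):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, hidden_dim)
        self.lm_head = nn.Linear(hidden_dim, vocab_size)  # for the MLM objective

    def encode(self, token_ids):       # (batch, seq_len) -> (batch, hidden)
        return self.tok(token_ids).mean(dim=1)

    def mlm_logits(self, token_ids):   # (batch, seq_len) -> (batch, seq_len, vocab)
        return self.lm_head(self.tok(token_ids))

def ke_loss(encoder, head_desc, tail_desc, rel_emb, neg_tail_desc, margin=1.0):
    """TransE-style margin loss with entity embeddings taken from descriptions."""
    h = encoder.encode(head_desc)
    t = encoder.encode(tail_desc)
    t_neg = encoder.encode(neg_tail_desc)
    pos = (h + rel_emb - t).norm(p=1, dim=-1)
    neg = (h + rel_emb - t_neg).norm(p=1, dim=-1)
    return F.relu(margin + pos - neg).mean()

def mlm_loss(encoder, masked_ids, labels):
    """Standard masked-LM cross-entropy; unlabeled positions carry -100."""
    logits = encoder.mlm_logits(masked_ids)
    return F.cross_entropy(logits.view(-1, logits.size(-1)), labels.view(-1),
                           ignore_index=-100)

if __name__ == "__main__":
    enc = ToyEncoder()
    rel_emb = nn.Parameter(torch.randn(128))   # one relation embedding
    head = torch.randint(0, 1000, (4, 16))     # token ids of entity descriptions
    tail = torch.randint(0, 1000, (4, 16))
    neg_tail = torch.randint(0, 1000, (4, 16))
    masked = torch.randint(0, 1000, (4, 16))
    labels = torch.full((4, 16), -100)
    labels[:, 3] = masked[:, 3]                # pretend only position 3 was masked
    loss = ke_loss(enc, head, tail, rel_emb, neg_tail) + mlm_loss(enc, masked, labels)
    loss.backward()                            # both objectives update the shared encoder
```

Because the entity embeddings are computed from descriptions rather than looked up in a fixed table, the same encoder can embed entities unseen during training, which is what enables the inductive link-prediction setting reported below.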

| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Relation Classification | TACRED | KEPLER | F1 | 71.7 | #11 |
| Relation Extraction | TACRED | KEPLER | F1 | 71.7 | #15 |
| Link Prediction | Wikidata5M | DistMult | MRR | 0.253 | #11 |
| Link Prediction | Wikidata5M | DistMult | Hits@1 | 0.208 | #11 |
| Link Prediction | Wikidata5M | DistMult | Hits@3 | 0.278 | #11 |
| Link Prediction | Wikidata5M | DistMult | Hits@10 | 0.334 | #12 |
| Link Prediction | Wikidata5M | SimplE | MRR | 0.296 | #8 |
| Link Prediction | Wikidata5M | SimplE | Hits@1 | 0.252 | #8 |
| Link Prediction | Wikidata5M | SimplE | Hits@3 | 0.317 | #8 |
| Link Prediction | Wikidata5M | SimplE | Hits@10 | 0.377 | #9 |
| Link Prediction | Wikidata5M | ComplEx | MRR | 0.281 | #10 |
| Link Prediction | Wikidata5M | ComplEx | Hits@1 | 0.228 | #10 |
| Link Prediction | Wikidata5M | ComplEx | Hits@3 | 0.310 | #10 |
| Link Prediction | Wikidata5M | ComplEx | Hits@10 | 0.373 | #10 |
| Link Prediction | Wikidata5M | RotatE | MRR | 0.29 | #9 |
| Link Prediction | Wikidata5M | RotatE | Hits@1 | 0.234 | #9 |
| Link Prediction | Wikidata5M | RotatE | Hits@3 | 0.322 | #6 |
| Link Prediction | Wikidata5M | RotatE | Hits@10 | 0.39 | #8 |
| Link Prediction | Wikidata5M | TransE | MRR | 0.253 | #11 |
| Link Prediction | Wikidata5M | TransE | Hits@1 | 0.17 | #13 |
| Link Prediction | Wikidata5M | TransE | Hits@3 | 0.311 | #9 |
| Link Prediction | Wikidata5M | TransE | Hits@10 | 0.392 | #7 |
| Link Prediction | Wikidata5M | KEPLER-Wiki-rel | MRR | 0.210 | #13 |
| Link Prediction | Wikidata5M | KEPLER-Wiki-rel | Hits@1 | 0.173 | #12 |
| Link Prediction | Wikidata5M | KEPLER-Wiki-rel | Hits@3 | 0.224 | #12 |
| Link Prediction | Wikidata5M | KEPLER-Wiki-rel | Hits@10 | 0.277 | #13 |
| Inductive Knowledge Graph Completion | Wikidata5m-ind | KEPLER-Wiki-rel | MRR | 0.402 | #2 |
| Inductive Knowledge Graph Completion | Wikidata5m-ind | KEPLER-Wiki-rel | Hits@1 | 0.222 | #2 |
| Inductive Knowledge Graph Completion | Wikidata5m-ind | KEPLER-Wiki-rel | Hits@3 | 0.514 | #2 |
| Inductive Knowledge Graph Completion | Wikidata5m-ind | KEPLER-Wiki-rel | Hits@10 | 0.73 | #2 |
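The link-prediction results above are reported with the standard MRR and Hits@k ranking metrics. As a quick reference, here is a minimal sketch of how they are computed once the rank of each gold entity among all candidates is known (the `ranks` values below are illustrative, not taken from the paper):

```python
# MRR averages the reciprocal ranks of the gold entities; Hits@k is the fraction
# of test triples whose gold entity is ranked within the top k (rank 1 = best).
def mrr(ranks):
    return sum(1.0 / r for r in ranks) / len(ranks)

def hits_at_k(ranks, k):
    return sum(1 for r in ranks if r <= k) / len(ranks)

ranks = [1, 4, 2, 15, 3]            # illustrative ranks only
print(mrr(ranks))                   # 0.43
print(hits_at_k(ranks, 10))         # 0.8
```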
