Pretrain-KGEs: Learning Knowledge Representation from Pretrained Models for Knowledge Graph Embeddings

1 Dec 2019  ·  Zhiyuan Zhang, Xiaoqian Liu, Yi Zhang, Qi Su, Xu Sun, Bin He

Learning knowledge graph embeddings (KGEs) is an efficient approach to knowledge graph completion. Conventional KGE methods often suffer from limited knowledge representation, which leads to lower accuracy, especially when training on sparse knowledge graphs. To remedy this, we present Pretrain-KGEs, a training framework for learning more knowledgeable entity and relation embeddings by leveraging the abundant linguistic knowledge in pretrained language models. Specifically, we propose a unified approach in which we first learn entity and relation representations via pretrained language models and then use these representations to initialize the entity and relation embeddings for training KGE models. Our method is model-agnostic in the sense that it can be applied to any variant of KGE model. Experimental results show that our method consistently improves results and achieves state-of-the-art performance with different KGE models such as TransE and QuatE, across four benchmark KG datasets on link prediction and triplet classification tasks.
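
As a rough illustration of the initialization idea (not the authors' released implementation), the sketch below encodes entity and relation names with BERT via the Hugging Face transformers library and uses the pooled vectors to initialize the embedding tables of a TransE scorer. The helper names (`encode_texts`, `TransE`), the use of surface names as input text, and the choice of mean pooling are illustrative assumptions.

```python
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
encoder = BertModel.from_pretrained("bert-base-uncased")

def encode_texts(texts):
    """Mean-pool BERT token states into one fixed-size vector per input string."""
    with torch.no_grad():
        batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
        hidden = encoder(**batch).last_hidden_state            # (B, T, 768)
        mask = batch["attention_mask"].unsqueeze(-1).float()   # (B, T, 1)
        return (hidden * mask).sum(dim=1) / mask.sum(dim=1)    # (B, 768)

class TransE(nn.Module):
    """TransE scorer whose embedding tables start from pretrained-LM vectors."""

    def __init__(self, entity_texts, relation_texts):
        super().__init__()
        # Initialize from BERT representations instead of random vectors; the tables
        # remain trainable (freeze=False) and are refined with the usual KGE loss.
        self.ent = nn.Embedding.from_pretrained(encode_texts(entity_texts), freeze=False)
        self.rel = nn.Embedding.from_pretrained(encode_texts(relation_texts), freeze=False)

    def score(self, h, r, t):
        # TransE plausibility: a smaller ||h + r - t|| means a more plausible triple.
        return -(self.ent(h) + self.rel(r) - self.ent(t)).norm(p=1, dim=-1)

# Toy usage: score the triple (Barack Obama, born in, Hawaii).
model = TransE(["Barack Obama", "Hawaii"], ["born in"])
print(model.score(torch.tensor([0]), torch.tensor([0]), torch.tensor([1])))
```

Because the approach only changes how the embedding tables are initialized, the same recipe applies unchanged to other KGE scorers such as QuatE.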
