Quantum and Translation Embedding for Knowledge Graph Completion

1 Jan 2021  ·  Panfeng Chen, Yisong Wang, Renyan Feng, Xiaomin Yu, Quan Yu ·

Knowledge Graph Completion (KGC) mainly aims to predict missing links for entity pairs in a Knowledge Graph (KG) based on known facts. In this work, we present a novel model for this task, in which quantum embedding and translation embedding serve as components that capture logical and structural features, respectively, in the same vector subspace. The two components are synergistic and achieve impressive performance at a low cost close to that of the efficient model TransE. Surprisingly, Hits@1 on the challenging datasets FB15k-237 and WN18RR reaches 94.89% and 92.79%, respectively, while the embedding dimension during training is only 4. The insight of this work motivates dense feature model design for KGC as a new alternative to Deep Neural Networks (DNNs) on this task, or even a better choice.
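The abstract does not give the model's exact scoring function, but the translation-embedding component it builds on follows the TransE idea: a relation is a vector translation between head and tail entity embeddings, scored by a distance. A minimal sketch of that component alone, with illustrative entity/relation names, random vectors, and the paper's reported embedding dimension of 4 (the quantum component is not reproduced here):

```python
import numpy as np

# Illustrative sketch of TransE-style translation scoring only; the entity
# and relation names and the random embeddings below are made up for the demo.
rng = np.random.default_rng(0)
dim = 4  # the paper reports training with embedding dimension 4

entities = {"Paris": rng.normal(size=dim), "France": rng.normal(size=dim)}
relations = {"capital_of": rng.normal(size=dim)}

def transe_score(head, rel, tail):
    """Plausibility of the triple (head, rel, tail): higher (less negative)
    means head + rel is closer to tail in the embedding space."""
    return -np.linalg.norm(entities[head] + relations[rel] - entities[tail])

score = transe_score("Paris", "capital_of", "France")
print(score)
```

In link prediction, such a score is computed for every candidate tail entity and the candidates are ranked; metrics like Hits@1 then measure how often the true entity ranks first.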

