STaR: Knowledge Graph Embedding by Scaling, Translation and Rotation

15 Feb 2022 · Jiayi Li, Yujiu Yang

Bilinear methods are mainstream in Knowledge Graph Embedding (KGE); they learn low-dimensional representations for the entities and relations in a Knowledge Graph (KG) in order to complete missing links. Most existing works approach this task by identifying patterns among relations and modeling them effectively. Previous works have identified six important patterns, such as non-commutativity. Although some bilinear methods succeed in modeling these patterns, they neglect to handle 1-to-N, N-to-1, and N-to-N relations (i.e., complex relations) at the same time, which limits their expressiveness. To this end, we integrate scaling, which addresses complex relations, with the combination of translation and rotation, which addresses the relation patterns; here scaling can be viewed as a simplification of projection. We therefore propose the corresponding bilinear model Scaling Translation and Rotation (STaR), consisting of these two parts. Since translation cannot be incorporated into a bilinear model directly, we introduce a translation matrix as its equivalent. Theoretical analysis proves that STaR can model all the patterns and handle complex relations simultaneously, and experiments demonstrate its effectiveness on commonly used benchmarks for link prediction.
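
To illustrate how scaling, rotation, and translation can be combined in a single link-prediction scoring function, here is a minimal NumPy sketch. The function name `star_like_score`, the per-pair 2-D parameterization, and the inner-product scoring are assumptions made for illustration only; they do not reproduce the paper's exact bilinear formulation, in which translation is carried by an equivalent translation matrix.

```python
import numpy as np

def star_like_score(head, tail, scale, angle, trans):
    """Illustrative score combining scaling, rotation, and translation
    (a sketch under assumed parameterization, not the paper's exact model).

    head, tail : (2k,) entity embeddings, viewed as k two-dimensional pairs
    scale      : (k,)  per-pair scaling factors
    angle      : (k,)  per-pair rotation angles in radians
    trans      : (2k,) translation vector
    """
    h = head.reshape(-1, 2)                      # k coordinate pairs
    cos, sin = np.cos(angle), np.sin(angle)
    # rotate each 2-D pair, then scale it
    rot_x = cos * h[:, 0] - sin * h[:, 1]
    rot_y = sin * h[:, 0] + cos * h[:, 1]
    transformed = scale[:, None] * np.stack([rot_x, rot_y], axis=1)
    # add the translation and score against the tail with an inner product
    return float((transformed.reshape(-1) + trans) @ tail)

# Toy usage with random embeddings (dimension 2k = 6)
rng = np.random.default_rng(0)
k = 3
score = star_like_score(
    rng.normal(size=2 * k), rng.normal(size=2 * k),
    rng.normal(size=k), rng.uniform(0, 2 * np.pi, size=k),
    rng.normal(size=2 * k),
)
print(score)
```

In this sketch, scaling changes the norm of each coordinate pair (giving room to handle 1-to-N, N-to-1, and N-to-N relations), while rotation and translation act as the pattern-modeling components; higher scores would indicate more plausible (head, relation, tail) triples.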
