LineaRE: Simple but Powerful Knowledge Graph Embedding for Link Prediction

The task of link prediction for knowledge graphs is to predict missing relationships between entities. Knowledge graph embedding, which aims to represent the entities and relations of a knowledge graph as low-dimensional vectors in a continuous vector space, has achieved promising predictive performance. If an embedding model can cover as many types of connectivity patterns and mapping properties of relations as possible, it will potentially bring more benefits to link prediction tasks. In this paper, we propose a novel embedding model, namely LineaRE, which is capable of modeling four connectivity patterns (i.e., symmetry, antisymmetry, inversion, and composition) and four mapping properties of relations (i.e., one-to-one, one-to-many, many-to-one, and many-to-many). Specifically, we regard knowledge graph embedding as a simple linear regression task, where a relation is modeled as a linear function of the low-dimensional vectors of its two entities, parameterized by two weight vectors and a bias vector. Since the vectors are defined in a real-number space and the scoring function of the model is linear, our model is simple and scalable to large knowledge graphs. Experimental results on multiple widely used real-world datasets show that the proposed LineaRE model significantly outperforms existing state-of-the-art models on link prediction tasks.
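The abstract does not spell out the scoring function, so the snippet below is only a minimal sketch of what a linear-regression-style score over real-valued embeddings could look like, assuming the relation's two weight vectors act element-wise on the head and tail embeddings and the residual is measured with an L1 norm. The function name `linear_score` and all variable names are illustrative, not taken from the paper.

```python
import numpy as np

def linear_score(h, t, w1, w2, b, p=1):
    """Score a triple (head, relation, tail) under a linear-regression-style embedding.

    h, t   : real-valued entity embedding vectors of dimension d
    w1, w2 : relation-specific weight vectors (applied element-wise), dimension d
    b      : relation-specific bias vector, dimension d
    p      : norm used to measure the residual (L1 assumed here)

    A triple is treated as plausible when w1 * h + b is close to w2 * t,
    so a lower score indicates a more likely link.
    """
    residual = w1 * h + b - w2 * t  # element-wise linear regression residual
    return np.linalg.norm(residual, ord=p)

# Toy usage with random d-dimensional embeddings (illustrative only)
d = 8
rng = np.random.default_rng(0)
h, t = rng.normal(size=d), rng.normal(size=d)
w1, w2, b = rng.normal(size=d), rng.normal(size=d), rng.normal(size=d)
print(linear_score(h, t, w1, w2, b))
```

Because the residual is a linear expression in the embeddings, scoring a candidate triple costs only a handful of element-wise operations per dimension, which is what makes such a model scalable to large knowledge graphs.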
