Knowledge base completion is the task of automatically inferring missing facts by reasoning over the information already present in the knowledge base. A knowledge base is a collection of relational facts, often represented as (subject, relation, object) triples.
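The triple representation above can be sketched as a plain set of tuples; the facts and the `objects` helper below are illustrative placeholders, not part of any cited system.

```python
# A minimal sketch of a knowledge base as a set of
# (subject, relation, object) triples; facts are illustrative.
kb = {
    ("Paris", "capital_of", "France"),
    ("France", "located_in", "Europe"),
    ("Berlin", "capital_of", "Germany"),
}

def objects(subject, relation, triples):
    """Return all objects linked to `subject` via `relation`."""
    return {o for (s, r, o) in triples if s == subject and r == relation}

print(objects("Paris", "capital_of", kb))  # {'France'}
```

Completion then amounts to scoring candidate triples absent from this set, e.g. ("Berlin", "located_in", "Europe").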
We demonstrate the effectiveness of R-GCNs as a stand-alone model for entity classification.
SOTA for Node Classification on AIFB
This paper tackles the problem of endogenous link prediction for Knowledge Base completion.
The recent proliferation of knowledge graphs (KGs), coupled with their incompleteness in the form of missing relations (links) between entities, has fueled considerable research on knowledge base completion (also known as relation prediction).
#2 best model for Link Prediction on FB15k-237
This framework is independent of the concrete form of generator and discriminator, and therefore can utilize a wide variety of knowledge graph embedding models as its building blocks.
Knowledge bases (KBs) of real-world facts about entities and their relationships are useful resources for a variety of natural language processing tasks.
We present KBLRN, a framework for end-to-end learning of knowledge base representations from latent, relational, and numerical features.
This 3-column matrix is then fed to a convolution layer, where multiple filters are applied to the matrix to generate different feature maps.
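The convolution over the 3-column matrix can be sketched as follows; the embedding dimension, the number of filters, and the random values are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
k = 4  # embedding dimension (illustrative value)

# Subject, relation, and object embeddings stacked into a k x 3 matrix.
e_s, e_r, e_o = rng.normal(size=(3, k))
M = np.stack([e_s, e_r, e_o], axis=1)  # shape (k, 3)

# Each 1x3 filter slides over the rows of M, so every filter
# produces one feature map of length k.
filters = rng.normal(size=(2, 3))                 # two filters (arbitrary count)
feature_maps = np.maximum(M @ filters.T, 0.0).T   # ReLU; shape (2, k)

print(feature_maps.shape)  # (2, 4)
```

In the full model these feature maps would be concatenated and scored by a dense layer; the sketch stops at the feature-map stage.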
#17 best model for Link Prediction on WN18RR
The recent graph convolutional network (GCN) provides another way of learning graph node embeddings by successfully utilizing graph connectivity structure.
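How a GCN layer exploits connectivity can be sketched as one symmetric-normalized propagation step; the 4-node graph, feature matrix, and weights below are illustrative assumptions.

```python
import numpy as np

# One GCN layer: H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)   # adjacency of a 4-node path graph
A_hat = A + np.eye(4)                        # add self-loops
d = A_hat.sum(axis=1)
D_inv_sqrt = np.diag(d ** -0.5)
A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt     # symmetric normalization

rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))                  # initial node features (assumed)
W = rng.normal(size=(3, 2))                  # layer weights (assumed)
H_next = np.maximum(A_norm @ H @ W, 0.0)     # propagate neighbors, then ReLU

print(H_next.shape)  # (4, 2)
```

Each node's new embedding mixes its own features with its neighbors', which is how connectivity structure enters the learned representation.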