42 papers with code • 0 benchmarks • 0 datasets
Knowledge base completion is the task of automatically inferring missing facts by reasoning about the information already present in the knowledge base. A knowledge base is a collection of relational facts, often represented as (subject, relation, object) triples.
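A minimal sketch of this setup, using hypothetical example facts: the knowledge base is a set of (subject, relation, object) triples, and completion means predicting plausible triples that are absent from the set.

```python
# Hypothetical toy knowledge base: a set of (subject, relation, object) triples.
kb = {
    ("Paris", "capital_of", "France"),
    ("France", "located_in", "Europe"),
    ("Berlin", "capital_of", "Germany"),
}

def known(subject, relation, obj):
    """Return True if the fact is already present in the knowledge base."""
    return (subject, relation, obj) in kb

# ("Germany", "located_in", "Europe") is a plausible but missing fact:
# it is not in the KB, so a completion model would be asked to score it.
candidate = ("Germany", "located_in", "Europe")
print(known(*candidate))  # → False
```

A completion model replaces the boolean lookup with a learned scoring function over candidate triples, ranking missing facts by plausibility.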
The recent proliferation of knowledge graphs (KGs) coupled with incomplete or partial information, in the form of missing relations (links) between entities, has fueled a lot of research on knowledge base completion (also known as relation prediction).
Ranked #1 on Knowledge Graph Completion on FB15k-237
Knowledge graphs (KGs) of real-world facts about entities and their relationships are useful resources for a variety of natural language processing tasks.
This paper tackles the problem of endogenous link prediction for knowledge base completion.
We present KBLRN, a framework for end-to-end learning of knowledge base representations from latent, relational, and numerical features.
This framework is independent of the concrete form of generator and discriminator, and therefore can utilize a wide variety of knowledge graph embedding models as its building blocks.
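A minimal sketch of the adversarial setup described above, with entirely hypothetical interfaces: the framework only assumes that the generator and the discriminator each expose a plausibility score for a triple, so any knowledge graph embedding model can fill either role.

```python
import random

def make_scorer(seed):
    """Stand-in for a KG embedding model's triple-scoring function."""
    rng = random.Random(seed)
    cache = {}
    def score(triple):
        # Deterministic random scores substitute for learned embeddings here.
        if triple not in cache:
            cache[triple] = rng.random()
        return cache[triple]
    return score

generator_score = make_scorer(0)       # proposes hard negative triples
discriminator_score = make_scorer(1)   # the embedding model being trained

positive = ("Paris", "capital_of", "France")
candidates = [("Berlin", "capital_of", "France"),
              ("Lyon", "capital_of", "France")]

# The generator picks the negative it considers most plausible, and the
# discriminator is trained to rank the true triple above that negative.
negative = max(candidates, key=generator_score)
margin = discriminator_score(positive) - discriminator_score(negative)
print(negative, margin)
```

The key design point is the narrow interface: because both components are just triple scorers, swapping in a different embedding model changes nothing else in the loop.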
Ranked #18 on Link Prediction on WN18
This 3-column matrix is then fed to a convolution layer, where multiple filters operate on the matrix to generate different feature maps.
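The step above can be sketched with NumPy, under assumed dimensions: the subject, relation, and object embeddings are stacked as the columns of a d x 3 matrix, and each 1 x 3 filter slides over its d rows, producing one feature map of length d per filter.

```python
import numpy as np

d = 4            # hypothetical embedding dimension (rows of the matrix)
n_filters = 3    # hypothetical number of 1x3 convolution filters

rng = np.random.default_rng(0)
h, r, t = rng.normal(size=(3, d))          # subject, relation, object embeddings
A = np.stack([h, r, t], axis=1)            # the d x 3 input matrix
filters = rng.normal(size=(n_filters, 3))  # each filter spans one full row

# Each filter is dotted with every row of A, yielding a length-d feature
# map per filter; a matrix product computes all of them at once.
feature_maps = A @ filters.T               # shape (d, n_filters)
features = np.maximum(feature_maps, 0)     # ReLU nonlinearity

# The feature maps are concatenated into a single vector, which a final
# weight vector reduces to a scalar plausibility score for the triple.
w = rng.normal(size=d * n_filters)
score = features.reshape(-1, order="F") @ w
print(feature_maps.shape)  # → (4, 3)
```

This mirrors the convolution-over-triples idea in spirit; the actual model learns the filters and weight vector jointly with the embeddings.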
Ranked #6 on Link Prediction on FB15k-237