Knowledge Base Completion
64 papers with code • 0 benchmarks • 2 datasets
Knowledge base completion is the task of automatically inferring missing facts by reasoning about the information already present in a knowledge base. A knowledge base is a collection of relational facts, often represented as (subject, relation, object) triples.
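For illustration, here is a minimal Python sketch of a knowledge base as a set of triples, with one hand-written rule that infers a missing fact. The facts and the rule are invented for this example; real completion systems score candidate triples with learned models rather than fixed rules.

```python
# A knowledge base as a set of (subject, relation, object) triples.
# All facts below are illustrative.
kb = {
    ("Paris", "capital_of", "France"),
    ("France", "located_in", "Europe"),
}

# Hand-written rule: capital_of(x, y) & located_in(y, z) -> located_in(x, z)
inferred = {
    (x, "located_in", z)
    for (x, r1, y1) in kb if r1 == "capital_of"
    for (y2, r2, z) in kb if r2 == "located_in" and y2 == y1
}

print(inferred - kb)  # {('Paris', 'located_in', 'Europe')}
```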
Most implemented papers
Learning Attention-based Embeddings for Relation Prediction in Knowledge Graphs
The recent proliferation of knowledge graphs (KGs) coupled with incomplete or partial information, in the form of missing relations (links) between entities, has fueled a lot of research on knowledge base completion (also known as relation prediction).
Tensor Decompositions for temporal knowledge base completion
Additionally, we propose a new dataset for knowledge base completion constructed from Wikidata, larger than previous benchmarks by an order of magnitude, as a new reference for evaluating temporal and non-temporal link prediction methods.
Lossless Compression of Structured Convolutional Models via Lifting
The computation graphs themselves then reflect the symmetries of the underlying data, similarly to lifted graphical models.
Modeling Relation Paths for Representation Learning of Knowledge Bases
Representation learning of knowledge bases (KBs) aims to embed both entities and relations into a low-dimensional space.
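As a rough illustration of this embedding view, the sketch below uses a TransE-style score, where a triple (h, r, t) is plausible when h + r lies close to t, and composes a two-relation path by adding relation vectors (addition is one of the composition operations studied for relation paths). The embeddings are random placeholders, not learned parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 50

# Random placeholder embeddings; in practice these are learned.
h, t = rng.normal(size=dim), rng.normal(size=dim)
r1, r2 = rng.normal(size=dim), rng.normal(size=dim)

def transe_score(h, r, t):
    """TransE: (h, r, t) is plausible when h + r lies close to t."""
    return -np.linalg.norm(h + r - t, ord=1)

# A two-step relation path r1 -> r2 composed by vector addition,
# scored as if it were a single relation between h and t.
print(transe_score(h, r1 + r2, t))
```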
STransE: a novel embedding model of entities and relationships in knowledge bases
Knowledge bases of real-world facts about entities and their relationships are useful resources for a variety of natural language processing tasks.
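STransE scores a triple by passing the head and tail through two relation-specific matrices before a TransE-style translation, f_r(h, t) = ||W_{r,1} h + r - W_{r,2} t||. A minimal sketch with untrained random parameters standing in for learned ones:

```python
import numpy as np

rng = np.random.default_rng(1)
dim = 50

# Untrained random stand-ins for learned parameters.
h, t, r = (rng.normal(size=dim) for _ in range(3))
W_r1 = rng.normal(size=(dim, dim))  # relation-specific head matrix
W_r2 = rng.normal(size=(dim, dim))  # relation-specific tail matrix

# STransE scoring: ||W_{r,1} h + r - W_{r,2} t|| (lower is better;
# negated here so that higher scores mean more plausible triples).
score = -np.linalg.norm(W_r1 @ h + r - W_r2 @ t, ord=1)
print(score)
```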
Knowledge Transfer for Out-of-Knowledge-Base Entities: A Graph Neural Network Approach
Knowledge base completion (KBC) aims to predict missing information in a knowledge base. In this paper, we address the out-of-knowledge-base (OOKB) entity problem in KBC: how to answer queries concerning test entities not observed at training time.
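A simplified, untrained caricature of the neighbor-aggregation idea behind GNN-based OOKB models: the unseen entity's vector is built from its auxiliary triples by a TransE-style transition followed by mean pooling. Entity names and facts here are invented.

```python
import numpy as np

rng = np.random.default_rng(2)
dim = 50

# Embeddings of known entities and relations (random placeholders).
emb = {"France": rng.normal(size=dim)}
rel = {"capital_of": rng.normal(size=dim)}

def embed_ookb(neighbors):
    """Embed an unseen entity from (relation, known_entity) pairs.

    Each neighbor is mapped toward the new entity's position with a
    TransE-style transition (h ~= t - r), then mean-pooled.
    """
    msgs = [emb[e] - rel[r] for (r, e) in neighbors]
    return np.mean(msgs, axis=0)

# A hypothetical test-time entity linked to the graph by one triple:
h_new = embed_ookb([("capital_of", "France")])
print(h_new.shape)  # (50,)
```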
Fast Linear Model for Knowledge Graph Embeddings
This paper shows that a simple baseline based on a Bag-of-Words (BoW) representation learns surprisingly good knowledge graph embeddings.
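One way to realize a BoW entity representation (not necessarily this paper's exact recipe) is to describe each entity by the bag of (relation, neighbor) pairs it participates in, which a linear model can then map to an embedding. A toy sketch with invented facts:

```python
from collections import Counter

# Illustrative toy facts.
kb = [
    ("Paris", "capital_of", "France"),
    ("Paris", "located_in", "Europe"),
    ("Berlin", "capital_of", "Germany"),
]

def bow_features(entity):
    """Bag of (relation, neighbor) pairs from both triple directions."""
    out = Counter((r, o) for (s, r, o) in kb if s == entity)
    inc = Counter((r, s) for (s, r, o) in kb if o == entity)
    return out + inc

print(bow_features("Paris"))
# Counter({('capital_of', 'France'): 1, ('located_in', 'Europe'): 1})
```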
Revisiting Simple Neural Networks for Learning Representations of Knowledge Graphs
We address the problem of learning vector representations for entities and relations in Knowledge Graphs (KGs) for Knowledge Base Completion (KBC).
Interpretable and Compositional Relation Learning by Joint Training with an Autoencoder
Embedding models for entities and relations are extremely useful for recovering missing facts in a knowledge base.