Knowledge Graph Completion
208 papers with code • 7 benchmarks • 16 datasets
A knowledge graph $G$ is represented as a collection of triples $\{(h, r, t)\}\subseteq E\times R\times E$, where $E$ and $R$ are the entity set and relation set, respectively. The task of Knowledge Graph Completion is either to predict an unseen relation $r$ between two existing entities, $(h, ?, t)$, or to predict the tail entity $t$ given the head entity and the query relation, $(h, r, ?)$.
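A $(h, r, ?)$ query can be sketched with a translational embedding model such as TransE, which scores a triple by how close $h + r$ lands to $t$. The embeddings below are random toy values for illustration only, not tied to any paper on this page:

```python
import numpy as np

# Toy setup (hypothetical): 3 entities, 2 relations, embedding dimension 4.
rng = np.random.default_rng(0)
entity_emb = rng.normal(size=(3, 4))
relation_emb = rng.normal(size=(2, 4))

def score(h, r, t):
    """TransE score: -||h + r - t||. Higher means more plausible."""
    return -np.linalg.norm(entity_emb[h] + relation_emb[r] - entity_emb[t])

def predict_tail(h, r):
    """Answer a (h, r, ?) query by ranking every candidate tail entity."""
    scores = [score(h, r, t) for t in range(len(entity_emb))]
    return int(np.argmax(scores))

best_tail = predict_tail(0, 1)
print(best_tail)
```

In practice the candidate ranking is done over all entities in $E$, and evaluation reports the rank of the true tail (e.g. MRR, Hits@k).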
Libraries
Use these libraries to find Knowledge Graph Completion models and implementations.
Latest papers with no code
ConvD: Attention Enhanced Dynamic Convolutional Embeddings for Knowledge Graph Completion
In this paper, we propose a novel dynamic convolutional embedding model ConvD for knowledge graph completion, which directly reshapes the relation embeddings into multiple internal convolution kernels to improve the external convolution kernels of the traditional convolutional embedding model.
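The core idea of reshaping a relation embedding into convolution kernels that are then applied to the entity embedding can be sketched as follows. This is a minimal illustration of dynamic convolution, not the authors' ConvD implementation; all dimensions are hypothetical:

```python
import numpy as np

def dynamic_conv_features(h_emb, r_emb, num_kernels=2, kernel_size=3):
    """Reshape the relation embedding into convolution kernels and
    convolve them over the head-entity embedding (1-D, 'valid' mode).
    The resulting feature map would feed a downstream scoring layer."""
    kernels = r_emb.reshape(num_kernels, kernel_size)  # relation -> kernels
    feats = [np.convolve(h_emb, k, mode="valid") for k in kernels]
    return np.concatenate(feats)

h = np.arange(8, dtype=float)  # hypothetical entity embedding, dim 8
r = np.ones(6)                 # hypothetical relation embedding, dim 2*3
print(dynamic_conv_features(h, r).shape)
```

Because the kernels are generated from the relation, each relation effectively applies its own filter bank, unlike a static convolutional model where kernels are shared across all triples.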
Federated Knowledge Graph Completion via Latent Embedding Sharing and Tensor Factorization
To address these issues, we propose Federated Latent Embedding Sharing Tensor factorization (FLEST), a novel approach that uses federated tensor factorization for KG completion.
Does Pre-trained Language Model Actually Infer Unseen Links in Knowledge Graph Completion?
Knowledge Graph Completion (KGC) is a task that infers unseen relationships between entities in a KG.
Ensembling Textual and Structure-Based Models for Knowledge Graph Completion
We consider two popular approaches to Knowledge Graph Completion (KGC): textual models that rely on textual entity descriptions, and structure-based models that exploit the connectivity structure of the Knowledge Graph (KG).
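One simple way to ensemble a textual model and a structure-based model is a weighted combination of their normalized candidate scores. This is an illustrative baseline, not the ensembling method of the paper above; `alpha` is a hypothetical mixing weight:

```python
import numpy as np

def ensemble(text_scores, struct_scores, alpha=0.5):
    """Min-max-normalize each model's scores over the candidate set,
    then mix them with weight alpha (illustrative choice)."""
    def norm(s):
        s = np.asarray(s, dtype=float)
        span = s.max() - s.min()
        return (s - s.min()) / span if span > 0 else np.zeros_like(s)
    return alpha * norm(text_scores) + (1 - alpha) * norm(struct_scores)

# Scores for three candidate tails from each model:
combined = ensemble([0.2, 0.9, 0.4], [10.0, 3.0, 7.0], alpha=0.6)
print(int(np.argmax(combined)))
```

Normalization matters because the two models' raw scores typically live on different scales (e.g. log-probabilities vs. distances), so mixing them directly would let one model dominate.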
Unifying Structure and Language Semantic for Efficient Contrastive Knowledge Graph Completion with Structured Entity Anchors
Recently, pre-trained language model (PLM) based methods that utilize both textual and structural information have emerged, but their performance lags behind state-of-the-art (SOTA) structure-based methods, or they lose their inductive inference capabilities when fusing structure embeddings into the text encoder.
KERMIT: Knowledge Graph Completion of Enhanced Relation Modeling with Inverse Transformation
Knowledge graph completion is a task that revolves around filling in missing triples based on the information available in a knowledge graph.
On the Aggregation of Rules for Knowledge Graph Completion
Rule learning approaches for knowledge graph completion are efficient, interpretable and competitive to purely neural models.
MoCoSA: Momentum Contrast for Knowledge Graph Completion with Structure-Augmented Pre-trained Language Models
However, they struggle with semantically rich real-world entities due to limited structural information and fail to generalize to unseen entities.
Simple Rule Injection for ComplEx Embeddings
Recent works in neural knowledge graph inference attempt to combine logic rules with knowledge graph embeddings to benefit from prior knowledge.
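ComplEx scores a triple as $\mathrm{Re}(\langle h, r, \bar{t}\rangle)$ over complex-valued embeddings. One simple way to inject an implication rule $r_1(x,y)\Rightarrow r_2(x,y)$ is a soft penalty whenever the premise outscores the conclusion; this penalty is an illustrative choice, not the injection mechanism of the paper above:

```python
import numpy as np

def complex_score(h, r, t):
    """ComplEx triple score: Re(<h, r, conj(t)>)."""
    return float(np.real(np.sum(h * r * np.conj(t))))

def implication_penalty(h, r1, r2, t):
    """Hypothetical soft-rule loss for r1 -> r2: penalize triples where
    the premise relation scores higher than the conclusion relation."""
    return max(0.0, complex_score(h, r1, t) - complex_score(h, r2, t))

rng = np.random.default_rng(1)
dim = 4
h = rng.normal(size=dim) + 1j * rng.normal(size=dim)
r = rng.normal(size=dim) + 1j * rng.normal(size=dim)
t = rng.normal(size=dim) + 1j * rng.normal(size=dim)

print(implication_penalty(h, r, r, t))  # identical relations -> zero penalty
```

Adding such a term to the embedding training loss nudges the learned relations toward satisfying the rule without enforcing it as a hard constraint.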
Towards Semantically Enriched Embeddings for Knowledge Graph Completion
Most current algorithms treat a KG as a multi-directional labeled graph and cannot capture the semantics underlying its schematic information.