Knowledge Graph Completion

208 papers with code • 7 benchmarks • 16 datasets

A knowledge graph $G$ is represented as a collection of triples $\{(h, r, t)\} \subseteq E \times R \times E$, where $E$ and $R$ are the entity set and the relation set, respectively. The task of Knowledge Graph Completion is to either predict an unseen relation $r$ between two existing entities, $(h, ?, t)$, or to predict the tail entity $t$ given the head entity and the query relation, $(h, r, ?)$.

Source: One-Shot Relational Learning for Knowledge Graphs
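
As a concrete illustration of the tail-prediction setting $(h, r, ?)$, the sketch below scores every candidate tail entity with a TransE-style function. The toy graph and the scoring model are illustrative assumptions, not taken from any paper listed on this page.

```python
# Minimal sketch of (h, r, ?) tail prediction with a TransE-style score.
# The tiny toy graph and the untrained embeddings are assumptions for illustration.
import torch

entities = ["paris", "france", "berlin", "germany"]
relations = ["capital_of"]
ent_id = {e: i for i, e in enumerate(entities)}
rel_id = {r: i for i, r in enumerate(relations)}

dim = 16
ent_emb = torch.nn.Embedding(len(entities), dim)
rel_emb = torch.nn.Embedding(len(relations), dim)

def score(h, r, t):
    """TransE score: -||h + r - t||_1; higher means more plausible."""
    return -torch.norm(ent_emb.weight[h] + rel_emb.weight[r] - ent_emb.weight[t], p=1)

# Tail prediction (h, r, ?): rank every entity as a candidate tail.
h, r = ent_id["paris"], rel_id["capital_of"]
scores = torch.stack([score(h, r, t) for t in range(len(entities))])
print(entities[int(scores.argmax())])  # best-scoring candidate (untrained, so arbitrary)
```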

Latest papers with no code

ConvD: Attention Enhanced Dynamic Convolutional Embeddings for Knowledge Graph Completion

no code yet • 11 Dec 2023

In this paper, we propose ConvD, a novel dynamic convolutional embedding model for knowledge graph completion that directly reshapes relation embeddings into multiple internal convolution kernels, improving on the external convolution kernels of traditional convolutional embedding models.
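
The snippet below is a rough sketch of the dynamic-convolution idea described above: a relation embedding is reshaped into convolution kernels and applied to the (reshaped) head-entity embedding. All shapes and layer sizes are assumptions for illustration and do not reproduce the actual ConvD architecture.

```python
# Hedged sketch: relation embedding reshaped into dynamic convolution kernels.
import torch
import torch.nn.functional as F

dim_e, k, n_kernels = 64, 3, 8          # entity dim, kernel size, kernels per relation (assumed)
dim_r = n_kernels * k * k               # relation dim chosen so it reshapes into kernels

h = torch.randn(1, 1, 8, 8)             # head embedding reshaped into a 1-channel "image"
r = torch.randn(dim_r)                  # relation embedding

kernels = r.view(n_kernels, 1, k, k)    # relation vector -> dynamic conv kernels
feat = F.conv2d(h, kernels, padding=1)  # (1, n_kernels, 8, 8) feature maps
score_vec = feat.flatten(1)             # flatten for a downstream scoring layer
print(score_vec.shape)                  # torch.Size([1, 512])
```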

Federated Knowledge Graph Completion via Latent Embedding Sharing and Tensor Factorization

no code yet • 17 Nov 2023

To address these issues, we propose a novel method, Federated Latent Embedding Sharing Tensor factorization (FLEST), which uses federated tensor factorization for KG completion.
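
The following is only a schematic sketch of the general federated pattern the abstract hints at: clients train on local triples and share latent entity embeddings, which a server averages, while other factors stay local. It is an assumption for illustration, not FLEST's actual algorithm.

```python
# Hedged sketch of federated latent-embedding sharing (illustrative, not FLEST itself).
import torch

num_entities, dim, num_clients = 100, 32, 3

def local_update(shared_ent):
    """One client round: copy the shared entity embeddings, train locally
    (real factorization step omitted), and return the updated local copy."""
    local_ent = shared_ent.clone()
    local_ent += 0.01 * torch.randn_like(local_ent)  # stand-in for a real local training step
    return local_ent

shared_ent = torch.randn(num_entities, dim)
for _ in range(5):
    updates = [local_update(shared_ent) for _ in range(num_clients)]
    shared_ent = torch.stack(updates).mean(dim=0)    # server aggregates the shared embeddings
```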

Does Pre-trained Language Model Actually Infer Unseen Links in Knowledge Graph Completion?

no code yet • 15 Nov 2023

Knowledge Graph Completion (KGC) is a task that infers unseen relationships between entities in a KG.

Ensembling Textual and Structure-Based Models for Knowledge Graph Completion

no code yet • 7 Nov 2023

We consider two popular approaches to Knowledge Graph Completion (KGC): textual models that rely on textual entity descriptions, and structure-based models that exploit the connectivity structure of the Knowledge Graph (KG).
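
One simple way to combine the two families is score-level ensembling: normalize each model's scores over the candidate set and mix them with a weight. The sketch below shows that pattern; the softmax normalization and the weight `alpha` are assumptions for illustration, and the paper may use a different combination rule.

```python
# Hedged sketch of score-level ensembling of a textual and a structure-based KGC model.
import torch

def ensemble(scores_text, scores_struct, alpha=0.5):
    """Normalize each model's candidate scores, then take a weighted mixture."""
    p_text = torch.softmax(scores_text, dim=-1)
    p_struct = torch.softmax(scores_struct, dim=-1)
    return alpha * p_text + (1 - alpha) * p_struct

scores_text = torch.randn(4)     # one score per candidate tail from the textual model
scores_struct = torch.randn(4)   # one score per candidate tail from the structural model
print(ensemble(scores_text, scores_struct).argmax())  # ensembled top candidate
```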

Unifying Structure and Language Semantic for Efficient Contrastive Knowledge Graph Completion with Structured Entity Anchors

no code yet • 7 Nov 2023

Recently, pre-trained language model (PLM) based methods that use both textual and structural information have emerged, but their performance lags behind state-of-the-art (SOTA) structure-based methods, or they lose their inductive inference capability when fusing structure embeddings into the text encoder.
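
A common fusion pattern in this line of work concatenates a PLM's text representation of an entity with a pretrained structure embedding and projects the result before scoring. The sketch below shows that pattern under assumed dimensions; it is not the architecture proposed in the paper above.

```python
# Hedged sketch: fuse a PLM text vector with a structure embedding via a learned projection.
import torch
import torch.nn as nn

class FusionScorer(nn.Module):
    def __init__(self, dim_text=768, dim_struct=200, dim=256):
        super().__init__()
        self.proj = nn.Linear(dim_text + dim_struct, dim)  # fuse text + structure

    def forward(self, text_emb, struct_emb, tail_emb):
        fused = self.proj(torch.cat([text_emb, struct_emb], dim=-1))
        return (fused * tail_emb).sum(-1)                  # dot-product score vs candidate tails

scorer = FusionScorer()
text_emb = torch.randn(1, 768)      # e.g. a [CLS] vector from a frozen PLM
struct_emb = torch.randn(1, 200)    # e.g. a pretrained TransE entity embedding
tails = torch.randn(5, 256)         # candidate tail representations
print(scorer(text_emb, struct_emb, tails).shape)  # torch.Size([5])
```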

KERMIT: Knowledge Graph Completion of Enhanced Relation Modeling with Inverse Transformation

no code yet • 26 Sep 2023

Knowledge graph completion is a task that revolves around filling in missing triples based on the information available in a knowledge graph.

On the Aggregation of Rules for Knowledge Graph Completion

no code yet • 1 Sep 2023

Rule learning approaches for knowledge graph completion are efficient, interpretable, and competitive with purely neural models.
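
Two standard ways to aggregate the confidences of all rules that predict the same candidate triple are max-aggregation and noisy-or, sketched below with made-up rule confidences; the paper above studies such aggregation strategies in detail.

```python
# Hedged sketch of two common rule-aggregation strategies for ranking a candidate triple.
def max_aggregation(confidences):
    """Score a candidate by the single most confident rule that fires."""
    return max(confidences)

def noisy_or(confidences):
    """Treat each firing rule as independent evidence for the candidate."""
    prod = 1.0
    for c in confidences:
        prod *= (1.0 - c)
    return 1.0 - prod

rule_confidences = [0.6, 0.4, 0.3]        # confidences of rules predicting the same tail
print(max_aggregation(rule_confidences))  # 0.6
print(noisy_or(rule_confidences))         # 1 - 0.4*0.6*0.7 = 0.832 (up to float rounding)
```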

MoCoSA: Momentum Contrast for Knowledge Graph Completion with Structure-Augmented Pre-trained Language Models

no code yet • 16 Aug 2023

However, they struggle with semantically rich real-world entities due to limited structural information and fail to generalize to unseen entities.
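
The momentum-contrast ingredient named in the title usually means a key encoder maintained as a slow exponential moving average of the query encoder, as in MoCo. The sketch below shows that update rule in isolation; applying it to KGC encoders this way is an illustrative assumption, not MoCoSA's exact training procedure.

```python
# Hedged sketch of a momentum (EMA) key encoder, as used in momentum contrast.
import torch
import torch.nn as nn

query_enc = nn.Linear(128, 64)
key_enc = nn.Linear(128, 64)
key_enc.load_state_dict(query_enc.state_dict())  # start both encoders from the same weights

@torch.no_grad()
def momentum_update(m=0.999):
    """EMA update: key parameters slowly track the query parameters."""
    for pq, pk in zip(query_enc.parameters(), key_enc.parameters()):
        pk.mul_(m).add_(pq, alpha=1 - m)

x = torch.randn(8, 128)
q, k = query_enc(x), key_enc(x)   # query and key views of a batch
momentum_update()                 # called after each optimizer step on query_enc
```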

Simple Rule Injection for ComplEx Embeddings

no code yet • 7 Aug 2023

Recent works in neural knowledge graph inference attempt to combine logic rules with knowledge graph embeddings to benefit from prior knowledge.
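
For reference, the ComplEx scoring function that such work builds on is $\mathrm{Re}(\langle h, r, \bar{t} \rangle)$ with complex-valued embeddings; the sketch below writes it out directly. The rule-injection mechanism itself is not reproduced here, only the standard base scorer.

```python
# The standard ComplEx triple score: Re(sum_i h_i * r_i * conj(t_i)).
import torch

dim = 32
h = torch.randn(dim, dtype=torch.cfloat)   # complex head embedding
r = torch.randn(dim, dtype=torch.cfloat)   # complex relation embedding
t = torch.randn(dim, dtype=torch.cfloat)   # complex tail embedding

def complex_score(h, r, t):
    """Real part of the trilinear product of head, relation, and conjugated tail."""
    return torch.real((h * r * torch.conj(t)).sum())

print(complex_score(h, r, t))
```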

Towards Semantically Enriched Embeddings for Knowledge Graph Completion

no code yet • 31 Jul 2023

Most current algorithms consider a KG as a multi-directional labeled graph and lack the ability to capture the semantics underlying its schematic information.