Knowledge Graph Completion
201 papers with code • 7 benchmarks • 16 datasets
A knowledge graph $G$ is represented as a collection of triples $\{(h, r, t)\}\subseteq E\times R\times E$, where $E$ and $R$ are the entity set and relation set, respectively. The task of Knowledge Graph Completion is to either predict a missing relation $r$ between two existing entities, $(h, ?, t)$, or to predict the missing tail entity $t$ given the head entity and the query relation, $(h, r, ?)$.
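As a concrete illustration of tail prediction $(h, r, ?)$, here is a minimal sketch using a TransE-style scoring function over learned embeddings. The embedding matrices, dimensions, and query indices below are illustrative assumptions, not taken from any particular benchmark or library.

```python
# Minimal sketch of tail prediction (h, r, ?) with TransE-style scoring.
# All sizes and indices here are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(0)
num_entities, num_relations, dim = 100, 10, 32

E = rng.normal(size=(num_entities, dim))   # entity embeddings (would be learned)
R = rng.normal(size=(num_relations, dim))  # relation embeddings (would be learned)

def score_tails(h: int, r: int) -> np.ndarray:
    """Score every candidate tail t with -||E[h] + R[r] - E[t]||.

    Higher is better, so the predicted completion is the argmax."""
    return -np.linalg.norm(E[h] + R[r] - E, axis=1)

# Rank all entities as candidate tails for the query (h=0, r=3, ?).
scores = score_tails(0, 3)
top_tails = np.argsort(-scores)[:5]
print(top_tails)
```

In practice the embeddings are trained on the observed triples, and evaluation ranks the true tail among all candidates (e.g., mean reciprocal rank, Hits@k).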
Latest papers
The Power of Noise: Toward a Unified Multi-modal Knowledge Graph Representation Framework
In this work, to evaluate models' ability to accurately embed entities within MMKGs, we focus on two widely researched tasks: Multi-modal Knowledge Graph Completion (MKGC) and Multi-modal Entity Alignment (MMEA).
Counterfactual Reasoning with Knowledge Graph Embeddings
We further observe that KGEs adapted with COULDD reliably detect plausible counterfactual changes to the graph that follow these patterns.
Multi-perspective Improvement of Knowledge Graph Completion with Large Language Models
Knowledge graph completion (KGC) is a widely used method to tackle incompleteness in knowledge graphs (KGs) by making predictions for missing links.
Unleashing the Power of Imbalanced Modality Information for Multi-modal Knowledge Graph Completion
To address these problems, we propose Adaptive Multi-modal Fusion and Modality Adversarial Training (AdaMF-MAT) to unleash the power of imbalanced modality information for MMKGC.
UrbanKGent: A Unified Large Language Model Agent Framework for Urban Knowledge Graph Construction
Urban knowledge graphs have recently emerged as a building block for distilling critical knowledge from multi-sourced urban data for diverse urban application scenarios.
Knowledge Graphs Meet Multi-Modal Learning: A Comprehensive Survey
In this survey, we carefully review over 300 articles, focusing on KG-aware research in two principal aspects: KG-driven Multi-Modal (KG4MM) learning, where KGs support multi-modal tasks, and Multi-Modal Knowledge Graph (MM4KG), which extends KG studies into the MMKG realm.
KICGPT: Large Language Model with Knowledge in Context for Knowledge Graph Completion
Knowledge Graph Completion (KGC) is crucial for addressing knowledge graph incompleteness and supporting downstream applications.
Progressive Distillation Based on Masked Generation Feature Method for Knowledge Graph Completion
This paper proposes a progressive distillation method based on masked generation features for the KGC task, aiming to significantly reduce the complexity of pre-trained models.
Prompting Disentangled Embeddings for Knowledge Graph Completion with Pre-trained Language Model
Accordingly, we propose a new KGC method named PDKGC with two prompts -- a hard task prompt which adapts the KGC task to the PLM pre-training task of token prediction, and a disentangled structure prompt which learns disentangled graph representations, enabling the PLM to combine more relevant structural knowledge with the text information.
Increasing Coverage and Precision of Textual Information in Multilingual Knowledge Graphs
Recent work in Natural Language Processing and Computer Vision has been using textual information -- e.g., entity names and descriptions -- available in knowledge graphs to ground neural models to high-quality structured data.