Graph Representation Learning
376 papers with code • 1 benchmark • 6 datasets
The goal of Graph Representation Learning is to construct a set of features (‘embeddings’) representing the structure of the graph and the data thereon. We can distinguish among node-wise embeddings, representing each node of the graph; edge-wise embeddings, representing each edge in the graph; and graph-wise embeddings, representing the graph as a whole.
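The three embedding granularities above can be illustrated with a minimal, self-contained sketch. This is not any particular GRL model; it uses a toy 4-node cycle graph and a single neighbor-averaging step (a common GNN-style smoothing operation) to produce node-wise embeddings, then derives edge-wise and graph-wise embeddings from them. All names and numbers here are illustrative assumptions.

```python
import numpy as np

# Toy graph (assumed for illustration): 4 nodes in a cycle, edges as (u, v) pairs.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
num_nodes = 4

# Adjacency matrix with self-loops, row-normalized: one propagation step
# averages each node's features with its neighbors'.
A = np.eye(num_nodes)
for u, v in edges:
    A[u, v] = A[v, u] = 1.0
A = A / A.sum(axis=1, keepdims=True)

# Random initial node features stand in for learned or raw attributes.
rng = np.random.default_rng(0)
X = rng.normal(size=(num_nodes, 8))

# Node-wise embeddings: one vector per node.
node_emb = A @ X

# Edge-wise embeddings: here, the concatenation of the two endpoint embeddings.
edge_emb = np.array([np.concatenate([node_emb[u], node_emb[v]]) for u, v in edges])

# Graph-wise embedding: mean-pool all node embeddings into a single vector.
graph_emb = node_emb.mean(axis=0)

print(node_emb.shape, edge_emb.shape, graph_emb.shape)  # (4, 8) (4, 16) (8,)
```

Real GRL methods replace the single averaging step with learned, multi-layer encoders, but the three output granularities map onto the distinction drawn above.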
Libraries
Use these libraries to find Graph Representation Learning models and implementations.
Latest papers with no code
Negative Sampling in Knowledge Graph Representation Learning: A Review
This comprehensive survey paper systematically reviews various negative sampling (NS) methods and their contributions to the success of KGRL.
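For readers unfamiliar with the term, the most basic form of negative sampling in knowledge graph representation learning is uniform corruption: replace the head or tail of a true (head, relation, tail) triple with a random entity, rejecting candidates that are themselves known positives. The sketch below assumes a hypothetical toy knowledge graph; the survey reviews many more sophisticated NS strategies than this one.

```python
import random

# Hypothetical toy knowledge graph of (head, relation, tail) triples.
triples = [("paris", "capital_of", "france"),
           ("berlin", "capital_of", "germany"),
           ("rhine", "flows_through", "germany")]
entities = sorted({h for h, _, _ in triples} | {t for _, _, t in triples})
positives = set(triples)

def corrupt(triple, rng):
    """Uniform negative sampling: swap the head or the tail for a random
    entity, resampling until the corrupted triple is not a known positive."""
    h, r, t = triple
    while True:
        if rng.random() < 0.5:
            cand = (rng.choice(entities), r, t)   # corrupt the head
        else:
            cand = (h, r, rng.choice(entities))   # corrupt the tail
        if cand not in positives:
            return cand

rng = random.Random(42)
neg = corrupt(triples[0], rng)
print(neg)  # a corrupted triple that is not in the positive set
```

Embedding models such as TransE train by scoring true triples above such corrupted ones; the choice of NS strategy strongly affects the resulting embeddings, which is the subject of the survey.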
LocalGCL: Local-aware Contrastive Learning for Graphs
Graph representation learning (GRL) has made considerable progress recently, encoding graphs with topological structure into low-dimensional embeddings.
One-Shot Graph Representation Learning Using Hyperdimensional Computing
We present a novel, simple, fast, and efficient approach for semi-supervised learning on graphs.
Adversarial Curriculum Graph Contrastive Learning with Pair-wise Augmentation
To address this challenge, we propose an innovative framework: Adversarial Curriculum Graph Contrastive Learning (ACGCL), which capitalizes on the merits of pair-wise augmentation to engender graph-level positive and negative samples with controllable similarity, alongside subgraph contrastive learning to discern effective graph patterns therein.
Understanding Survey Paper Taxonomy about Large Language Models via Graph Representation Learning
As research on Large Language Models (LLMs) continues to accelerate, it is difficult to keep up with new papers and models.
Position Paper: Challenges and Opportunities in Topological Deep Learning
Topological deep learning (TDL) is a rapidly evolving field that uses topological features to understand and design deep learning models.
ExGRG: Explicitly-Generated Relation Graph for Self-Supervised Representation Learning
Self-supervised Learning (SSL) has emerged as a powerful technique in pre-training deep learning models without relying on expensive annotated labels, instead leveraging embedded signals in unlabeled data.
On provable privacy vulnerabilities of graph representations
Graph representation learning (GRL) is critical for extracting insights from complex network structures, but it also raises security concerns due to potential privacy vulnerabilities in these representations.
EXGC: Bridging Efficiency and Explainability in Graph Condensation
Graph representation learning on vast datasets, like web data, has made significant strides.
A Graph is Worth $K$ Words: Euclideanizing Graph using Pure Transformer
Despite recent GNN and Graphformer efforts to encode graphs as Euclidean vectors, recovering the original graph from those vectors remains a challenge.