Graph Representation Learning

375 papers with code • 1 benchmark • 6 datasets

The goal of Graph Representation Learning is to construct a set of features ('embeddings') representing the structure of the graph and the data on it. We can distinguish among node-wise embeddings (one representation per node), edge-wise embeddings (one per edge), and graph-wise embeddings (a single representation for the graph as a whole).
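The three kinds of embeddings can be illustrated with a minimal NumPy sketch (a toy example, not any particular paper's method): node embeddings from one round of mean-neighbor aggregation, an edge embedding by concatenating its endpoints, and a graph embedding by mean-pooling.

```python
import numpy as np

# Toy 4-node undirected graph as an adjacency matrix (illustrative only;
# real pipelines use a graph library such as PyTorch Geometric or DGL).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.default_rng(0).normal(size=(4, 8))  # initial node features

# Node-wise embedding: each node averages its neighbors' features --
# the simplest form of message passing.
deg = A.sum(axis=1, keepdims=True)
node_emb = (A @ X) / deg

# Edge-wise embedding: e.g. concatenate the two endpoint embeddings.
edge_emb = np.concatenate([node_emb[0], node_emb[1]])  # edge (0, 1)

# Graph-wise embedding: e.g. mean-pool all node embeddings.
graph_emb = node_emb.mean(axis=0)
```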

Source: SIGN: Scalable Inception Graph Neural Networks

Most implemented papers

QA-GNN: Reasoning with Language Models and Knowledge Graphs for Question Answering

michiyasunaga/qagnn NAACL 2021

The problem of answering questions using knowledge from pre-trained language models (LMs) and knowledge graphs (KGs) presents two challenges: given a QA context (question and answer choice), methods need to (i) identify relevant knowledge from large KGs, and (ii) perform joint reasoning over the QA context and KG.

Do Transformers Really Perform Bad for Graph Representation?

Microsoft/Graphormer 9 Jun 2021

Our key insight to utilizing Transformer in the graph is the necessity of effectively encoding the structural information of a graph into the model.
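One simple way to inject structural information, in the spirit of this line of work, is a degree-based centrality encoding added to node features before the Transformer layers. The sketch below is a simplification; the lookup table would be learned, and the full model uses further structural encodings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: adjacency matrix and initial node features (illustrative only).
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=int)
X = rng.normal(size=(3, 16))

# Centrality encoding: add a per-degree embedding to each node's features.
# The table here is randomly initialized; in practice it is learned.
max_degree = 8
degree_table = rng.normal(size=(max_degree + 1, 16))
deg = A.sum(axis=1)
X_in = X + degree_table[deg]
```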

Effect of Choosing Loss Function when Using T-batching for Representation Learning on Dynamic Networks

erfanloghmani/effect-of-loss-function-tbatching 13 Aug 2023

These findings underscore the efficacy of the proposed loss functions in dynamic network modeling.

Hyperbolic Neural Networks

dalab/hyperbolic_nn NeurIPS 2018

However, the representational power of hyperbolic geometry is not yet on par with Euclidean geometry, mostly because of the absence of corresponding hyperbolic neural network layers.
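A basic building block of such hyperbolic layers is the exponential map at the origin of the Poincaré ball, which carries Euclidean (tangent) vectors into hyperbolic space. Below is a sketch of the standard formula, not the paper's full layer definitions.

```python
import numpy as np

def expmap0(v, c=1.0, eps=1e-7):
    """Exponential map at the origin of the Poincare ball of curvature -c.

    Maps a Euclidean tangent vector into the ball; the image always lies
    strictly inside the ball of radius 1/sqrt(c).
    """
    norm = np.maximum(np.linalg.norm(v, axis=-1, keepdims=True), eps)
    sqrt_c = np.sqrt(c)
    return np.tanh(sqrt_c * norm) * v / (sqrt_c * norm)

x = expmap0(np.array([3.0, 4.0]))
```

The direction of the input vector is preserved; only its length is squashed toward the boundary of the ball.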

Deep Graph Contrastive Representation Learning

CRIPAC-DIG/GRACE 7 Jun 2020

Moreover, our unsupervised method even surpasses its supervised counterparts on transductive tasks, demonstrating its great potential in real-world applications.
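The contrastive objective behind such methods can be sketched as an InfoNCE-style loss between two augmented views of the same nodes: matching rows are positives, all other rows negatives. This is a simplified sketch of the family of objectives GRACE builds on, not its exact loss.

```python
import numpy as np

def info_nce(z1, z2, tau=0.5):
    """InfoNCE-style contrastive loss between two views of the same nodes.

    z1[i] and z2[i] are embeddings of node i under two augmentations;
    the diagonal of the similarity matrix holds the positive pairs.
    """
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / tau                      # pairwise cosine similarities
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))            # pull positives together

rng = np.random.default_rng(0)
z = rng.normal(size=(16, 32))
loss = info_nce(z, z)  # identical views: loss below the log(N) chance level
```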

Towards Deeper Graph Neural Networks

divelab/DeeperGNN 18 Jul 2020

Based on our theoretical and empirical analysis, we propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
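The core idea of decoupling feature transformation from propagation, then adaptively mixing depths, can be sketched as follows. Here the mixing weights are a shared vector for brevity; DAGNN itself learns per-node retention scores, so treat this as an assumption-laden simplification.

```python
import numpy as np

def adaptive_propagation(A_hat, h, weights):
    """Combine representations from propagation depths 0..K with weights.

    A_hat: normalized adjacency (with self-loops); h: already-transformed
    node features; weights: softmax scores over the K+1 depths.
    """
    outs = [h]
    for _ in range(len(weights) - 1):
        h = A_hat @ h
        outs.append(h)
    return sum(w * o for w, o in zip(weights, outs))

# Toy row-stochastic adjacency and one-hot features (illustrative only).
A_hat = np.array([[0.5, 0.5],
                  [0.5, 0.5]])
h0 = np.array([[1.0, 0.0],
               [0.0, 1.0]])
out = adaptive_propagation(A_hat, h0, weights=np.array([0.6, 0.3, 0.1]))
```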

Sub-graph Contrast for Scalable Self-Supervised Graph Representation Learning

yzjiao/Subg-Con 22 Sep 2020

Instead of learning on the complete input graph, Subg-Con uses a novel data augmentation strategy and learns node representations through a contrastive loss defined on subgraphs sampled from the original graph.

Large-Scale Representation Learning on Graphs via Bootstrapping

nerdslab/bgrl ICLR 2022

To address these challenges, we introduce Bootstrapped Graph Latents (BGRL) - a graph representation learning method that learns by predicting alternative augmentations of the input.
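The bootstrapped objective needs no negative samples: an online encoder (plus a predictor) tries to match a slowly-moving target encoder's embedding of an alternative augmentation, and the target trails the online parameters via an exponential moving average. A schematic sketch of this family of losses, not BGRL's exact implementation:

```python
import numpy as np

def bootstrap_loss(online_pred, target_emb):
    """Cosine-regression loss: push the online prediction toward the
    target embedding of another augmentation (0 when they align)."""
    p = online_pred / np.linalg.norm(online_pred, axis=1, keepdims=True)
    t = target_emb / np.linalg.norm(target_emb, axis=1, keepdims=True)
    return 2.0 - 2.0 * np.mean(np.sum(p * t, axis=1))

def ema_update(target_param, online_param, decay=0.99):
    """The 'bootstrap' step: target parameters are an exponential moving
    average of the online ones (decay value is illustrative)."""
    return decay * target_param + (1.0 - decay) * online_param

z = np.random.default_rng(0).normal(size=(4, 8))
```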

Towards a Unified Framework for Fair and Stable Graph Representation Learning

chirag126/nifty 25 Feb 2021

In this work, we establish a key connection between counterfactual fairness and stability and leverage it to propose a novel framework, NIFTY (uNIfying Fairness and stabiliTY), which can be used with any GNN to learn fair and stable representations.

Structure-Aware Transformer for Graph Representation Learning

borgwardtlab/sat 7 Feb 2022

Here, we show that the node representations generated by the Transformer with positional encoding do not necessarily capture structural similarity between them.