Isomorphism Testing

8 papers with code • 0 benchmarks • 0 datasets

The goal is to test the expressive power of graph representation learning methods by relating them to graph isomorphism tests.
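
The reference point throughout is the Weisfeiler-Leman (WL) hierarchy. As a minimal, self-contained sketch (not taken from any of the repositories below; the helper names wl_colors and wl_indistinguishable are illustrative), 1-WL color refinement on a plain adjacency-dict graph looks roughly like this:

    # Minimal sketch of 1-WL color refinement, the classical isomorphism
    # test against which message-passing GNN expressiveness is measured.
    # Graphs are plain dicts mapping each node to its list of neighbours.
    from collections import Counter

    def wl_colors(adj, num_rounds=3, init=None):
        """Run a few rounds of color refinement and return the final node colors."""
        colors = dict(init) if init is not None else {v: 0 for v in adj}
        for _ in range(num_rounds):
            # each node's new color is its old color together with the
            # multiset of its neighbours' colors
            colors = {
                v: (colors[v],
                    tuple(sorted(Counter(colors[u] for u in adj[v]).items())))
                for v in adj
            }
        return colors

    def wl_indistinguishable(adj1, adj2, num_rounds=3):
        """Graphs that 1-WL cannot tell apart end up with identical color histograms."""
        h1 = Counter(wl_colors(adj1, num_rounds).values())
        h2 = Counter(wl_colors(adj2, num_rounds).values())
        return h1 == h2

    # A 6-cycle and two disjoint triangles are both 2-regular, so 1-WL
    # (and hence any standard message-passing GNN) cannot separate them.
    cycle6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
    two_triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1],
                     3: [4, 5], 4: [3, 5], 5: [3, 4]}
    print(wl_indistinguishable(cycle6, two_triangles))  # True

Note that matching color histograms only mean the test fails to refute isomorphism; many of the papers below study exactly which graphs and graph properties slip through this test.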

Most implemented papers

Transitivity-Preserving Graph Representation Learning for Bridging Local Connectivity and Role-based Similarity

nslab-cuk/unified-graph-transformer 18 Aug 2023

In this paper, we propose Unified Graph Transformer Networks (UGT) that effectively integrate local and global structural information into fixed-length vector representations.

On the equivalence between graph isomorphism testing and function approximation with GNNs

leichen2018/Ring-GNN NeurIPS 2019

We further develop a framework of the expressive power of GNNs that incorporates both of these viewpoints using the language of sigma-algebra, through which we compare the expressive power of different types of GNNs together with other graph isomorphism tests.

Can Graph Neural Networks Count Substructures?

leichen2018/GNN-Substructure-Counting NeurIPS 2020

We also prove positive results for k-WL and k-IGNs as well as negative results for k-WL with a finite number of iterations.
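
As a toy illustration of why substructure counting is a natural yardstick (a textbook linear-algebra identity, not code from the GNN-Substructure-Counting repository), the number of triangles in a simple undirected graph equals trace(A^3)/6, yet the 1-WL-equivalent pair below contains 0 and 2 triangles respectively, so no 1-WL-bounded GNN can count this substructure:

    # Illustrative triangle counting via powers of the adjacency matrix.
    import numpy as np

    def count_triangles(adj_matrix):
        """Triangles = trace(A^3) / 6 for a simple undirected graph."""
        a = np.asarray(adj_matrix, dtype=float)
        return int(round(np.trace(a @ a @ a) / 6))

    # A 6-cycle and two disjoint triangles are 1-WL-indistinguishable,
    # yet contain 0 and 2 triangles respectively.
    cycle6 = np.zeros((6, 6), dtype=int)
    for i in range(6):
        cycle6[i, (i + 1) % 6] = cycle6[(i + 1) % 6, i] = 1
    two_triangles = np.zeros((6, 6), dtype=int)
    for block in ([0, 1, 2], [3, 4, 5]):
        for i in block:
            for j in block:
                if i != j:
                    two_triangles[i, j] = 1
    print(count_triangles(cycle6), count_triangles(two_triangles))  # 0 2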

On Graph Neural Networks versus Graph-Augmented MLPs

leichen2018/GNN_vs_GAMLP ICLR 2021

From the perspective of expressive power, this work compares multi-layer Graph Neural Networks (GNNs) with a simplified alternative that we call Graph-Augmented Multi-Layer Perceptrons (GA-MLPs), which first augments node features with certain multi-hop operators on the graph and then applies an MLP in a node-wise fashion.
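
A rough sketch of the GA-MLP recipe, assuming the multi-hop operators are powers of the symmetrically normalized adjacency matrix (one common choice; the paper studies general operator families, and the helper names ga_mlp_features and node_wise_mlp are illustrative):

    # Augment each node's features with multi-hop propagated copies, then
    # apply a node-wise MLP; no message passing happens between MLP layers.
    import numpy as np

    def ga_mlp_features(adj, feats, num_hops=3):
        """Stack [X, AX, A^2 X, ...] with A the normalized adjacency (no isolated nodes assumed)."""
        a = np.asarray(adj, dtype=float)
        d_inv_sqrt = 1.0 / np.sqrt(a.sum(axis=1))
        a_norm = d_inv_sqrt[:, None] * a * d_inv_sqrt[None, :]  # D^-1/2 A D^-1/2
        blocks, propagated = [feats], feats
        for _ in range(num_hops):
            propagated = a_norm @ propagated
            blocks.append(propagated)
        return np.concatenate(blocks, axis=1)

    def node_wise_mlp(x, w1, w2):
        """Each node is processed independently from its augmented feature vector."""
        return np.maximum(x @ w1, 0.0) @ w2

    rng = np.random.default_rng(0)
    adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]])
    feats = rng.normal(size=(3, 4))
    x = ga_mlp_features(adj, feats, num_hops=2)            # shape (3, 12)
    out = node_wise_mlp(x, rng.normal(size=(12, 16)), rng.normal(size=(16, 2)))
    print(out.shape)  # (3, 2)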

Weisfeiler and Leman Go Infinite: Spectral and Combinatorial Pre-Colorings

tpfi22/spectral-and-combinatorial 31 Jan 2022

Two popular alternatives that offer a good trade-off between expressive power and computational efficiency are combinatorial invariants (i.e., those obtained via the Weisfeiler-Leman (WL) test) and spectral invariants.
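
One way to picture a spectral pre-coloring (an illustrative choice, not necessarily the invariant used in the paper; spectral_precoloring is a hypothetical helper) is to color each node by a rounded diagonal entry of a matrix function of the graph Laplacian, which is permutation-equivariant:

    # Illustrative spectral pre-coloring: rounded diagonal entries of the
    # heat kernel exp(-L).  This is one permutation-equivariant spectral
    # node invariant; the paper's pre-colorings may differ.
    import numpy as np

    def spectral_precoloring(adj, decimals=6):
        a = np.asarray(adj, dtype=float)
        lap = np.diag(a.sum(axis=1)) - a                         # combinatorial Laplacian
        eigvals, eigvecs = np.linalg.eigh(lap)
        heat = eigvecs @ np.diag(np.exp(-eigvals)) @ eigvecs.T   # exp(-L)
        # rounding makes the continuous invariant usable as a discrete color
        return {v: round(float(heat[v, v]), decimals) for v in range(len(a))}

For a graph on nodes 0..n-1, the resulting dictionary could be passed as the init argument of the wl_colors sketch near the top of this page, so refinement starts from a partition at least as fine as the uniform one.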

Gradual Weisfeiler-Leman: Slow and Steady Wins the Race

frareba/gradualweisfeilerleman 19 Sep 2022

The classical Weisfeiler-Leman algorithm, also known as color refinement, is fundamental for graph learning with kernels and neural networks.

A Practical, Progressively-Expressive GNN

lingxiaoshawn/kcsetgnn 18 Oct 2022

Our model is practical and progressively-expressive, increasing in power with k and c. We demonstrate effectiveness on several benchmark datasets, achieving state-of-the-art results with runtime and memory usage suitable for practical graphs.

PlanE: Representation Learning over Planar Graphs

zzysonny/plane NeurIPS 2023

Graph neural networks are prominent models for representation learning over graphs. The idea is to iteratively compute representations of the nodes of an input graph through a series of transformations, in such a way that the learned graph function is isomorphism invariant and the learned representations are therefore graph invariants.
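
As a generic illustration of this invariance property (a plain sum-aggregation layer with hypothetical helpers mp_layer and graph_readout, not PlanE's architecture), relabeling the nodes of a graph leaves a sum-pooled readout unchanged:

    # Sum aggregation plus sum readout yields an isomorphism-invariant
    # graph function: permuting the nodes does not change the output.
    import numpy as np

    def mp_layer(adj, feats, w_self, w_neigh):
        """h_v <- relu(W_self x_v + W_neigh * sum of neighbour features)."""
        return np.maximum(feats @ w_self + adj @ feats @ w_neigh, 0.0)

    def graph_readout(adj, feats, w_self, w_neigh, num_layers=2):
        h = feats
        for _ in range(num_layers):
            h = mp_layer(adj, h, w_self, w_neigh)
        return h.sum(axis=0)                     # order-independent pooling

    rng = np.random.default_rng(0)
    adj = np.array([[0, 1, 1, 0], [1, 0, 1, 0],
                    [1, 1, 0, 1], [0, 0, 1, 0]], dtype=float)
    feats = rng.normal(size=(4, 3))
    w_self, w_neigh = rng.normal(size=(3, 3)), rng.normal(size=(3, 3))

    perm = np.array([2, 0, 3, 1])                # relabel the nodes
    p = np.eye(4)[perm]
    adj_p, feats_p = p @ adj @ p.T, p @ feats    # the isomorphic copy
    out = graph_readout(adj, feats, w_self, w_neigh)
    out_p = graph_readout(adj_p, feats_p, w_self, w_neigh)
    print(np.allclose(out, out_p))  # True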