A Generalization of ViT/MLP-Mixer to Graphs

27 Dec 2022  ·  Xiaoxin He, Bryan Hooi, Thomas Laurent, Adam Perold, Yann LeCun, Xavier Bresson

Graph Neural Networks (GNNs) have shown great potential in the field of graph representation learning. Standard GNNs define a local message-passing mechanism that propagates information over the whole graph domain by stacking multiple layers. This paradigm suffers from two major limitations, over-squashing and poor capture of long-range dependencies, which can be addressed with global attention but only at the price of quadratic computational complexity. In this work, we propose an alternative approach to overcome these structural limitations by leveraging the ViT/MLP-Mixer architectures introduced in computer vision. We introduce a new class of GNNs, called Graph ViT/MLP-Mixer, that hold three key properties. First, they capture long-range dependencies and mitigate over-squashing, as demonstrated on the Long Range Graph Benchmark and TreeNeighbourMatch datasets. Second, they offer better speed and memory efficiency, with complexity linear in the number of nodes and edges, surpassing related Graph Transformer and expressive GNN models. Third, they show high expressivity in terms of graph isomorphism, as they can distinguish at least 3-WL non-isomorphic graphs. We test our architecture on 4 simulated datasets and 7 real-world benchmarks, and show highly competitive results on all of them. The source code is available for reproducibility at https://github.com/XiaoxinHe/Graph-ViT-MLPMixer.
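The linear-complexity claim comes from replacing quadratic global self-attention with MLP-Mixer-style token/channel mixing over a fixed number of graph patches (the paper extracts patches via graph partitioning and encodes each with a small GNN). Below is a minimal sketch of such a mixer block in PyTorch; the `MixerBlock` name, the layer sizes, and the assumption that per-patch embeddings are already computed are illustrative, not the authors' exact implementation (see the linked repository for that).

```python
import torch
import torch.nn as nn

class MixerBlock(nn.Module):
    """One MLP-Mixer block over a batch of graph-patch embeddings.

    Input shape: (batch, num_patches, dim). Token mixing exchanges
    information across patches (giving long-range interaction at
    linear cost); channel mixing is a per-patch feature MLP.
    """
    def __init__(self, num_patches, dim, token_hidden=64, channel_hidden=128):
        super().__init__()
        self.token_norm = nn.LayerNorm(dim)
        self.token_mlp = nn.Sequential(
            nn.Linear(num_patches, token_hidden), nn.GELU(),
            nn.Linear(token_hidden, num_patches),
        )
        self.channel_norm = nn.LayerNorm(dim)
        self.channel_mlp = nn.Sequential(
            nn.Linear(dim, channel_hidden), nn.GELU(),
            nn.Linear(channel_hidden, dim),
        )

    def forward(self, x):                        # x: (B, P, D)
        # Token mixing: transpose so the MLP runs along the patch axis.
        y = self.token_norm(x).transpose(1, 2)   # (B, D, P)
        x = x + self.token_mlp(y).transpose(1, 2)
        # Channel mixing: plain MLP over the feature axis, per patch.
        x = x + self.channel_mlp(self.channel_norm(x))
        return x

# Illustrative usage: 32 patches per graph, 64-dim patch embeddings
# (shapes chosen arbitrarily for the sketch).
patches = torch.randn(8, 32, 64)                 # (batch, patches, dim)
block = MixerBlock(num_patches=32, dim=64)
graph_emb = block(patches).mean(dim=1)           # (8, 64) graph-level readout
```

Mean-pooling the mixed patch embeddings into a single graph representation is one simple readout choice; any permutation-invariant pooling would fit the same pipeline.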


Results from the Paper


Task                  Dataset          Model          Metric  Metric Value     Global Rank
Graph Classification  Peptides-func    Graph ViT      AP      0.6942 ± 0.0075  #3
Graph Classification  Peptides-func    GraphMLPMixer  AP      0.6921 ± 0.0054  #4
Graph Regression      Peptides-struct  Graph ViT      MAE     0.2449 ± 0.0016  #1
Graph Regression      Peptides-struct  GraphMLPMixer  MAE     0.2475 ± 0.0015  #7
Graph Regression      ZINC             GraphMLPMixer  MAE     0.075 ± 0.001    #9
