Search Results for author: James Rowbottom

Found 5 papers, 5 papers with code

Understanding convolution on graphs via energies

2 code implementations · 22 Jun 2022 · Francesco Di Giovanni, James Rowbottom, Benjamin P. Chamberlain, Thomas Markovich, Michael M. Bronstein

We do so by showing that linear graph convolutions with symmetric weights minimize a multi-particle energy that generalizes the Dirichlet energy; in this setting, the weight matrices induce edge-wise attraction (repulsion) through their positive (negative) eigenvalues, thereby controlling whether the features are being smoothed or sharpened.
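The summary above can be illustrated with a toy sketch: on a small graph, a linear convolution step with a symmetric scalar weight decreases the Dirichlet energy when the weight is positive (smoothing) and increases it when the weight is negative (sharpening). The normalization, step size, and the specific update rule below are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

# Toy graph: a 3-node path, with symmetrically normalized adjacency
# D^{-1/2} A D^{-1/2} (an assumed, common normalization).
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
deg = A.sum(axis=1)
A_norm = A / np.sqrt(np.outer(deg, deg))

X = np.array([[1.0], [0.0], [-1.0]])  # node features (3 nodes, 1 channel)

def dirichlet_energy(X, A_norm):
    """tr(X^T L X) with L the normalized Laplacian: measures feature smoothness."""
    L = np.eye(len(A_norm)) - A_norm
    return float(np.trace(X.T @ L @ X))

def conv_step(X, A_norm, w, tau=0.5):
    """One linear graph-convolution step with symmetric (here scalar) weight w.

    A positive w acts as edge-wise attraction (energy decreases, smoothing);
    a negative w acts as repulsion (energy increases, sharpening).
    """
    return X + tau * w * (A_norm @ X - X)

e0 = dirichlet_energy(X, A_norm)
e_smooth = dirichlet_energy(conv_step(X, A_norm, w=+0.5), A_norm)
e_sharp = dirichlet_energy(conv_step(X, A_norm, w=-0.5), A_norm)
```

Running this, `e_smooth < e0 < e_sharp`, matching the attraction/repulsion picture the abstract describes.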

Inductive Bias · Node Classification

Equivariant Mesh Attention Networks

1 code implementation · 21 May 2022 · Sourya Basu, Jose Gallego-Posada, Francesco Viganò, James Rowbottom, Taco Cohen

Equivariance to symmetries has proven to be a powerful inductive bias in deep learning research.

Inductive Bias

Graph-Coupled Oscillator Networks

1 code implementation · 4 Feb 2022 · T. Konstantin Rusch, Benjamin P. Chamberlain, James Rowbottom, Siddhartha Mishra, Michael M. Bronstein

This demonstrates that the proposed framework mitigates the oversmoothing problem.

Beltrami Flow and Neural Diffusion on Graphs

1 code implementation · NeurIPS 2021 · Benjamin Paul Chamberlain, James Rowbottom, Davide Eynard, Francesco Di Giovanni, Xiaowen Dong, Michael M. Bronstein

We propose a novel class of graph neural networks based on the discretised Beltrami flow, a non-Euclidean diffusion PDE.

GRAND: Graph Neural Diffusion

1 code implementation · NeurIPS Workshop DLDE 2021 · Benjamin Paul Chamberlain, James Rowbottom, Maria Gorinova, Stefan Webb, Emanuele Rossi, Michael M. Bronstein

We present Graph Neural Diffusion (GRAND) that approaches deep learning on graphs as a continuous diffusion process and treats Graph Neural Networks (GNNs) as discretisations of an underlying PDE.
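A minimal sketch of the idea of GNN layers as PDE discretizations, using plain explicit Euler on the graph diffusion equation dX/dt = (A − I)X with a fixed row-stochastic matrix A; GRAND itself learns an attention-based A and supports more sophisticated solvers, so everything below is an assumed simplification.

```python
import numpy as np

def diffusion_euler(X, A, tau=0.1, n_steps=50):
    """Run n_steps of explicit Euler on dX/dt = (A - I) X.

    Each step corresponds to one 'layer' of a residual message-passing
    network: X <- X + tau * (A @ X - X).
    """
    for _ in range(n_steps):
        X = X + tau * (A @ X - X)
    return X

# Toy graph: two mutually connected nodes, row-normalized adjacency.
A = np.array([[0.5, 0.5],
              [0.5, 0.5]])
X0 = np.array([[1.0], [0.0]])

X_T = diffusion_euler(X0, A, tau=0.1, n_steps=50)
# Diffusion drives the node features toward their common average.
```

The stacking-depth/step-size correspondence is what lets the continuous view trade network depth for integration time.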

Graph Learning
