no code implementations • 21 Apr 2023 • Matthieu Cordonnier, Nicolas Keriven, Nicolas Tremblay, Samuel Vaiter
We study the convergence of message passing graph neural networks on random graph models to their continuous counterparts as the number of nodes tends to infinity.
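As a purely illustrative sketch of this kind of large-graph convergence (not the paper's construction — the layer, graph model, and feature choice here are all assumptions), one can run a single mean-aggregation message passing step on dense Erdős–Rényi graphs of increasing size and watch the node outputs concentrate:

```python
import numpy as np

def mean_aggregation_layer(A, X):
    # One message passing step: every node averages its neighbors'
    # features, then applies a pointwise nonlinearity.
    deg = A.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0  # guard isolated nodes
    return np.tanh(A @ X / deg)

rng = np.random.default_rng(0)
spread = {}
for n in [100, 400, 1600]:
    # Dense Erdos-Renyi graph G(n, 1/2) (hypothetical choice of model)
    A = np.triu((rng.random((n, n)) < 0.5).astype(float), 1)
    A = A + A.T
    X = rng.random((n, 1))  # i.i.d. uniform initial node features
    out = mean_aggregation_layer(A, X)
    spread[n] = out.std()   # dispersion of node outputs
    print(n, float(out.mean()), float(out.std()))
```

The standard deviation of the node outputs shrinks as $n$ grows, since each neighborhood average approaches the population mean of the feature distribution — a toy instance of discrete message passing approaching a deterministic continuous limit.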
no code implementations • 23 Oct 2020 • Yuuki Takai, Akiyoshi Sannai, Matthieu Cordonnier
The classical approach to measure the expressive power of deep neural networks with piecewise linear activations is based on counting their maximum number of linear regions.
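The region-counting approach mentioned above can be illustrated numerically (a minimal sketch under assumed parameters, not the paper's method): for a one-hidden-layer ReLU network on a scalar input, each hidden unit contributes at most one breakpoint, so the function has at most (number of units + 1) linear regions, which we can count by tracking activation patterns along a grid:

```python
import numpy as np

rng = np.random.default_rng(1)
# One hidden layer of 5 ReLU units on a scalar input:
# f(x) = sum_j v_j * relu(w_j * x + b_j)  (weights chosen at random)
w = rng.standard_normal(5)
b = rng.standard_normal(5)

def activation_pattern(x):
    # Which ReLU units are active at input x; the function is affine
    # on any interval where this pattern is constant.
    return tuple((w * x + b > 0).astype(int))

xs = np.linspace(-10.0, 10.0, 100001)
patterns = [activation_pattern(x) for x in xs]
# Each pattern change along the grid starts a new linear region.
regions = 1 + sum(p != q for p, q in zip(patterns, patterns[1:]))
print(regions)
```

With 5 hidden units the count can never exceed 6, matching the classical bound of (units + 1) regions in one input dimension.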
no code implementations • 5 Mar 2019 • Akiyoshi Sannai, Yuuki Takai, Matthieu Cordonnier
In this paper, we develop a theory of the relationship between $G$-invariant/equivariant functions and deep neural networks for a finite group $G$.
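For the special case of the symmetric group, a standard way to build a permutation-invariant network is sum pooling over the set axis (a Deep Sets-style sketch, offered only as an illustration of invariance — it is not claimed to be this paper's construction, and the weights here are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
W1 = rng.standard_normal((3, 4))  # per-element feature map
W2 = rng.standard_normal((4, 1))  # readout after pooling

def relu(z):
    return np.maximum(z, 0.0)

def invariant_net(X):
    # Sum pooling over axis 0 (the set axis) makes the output
    # invariant under any permutation of the rows of X.
    return float(relu(X @ W1).sum(axis=0) @ W2)

X = rng.standard_normal((6, 3))
perm = rng.permutation(6)
same = np.isclose(invariant_net(X), invariant_net(X[perm]))
print(same)
```

Replacing the sum with a per-row map and skipping the pooling would instead give a permutation-*equivariant* layer: permuting the input rows permutes the output rows the same way.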