Search Results for author: Matthieu Cordonnier

Found 3 papers, 0 papers with code

Convergence of Message Passing Graph Neural Networks with Generic Aggregation On Large Random Graphs

no code implementations • 21 Apr 2023 • Matthieu Cordonnier, Nicolas Keriven, Nicolas Tremblay, Samuel Vaiter

We study the convergence of message passing graph neural networks on random graph models to their continuous counterpart as the number of nodes tends to infinity.
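The paper's setting is message passing with a generic (not necessarily sum) aggregation. A minimal sketch of one such layer, where the aggregation function is a swappable parameter (the combine weights and the `aggregate` default are illustrative choices, not the paper's construction):

```python
import numpy as np

def message_passing_layer(X, adj, aggregate=np.mean):
    """One round of message passing with a generic aggregation.

    Each node aggregates its neighbours' features with `aggregate`
    (mean, sum, max, ...) and mixes the result with its own feature.
    X: (n, d) node features; adj: (n, n) 0/1 adjacency matrix.
    Illustrative sketch only, not the paper's model.
    """
    n, d = X.shape
    out = np.empty_like(X, dtype=float)
    for i in range(n):
        neighbours = X[adj[i] == 1]
        if len(neighbours) == 0:
            agg = np.zeros(d)
        else:
            agg = aggregate(neighbours, axis=0)
        out[i] = 0.5 * X[i] + 0.5 * agg  # simple fixed combine step
    return out
```

Passing `np.sum` or `np.max` instead of `np.mean` changes the aggregation without touching the rest of the layer, which is the kind of genericity the title refers to.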

On the Number of Linear Functions Composing Deep Neural Network: Towards a Refined Definition of Neural Networks Complexity

no code implementations • 23 Oct 2020 • Yuuki Takai, Akiyoshi Sannai, Matthieu Cordonnier

The classical approach to measure the expressive power of deep neural networks with piecewise linear activations is based on counting their maximum number of linear regions.
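In the simplest case, a one-hidden-layer ReLU network on a 1-D input, the region count mentioned above can be computed exactly: each neuron contributes at most one breakpoint, and the regions are the intervals between distinct breakpoints. A toy sketch of this classical count (not the paper's refined complexity measure):

```python
def count_linear_regions_1d(W, b):
    """Count linear regions of x -> sum_i relu(W[i]*x + b[i]) on the real line.

    Each neuron relu(w*x + b) with w != 0 is linear except at the
    breakpoint x = -b/w; the function is piecewise linear with
    (#distinct breakpoints) + 1 regions.
    Toy illustration of the classical region-counting approach.
    """
    breakpoints = {-bi / wi for wi, bi in zip(W, b) if wi != 0.0}
    return len(breakpoints) + 1
```

With k neurons this gives at most k + 1 regions in 1-D; the bound grows much faster with depth and input dimension, which is what motivates region counting as an expressivity measure.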


Universal approximations of permutation invariant/equivariant functions by deep neural networks

no code implementations • 5 Mar 2019 • Akiyoshi Sannai, Yuuki Takai, Matthieu Cordonnier

In this paper, we develop a theory of the relationship between $G$-invariant/equivariant functions and deep neural networks for a finite group $G$.
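The best-known instance of this setup is $G = S_n$, the symmetric group: a permutation-invariant function of a set can be built by summing a per-element embedding and then applying an outer map (a DeepSets-style construction; the particular `phi` and `rho` below are illustrative, not from the paper):

```python
import numpy as np

def deep_sets(xs, phi=lambda t: np.array([t, t ** 2]), rho=np.sum):
    """Permutation-invariant model f(xs) = rho(sum_i phi(xs[i])).

    Summing the per-element embeddings phi(x_i) erases the input
    order, so any permutation of xs gives the same output.
    DeepSets-style sketch for the S_n-invariant case.
    """
    return rho(sum(phi(x) for x in xs))
```

Equivariant layers follow the same idea with a shared per-element map plus a pooled term, so the output permutes along with the input rather than ignoring the order.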
