no code implementations • 28 May 2024 • Francesco Cagnetta, Matthieu Wyart
We conjecture that the relationship between training set size and effective range of correlations holds beyond our synthetic datasets.
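A minimal numerical sketch of the quantity this conjecture concerns: the empirical correlation between tokens at distance t, estimated from n training samples, and the largest distance at which it remains resolvable above the sampling-noise floor ~ 1/sqrt(n). The synthetic dataset below is purely illustrative and is not the paper's generative model.

```python
import numpy as np

rng = np.random.default_rng(0)

def token_correlation(seqs: np.ndarray, t: int) -> float:
    """Absolute Pearson correlation between token values at distance t."""
    a, b = seqs[:, :-t], seqs[:, t:]
    return abs(np.corrcoef(a.ravel(), b.ravel())[0, 1])

def sample_sequences(n: int, length: int = 64, vocab: int = 8, p: float = 0.7):
    """Toy sequences with short-range structure: each token copies its left
    neighbour with probability p, otherwise is drawn uniformly at random."""
    seqs = np.empty((n, length), dtype=int)
    seqs[:, 0] = rng.integers(vocab, size=n)
    for i in range(1, length):
        copy = rng.random(n) < p
        seqs[:, i] = np.where(copy, seqs[:, i - 1], rng.integers(vocab, size=n))
    return seqs

for n in (100, 1_000, 10_000):
    seqs = sample_sequences(n)
    noise_floor = 1.0 / np.sqrt(n)
    # largest distance whose correlation still exceeds the sampling noise floor
    t_max = max((t for t in range(1, 32) if token_correlation(seqs, t) > noise_floor),
                default=0)
    print(f"n = {n:6d}   resolvable correlation range ~ {t_max}")
```

In this toy setting the resolvable range grows with the training set size n, which is the kind of relationship the conjecture extrapolates to real datasets.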
1 code implementation • 5 Jul 2023 • Francesco Cagnetta, Leonardo Petrini, Umberto M. Tomasini, Alessandro Favero, Matthieu Wyart
The model is a classification task in which each input is generated from a group of high-level features, chosen among several equivalent groups associated with its class.
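A toy sketch of a generative process of this kind (illustrative only, not the authors' implementation): a class label expands into one of several equivalent tuples of high-level features, and each feature expands in turn into one of several equivalent tuples of lower-level features, down to the input. All parameter values below are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

n_classes = 4   # number of class labels
v = 8           # vocabulary size at every level
s = 2           # branching factor (features per expansion)
m = 3           # number of equivalent expansions ("synonyms") per symbol
depth = 3       # number of expansion levels

# One random production rule per level: each symbol maps to m alternative s-tuples.
rules = [rng.integers(v, size=(v, m, s)) for _ in range(depth)]

def generate(label: int) -> np.ndarray:
    """Expand a class label down the hierarchy into a string of low-level features."""
    symbols = np.array([label])
    for rule in rules:
        choices = rng.integers(m, size=symbols.shape[0])  # pick one synonym per symbol
        symbols = rule[symbols, choices].reshape(-1)      # expand and flatten
    return symbols

labels = rng.integers(n_classes, size=5)
X = np.stack([generate(y) for y in labels])
print(X.shape)  # (5, s**depth): each input is a string of s**depth low-level features
```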
no code implementations • 5 Jul 2023 • Francesco Cagnetta, Deborah Oliveira, Mahalakshmi Sabanayagam, Nikolaos Tsilivis, Julia Kempe
Lecture notes from the course given by Professor Julia Kempe at the summer school "Statistical physics of Machine Learning" in Les Houches.
1 code implementation • 4 Oct 2022 • Umberto M. Tomasini, Leonardo Petrini, Francesco Cagnetta, Matthieu Wyart
Here, we (i) show empirically for various architectures that stability to image diffeomorphisms is achieved by both spatial and channel pooling, (ii) introduce a model scale-detection task which reproduces our empirical observations on spatial pooling and (iii) compute analytically how the sensitivity to diffeomorphisms and noise scales with depth due to spatial pooling.
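A minimal numerical sketch of the quantity at stake, under simplifying assumptions (toy random "images", a single smooth deformation, plain average pooling): the displacement of a representation under a small diffeomorphism relative to its displacement under white noise of the same norm, with and without spatial pooling.

```python
import numpy as np
from scipy.ndimage import map_coordinates

rng = np.random.default_rng(0)
n, size, k = 50, 32, 8
images = rng.standard_normal((n, size, size))

# Smooth displacement field (lowest non-trivial Fourier mode), small amplitude.
xx, yy = np.meshgrid(np.arange(size), np.arange(size), indexing="ij")
disp = 0.5 * np.sin(2 * np.pi * xx / size) * np.sin(2 * np.pi * yy / size)

def deform(img):
    """Apply the smooth deformation by bilinear interpolation."""
    return map_coordinates(img, [xx + disp, yy + disp], order=1, mode="reflect")

def avg_pool(batch, k=k):
    """Non-overlapping k x k average pooling."""
    h = batch.shape[-1] // k
    return batch.reshape(len(batch), h, k, h, k).mean(axis=(2, 4))

deformed = np.stack([deform(im) for im in images])
# White noise rescaled per image to match the norm of the deformation perturbation.
noise_dir = rng.standard_normal(images.shape)
scale = (np.linalg.norm(deformed - images, axis=(1, 2), keepdims=True)
         / np.linalg.norm(noise_dir, axis=(1, 2), keepdims=True))
noisy = images + scale * noise_dir

def relative_sensitivity(f):
    """Displacement under the diffeomorphism divided by displacement under noise."""
    return (np.linalg.norm(f(deformed) - f(images))
            / np.linalg.norm(f(noisy) - f(images)))

print("no pooling  :", relative_sensitivity(lambda x: x))  # ~1 by construction
print("avg pooling :", relative_sensitivity(avg_pool))     # typically < 1 in this toy setting
```

Even in this crude setting, spatial pooling tends to damp smooth deformations more than noise of equal norm, which is the qualitative effect the analytical computation makes precise.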
1 code implementation • 1 Aug 2022 • Francesco Cagnetta, Alessandro Favero, Matthieu Wyart
Interestingly, we find that, despite their hierarchical structure, the functions generated by infinitely-wide deep CNNs are too rich to be efficiently learnable in high dimension.
1 code implementation • 24 Jun 2022 • Leonardo Petrini, Francesco Cagnetta, Eric Vanden-Eijnden, Matthieu Wyart
It is widely believed that the success of deep networks lies in their ability to learn a meaningful representation of the features of the data.
no code implementations • NeurIPS 2021 • Alessandro Favero, Francesco Cagnetta, Matthieu Wyart
Convolutional neural networks perform a local and translationally-invariant treatment of the data: quantifying which of these two aspects is central to their success remains a challenge.
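A schematic 1-D sketch (not the paper's setup) that separates the two ingredients: locality, where each output depends only on a small patch, and translation invariance, where the same weights are reused at every location. The parameter counts make the distinction concrete.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 16, 3                      # input size, patch/filter size
x = rng.standard_normal(d)

# Fully connected: neither local nor weight-shared.
W_fc = rng.standard_normal((d, d))
y_fc = W_fc @ x

# Locally connected: local patches, but independent weights at each location.
W_lc = rng.standard_normal((d - k + 1, k))
y_lc = np.array([W_lc[i] @ x[i:i + k] for i in range(d - k + 1)])

# Convolutional: local patches AND the same filter reused at every location.
w_conv = rng.standard_normal(k)
y_conv = np.array([w_conv @ x[i:i + k] for i in range(d - k + 1)])

print("parameters:", W_fc.size, W_lc.size, w_conv.size)  # 256, 42, 3
```

Comparing the locally connected map with the convolutional one isolates the effect of weight sharing (translation invariance) from that of locality, which is the kind of decomposition the question above calls for.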