Search Results for author: Alessio Figalli

Found 5 papers, 3 papers with code

A Two-Scale Complexity Measure for Deep Learning Models

no code implementations • 17 Jan 2024 • Massimiliano Datres, Gian Paolo Leonardi, Alessio Figalli, David Sutter

We introduce a novel capacity measure 2sED for statistical models based on the effective dimension.

Infinite-width limit of deep linear neural networks

1 code implementation • 29 Nov 2022 • Lénaïc Chizat, Maria Colombo, Xavier Fernández-Real, Alessio Figalli

We finally study the continuous-time limit obtained for infinitely wide linear neural networks and show that the linear predictors of the neural network converge at an exponential rate to the minimal $\ell_2$-norm minimizer of the risk.
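The convergence of the network's linear predictor to the minimal $\ell_2$-norm minimizer can be illustrated numerically. A minimal sketch, not the paper's code: the problem sizes, learning rate, and small balanced initialization below are illustrative assumptions. It trains a two-layer linear network by gradient descent on an underdetermined least-squares problem and compares the resulting predictor with the pseudoinverse (minimal-norm) solution.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, width = 5, 10, 50                    # underdetermined: fewer samples than features
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

# Minimal l2-norm risk minimizer: the pseudoinverse least-squares solution.
beta_min = np.linalg.pinv(X) @ y

# Two-layer linear network f(x) = x @ W1 @ W2 with a small initialization,
# so the implicit bias toward the minimal-norm predictor is visible.
scale = 1e-2
W1 = scale * rng.standard_normal((d, width))
W2 = scale * rng.standard_normal((width, 1))

lr = 0.05
for _ in range(20000):
    r = X @ W1 @ W2 - y[:, None]           # residual, shape (n, 1)
    gW1 = X.T @ r @ W2.T / n               # gradient of (1/2n)||X W1 W2 - y||^2
    gW2 = W1.T @ X.T @ r / n
    W1 -= lr * gW1
    W2 -= lr * gW2

beta_net = (W1 @ W2).ravel()
print(np.linalg.norm(X @ beta_net - y))        # residual: near zero
print(np.linalg.norm(beta_net - beta_min))     # distance to min-norm solution: small
```

The sketch only probes the implicit bias; the paper's result concerns the infinite-width, continuous-time limit, where the exponential rate is established.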

Effective dimension of machine learning models

1 code implementation • 9 Dec 2021 • Amira Abbas, David Sutter, Alessio Figalli, Stefan Woerner

Making statements about the performance of trained models on new data, i.e., understanding the generalization power of a model, is one of the primary goals of machine learning.


The power of quantum neural networks

2 code implementations • 30 Oct 2020 • Amira Abbas, David Sutter, Christa Zoufal, Aurélien Lucchi, Alessio Figalli, Stefan Woerner

We show that quantum neural networks are able to achieve a significantly better effective dimension than comparable classical neural networks.


A scale-dependent notion of effective dimension

no code implementations • 29 Jan 2020 • Oksana Berezniuk, Alessio Figalli, Raffaele Ghigliazza, Kharen Musaelian

We introduce a notion of "effective dimension" of a statistical model based on the number of cubes of size $1/\sqrt{n}$ needed to cover the model space when endowed with the Fisher Information Matrix as metric, $n$ being the number of observations.
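This covering-number definition has a smooth, computable counterpart used in the effective-dimension papers listed above: a volume average of $\sqrt{\det(\mathrm{Id} + \kappa\, F(\theta))}$ over parameter space, with a scale factor $\kappa$ growing roughly like $n/\log n$. A minimal sketch, with two stated assumptions: the Fisher matrices are synthetic stand-ins rather than the output of a trained model, and the normalization $\kappa = n/(2\pi \log n)$ is one common choice.

```python
import numpy as np

def effective_dimension(fishers, n):
    """Monte Carlo estimate of the scale-dependent effective dimension.
    fishers: array of shape (k, d, d), Fisher matrices sampled over parameter space;
    n: number of observations."""
    d = fishers.shape[-1]
    kappa = n / (2 * np.pi * np.log(n))    # scale factor (assumed normalization)
    # log sqrt(det(Id + kappa * F)) for each sampled parameter; slogdet is
    # numerically stable and supports stacked (batched) matrices.
    _, logdet = np.linalg.slogdet(np.eye(d) + kappa * fishers)
    half = 0.5 * logdet
    # Log of the volume average, computed as a stable log-sum-exp.
    m = half.max()
    log_avg = m + np.log(np.mean(np.exp(half - m)))
    return 2 * log_avg / np.log(kappa)

# Toy check: if the Fisher matrix has rank 2 everywhere, the effective
# dimension approaches 2 for large n, regardless of the ambient dimension d = 4.
F = np.diag([1.0, 1.0, 0.0, 0.0])
fishers = np.repeat(F[None, :, :], 100, axis=0)
print(effective_dimension(fishers, n=10**6))   # close to 2
```

The toy check captures the point of the definition: directions in which the Fisher matrix is degenerate contribute no $1/\sqrt{n}$-cubes to the cover, so they do not count toward the dimension.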
