28 Jun 2023 • Clarissa Lauditi, Emanuele Troiani, Marc Mézard
In recent years, statistical physics has proven to be a valuable tool for probing high-dimensional inference problems such as those arising in machine learning.
18 May 2023 • Brandon Livio Annesi, Clarissa Lauditi, Carlo Lucibello, Enrico M. Malatesta, Gabriele Perugini, Fabrizio Pittorino, Luca Saglietti
Empirical studies of neural network loss landscapes have shown that low-energy configurations often lie in complex connected structures, in which zero-energy paths can be constructed between pairs of distant solutions.
29 Mar 2023 • Matteo Negri, Clarissa Lauditi, Gabriele Perugini, Carlo Lucibello, Enrico Malatesta
The Hopfield model is a paradigmatic model of neural networks that has been analyzed for many decades in the statistical physics, neuroscience, and machine learning communities.
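As background for this entry, the standard Hopfield setup stores binary patterns via the Hebbian rule and retrieves them by iterated sign updates. A minimal sketch (sizes, seed, and the 10% corruption level are illustrative choices, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 5  # N spins, P stored patterns (load alpha = P/N = 0.05)
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian coupling matrix with zero self-interaction
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0)

# start from pattern 0 with 10% of spins flipped
state = patterns[0].copy()
flipped = rng.choice(N, size=N // 10, replace=False)
state[flipped] *= -1

# asynchronous zero-temperature dynamics until a fixed point
for _ in range(20):
    prev = state.copy()
    for i in rng.permutation(N):
        state[i] = 1 if W[i] @ state >= 0 else -1
    if np.array_equal(state, prev):
        break

overlap = (state @ patterns[0]) / N  # overlap near 1 means successful recall
```

At this low load (alpha well below the classical retrieval threshold of about 0.138), the corrupted configuration typically relaxes back to the stored pattern.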
1 Oct 2021 • Carlo Baldassi, Clarissa Lauditi, Enrico M. Malatesta, Rosalba Pacelli, Gabriele Perugini, Riccardo Zecchina
Current deep neural networks are highly overparameterized (up to billions of connection weights) and nonlinear.
2 Jul 2021 • Carlo Baldassi, Clarissa Lauditi, Enrico M. Malatesta, Gabriele Perugini, Riccardo Zecchina
The success of deep learning has revealed the potential of neural network applications across the sciences and has opened up fundamental theoretical problems.