Search Results for author: Carlo Lucibello

Found 15 papers, 1 paper with code

The star-shaped space of solutions of the spherical negative perceptron

no code implementations18 May 2023 Brandon Livio Annesi, Clarissa Lauditi, Carlo Lucibello, Enrico M. Malatesta, Gabriele Perugini, Fabrizio Pittorino, Luca Saglietti

Empirical studies on the landscape of neural networks have shown that low-energy configurations are often found in complex connected structures, where zero-energy paths between pairs of distant solutions can be constructed.

Storage and Learning phase transitions in the Random-Features Hopfield Model

no code implementations29 Mar 2023 Matteo Negri, Clarissa Lauditi, Gabriele Perugini, Carlo Lucibello, Enrico Malatesta

The Hopfield model is a paradigmatic model of neural networks that has been analyzed for many decades in the statistical physics, neuroscience, and machine learning communities.

Retrieval
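The storage-and-retrieval mechanism referred to above can be sketched minimally. This is a generic textbook Hopfield network with Hebbian couplings and zero-temperature asynchronous dynamics, not the random-features variant analyzed in the paper; sizes and the corruption level are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 5  # neurons, stored patterns (load P/N well below capacity)

# Hebbian coupling matrix: J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu
xi = rng.choice([-1, 1], size=(P, N))
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

def retrieve(s, sweeps=20):
    """Asynchronous zero-temperature dynamics: align each spin with its local field."""
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1 if J[i] @ s >= 0 else -1
    return s

# corrupt pattern 0 by flipping 10% of its spins, then let the dynamics retrieve it
s0 = xi[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
s0[flip] *= -1
overlap = (retrieve(s0) @ xi[0]) / N
print(overlap)
```

At this low load the final overlap with the stored pattern is close to 1, i.e. the corrupted pattern is recovered.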

Noise-cleaning the precision matrix of fMRI time series

no code implementations6 Feb 2023 Miguel Ibáñez-Berganza, Carlo Lucibello, Francesca Santucci, Tommaso Gili, Andrea Gabrielli

We observe that the so-called Optimal Rotationally Invariant Estimator, based on Random Matrix Theory, leads to a significantly lower distance from the true precision matrix on synthetic data, and to a higher test likelihood on natural fMRI data.

Denoising, Time Series +1
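As a hedged illustration of rotationally invariant cleaning, the sketch below clips the noise bulk of the eigenvalue spectrum at the Marchenko-Pastur edge before inverting. This is one simple scheme in that family, not the Optimal Rotationally Invariant Estimator of the paper, and the synthetic correlation model is illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
N, T = 50, 200  # variables, time points; aspect ratio q = N/T

# synthetic "true" correlations: weak uniform coupling on top of the identity
C_true = 0.1 * np.ones((N, N)) + 0.9 * np.eye(N)
X = rng.multivariate_normal(np.zeros(N), C_true, size=T)
C_emp = np.corrcoef(X, rowvar=False)

# Marchenko-Pastur upper edge for pure-noise eigenvalues
q = N / T
lam_max = (1 + np.sqrt(q)) ** 2

# clip: replace eigenvalues below the noise edge by their average (trace-preserving)
w, V = np.linalg.eigh(C_emp)
noise = w < lam_max
w_clean = w.copy()
w_clean[noise] = w[noise].mean()
C_clean = V @ np.diag(w_clean) @ V.T

# compare distances to the true precision (inverse correlation) matrix
err_emp = np.linalg.norm(np.linalg.inv(C_emp) - np.linalg.inv(C_true))
err_clean = np.linalg.norm(np.linalg.inv(C_clean) - np.linalg.inv(C_true))
print(err_clean < err_emp)
```

Inversion amplifies the smallest empirical eigenvalues, which is exactly where the sampling noise lives; clipping the bulk therefore brings the precision matrix much closer to the truth.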

Information-theoretical analysis of the neural code for decoupled face representation

no code implementations22 Aug 2022 Miguel Ibáñez-Berganza, Carlo Lucibello, Luca Mariani, Giovanni Pezzulo

Processing faces accurately and efficiently is a key capability of humans and other animals that engage in sophisticated social tasks.

Deep learning via message passing algorithms based on belief propagation

no code implementations27 Oct 2021 Carlo Lucibello, Fabrizio Pittorino, Gabriele Perugini, Riccardo Zecchina

Message-passing algorithms based on the Belief Propagation (BP) equations constitute a well-known distributed computational scheme.

Continual Learning
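A minimal instance of such a scheme is sum-product BP on a tree, where the message equations are exact. The sketch below runs forward/backward messages on an Ising chain and checks the resulting marginals against brute-force enumeration; it is a toy illustration of the BP equations, not the paper's BP-based training of neural networks, and all parameters are illustrative.

```python
import numpy as np
from itertools import product

# ferromagnetic Ising chain: E(s) = -J * sum_i s_i s_{i+1} - h * sum_i s_i
N, J, h, beta = 6, 1.0, 0.3, 0.5

psi = np.exp(beta * J * np.outer([-1, 1], [-1, 1]))  # pair factor psi(s_i, s_j)
phi = np.exp(beta * h * np.array([-1, 1]))           # field factor phi(s_i)

# forward and backward messages along the chain (BP is exact on trees)
fwd = [np.ones(2)]
for _ in range(N - 1):
    m = psi.T @ (fwd[-1] * phi)
    fwd.append(m / m.sum())
bwd = [np.ones(2)]
for _ in range(N - 1):
    m = psi @ (bwd[-1] * phi)
    bwd.append(m / m.sum())
bwd = bwd[::-1]

# single-site marginals: product of incoming messages and the local field factor
marg = np.array([fwd[i] * phi * bwd[i] for i in range(N)])
marg /= marg.sum(axis=1, keepdims=True)

# check against brute-force enumeration of all 2^N configurations
Z_marg = np.zeros((N, 2))
for s in product([-1, 1], repeat=N):
    s = np.array(s)
    w = np.exp(beta * (J * (s[:-1] * s[1:]).sum() + h * s.sum()))
    for i in range(N):
        Z_marg[i, (s[i] + 1) // 2] += w
Z_marg /= Z_marg.sum(axis=1, keepdims=True)
print(np.allclose(marg, Z_marg))
```

On a loopy graph the same update equations are iterated to a fixed point and the marginals become approximate; the distributed, local character of the updates is what the snippet above makes concrete.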

Reconstruction of Pairwise Interactions using Energy-Based Models

no code implementations ICLR Workshop EBM 2021 Christoph Feinauer, Carlo Lucibello

Pairwise models like the Ising model or the generalized Potts model have found many successful applications in fields like physics, biology, and economics.
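The inverse problem behind such applications is to reconstruct the couplings from samples. As a hedged sketch of that task, the snippet below fits a small Ising model by pseudo-likelihood maximization (logistic regression of each spin on the others) — a classical baseline, not the energy-based-model approach of the paper; all sizes and learning parameters are illustrative.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(2)
N, M = 5, 5000  # spins, samples

# random symmetric ground-truth couplings (zero fields for simplicity)
J_true = rng.normal(0, 0.5, size=(N, N))
J_true = np.triu(J_true, 1)
J_true = J_true + J_true.T

# exact sampling: enumerate all 2^N states of the Boltzmann distribution
states = np.array(list(product([-1, 1], repeat=N)))
E = -0.5 * np.einsum("ki,ij,kj->k", states, J_true, states)
p = np.exp(-E); p /= p.sum()
S = states[rng.choice(len(states), size=M, p=p)]

# pseudo-likelihood gradient ascent: each spin regressed on its local field
J = np.zeros((N, N))
for _ in range(500):
    H = S @ J                      # local fields h_i = sum_j J_ij s_j, shape (M, N)
    grad = S.T @ (S - np.tanh(H))  # d log PL / dJ (up to symmetrization)
    grad = (grad + grad.T) / 2
    np.fill_diagonal(grad, 0.0)
    J += 0.1 / M * grad

iu = np.triu_indices(N, 1)
corr = np.corrcoef(J[iu], J_true[iu])[0, 1]
print(corr)
```

Pseudo-likelihood is consistent for pairwise models, so with enough samples the reconstructed couplings correlate strongly with the true ones.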

Clustering of solutions in the symmetric binary perceptron

no code implementations15 Nov 2019 Carlo Baldassi, Riccardo Della Vecchia, Carlo Lucibello, Riccardo Zecchina

The geometrical features of the (non-convex) loss landscape of neural network models are crucial in ensuring successful optimization and, most importantly, the capability to generalize well.

Clustering

Generalized Approximate Survey Propagation for High-Dimensional Estimation

no code implementations13 May 2019 Luca Saglietti, Yue M. Lu, Carlo Lucibello

In Generalized Linear Estimation (GLE) problems, we seek to estimate a signal that is observed through a linear transform followed by a component-wise, possibly nonlinear and noisy, channel.

Retrieval, Vocal Bursts Intensity Prediction
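The GLE forward model described in that sentence can be written down directly. The sketch below generates data through a random linear transform followed by a noisy sign channel and recovers the signal with a naive matched filter — a hedged toy baseline, not the Generalized Approximate Survey Propagation algorithm of the paper; dimensions and noise level are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
n, m = 100, 400  # signal dimension, number of measurements

# ground-truth binary signal and random linear transform
x = rng.choice([-1.0, 1.0], size=n)
F = rng.normal(0, 1 / np.sqrt(n), size=(m, n))

# component-wise noisy nonlinear channel: y = sign(Fx + noise)
y = np.sign(F @ x + 0.1 * rng.normal(size=m))

# naive plug-in estimate (matched filter); message-passing estimators do better
x_hat = np.sign(F.T @ y)
overlap = (x_hat @ x) / n
print(overlap)
```

Even this crude estimator achieves an overlap well above chance when m is several times n, which is why it serves as a useful sanity check before running a message-passing solver.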

Critical initialisation in continuous approximations of binary neural networks

no code implementations ICLR 2020 George Stamatescu, Federica Gerace, Carlo Lucibello, Ian Fuss, Langford B. White

Moreover, we predict theoretically and confirm numerically, that common weight initialisation schemes used in standard continuous networks, when applied to the mean values of the stochastic binary weights, yield poor training performance.

On the role of synaptic stochasticity in training low-precision neural networks

no code implementations26 Oct 2017 Carlo Baldassi, Federica Gerace, Hilbert J. Kappen, Carlo Lucibello, Luca Saglietti, Enzo Tartaglione, Riccardo Zecchina

Stochasticity and limited precision of synaptic weights in neural network models are key aspects of both biological and hardware modeling of learning processes.

Unreasonable Effectiveness of Learning Neural Networks: From Accessible States and Robust Ensembles to Basic Algorithmic Schemes

no code implementations20 May 2016 Carlo Baldassi, Christian Borgs, Jennifer Chayes, Alessandro Ingrosso, Carlo Lucibello, Luca Saglietti, Riccardo Zecchina

We define a novel measure, which we call the "robust ensemble" (RE), which suppresses trapping by isolated configurations and amplifies the role of these dense regions.

Learning may need only a few bits of synaptic precision

no code implementations12 Feb 2016 Carlo Baldassi, Federica Gerace, Carlo Lucibello, Luca Saglietti, Riccardo Zecchina

Learning in neural networks poses peculiar challenges when using discretized rather than continuous synaptic states.

Local entropy as a measure for sampling solutions in Constraint Satisfaction Problems

no code implementations18 Nov 2015 Carlo Baldassi, Alessandro Ingrosso, Carlo Lucibello, Luca Saglietti, Riccardo Zecchina

We introduce a novel Entropy-driven Monte Carlo (EdMC) strategy to efficiently sample solutions of random Constraint Satisfaction Problems (CSPs).
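The local-entropy quantity that drives EdMC can be made concrete by brute force on a tiny instance. The sketch below enumerates the solutions of a toy teacher-student binary perceptron CSP and computes, for each solution, the log-number of solutions within a small Hamming ball — a hedged illustration of the measure itself (EdMC replaces this enumeration with Monte Carlo estimates); all sizes are illustrative.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(4)
N, P = 11, 6  # weights (odd, so no zero margins), random patterns

# teacher-generated labels guarantee the CSP has at least one solution
X = rng.choice([-1, 1], size=(P, N))
w_star = rng.choice([-1, 1], size=N)
y = np.sign(X @ w_star)

# enumerate all 2^N binary weight vectors and keep the CSP solutions
W = np.array(list(product([-1, 1], repeat=N)))
sols = W[(np.sign(W @ X.T) == y).all(axis=1)]

def local_entropy(w, d):
    """Log of the number of solutions within Hamming distance d of w."""
    dist = (sols != w).sum(axis=1)
    return np.log((dist <= d).sum())

# solutions are not all alike: some sit in dense regions, others are more isolated
ent = np.array([local_entropy(w, 2) for w in sols])
print(ent.min(), ent.max())
```

Sampling configurations weighted by this local entropy, rather than by energy alone, is what biases the search toward the dense, flat regions of solutions discussed in the papers above.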

Subdominant Dense Clusters Allow for Simple Learning and High Computational Performance in Neural Networks with Discrete Synapses

no code implementations18 Sep 2015 Carlo Baldassi, Alessandro Ingrosso, Carlo Lucibello, Luca Saglietti, Riccardo Zecchina

We also show that the dense regions are surprisingly accessible by simple learning protocols, and that these synaptic configurations are robust to perturbations and generalize better than typical solutions.
