Search Results for author: Alessandro Ingrosso

Found 11 papers, 1 paper with code

Machine learning at the mesoscale: a computation-dissipation bottleneck

no code implementations • 5 Jul 2023 • Alessandro Ingrosso, Emanuele Panizon

The cost of information processing in physical systems calls for a trade-off between performance and energetic expenditure.

Neural networks trained with SGD learn distributions of increasing complexity

1 code implementation • 21 Nov 2022 • Maria Refinetti, Alessandro Ingrosso, Sebastian Goldt

The ability of deep neural networks to generalise well even when they interpolate their training data has been explained using various "simplicity biases".
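
The "distributional simplicity bias" this paper documents can be probed with a quick experiment: train one classifier with SGD on the real data and another on a "Gaussian clone" that matches only the class means and covariances, then compare both on the real distribution. The sketch below is an illustrative toy under assumed choices (Laplace-distributed classes, logistic regression trained by online SGD), not the paper's code.

```python
# A minimal sketch of a "Gaussian clone" comparison; all sizes, distributions
# and hyperparameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
d, n = 20, 2000

def make_class(sign):
    # Non-Gaussian class: Laplace noise around a signed mean shift.
    return rng.laplace(size=(n, d)) + 0.5 * sign

X = np.vstack([make_class(+1), make_class(-1)])
y = np.hstack([np.ones(n), -np.ones(n)])

def gaussian_clone(X, y):
    # Resample each class from a Gaussian with matched mean and covariance,
    # discarding all higher-order statistics of the original data.
    Xc = np.empty_like(X)
    for label in (+1, -1):
        idx = np.where(y == label)[0]
        Xc[idx] = rng.multivariate_normal(X[idx].mean(0), np.cov(X[idx].T),
                                          size=len(idx))
    return Xc

def sgd_logreg(X, y, steps=5000, lr=0.1):
    # Online SGD on the logistic loss.
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        i = rng.integers(len(y))
        m = np.clip(y[i] * X[i] @ w, -30, 30)   # clip for numerical safety
        w += lr * y[i] * X[i] / (1.0 + np.exp(m))
    return w

w_real = sgd_logreg(X, y)
w_clone = sgd_logreg(gaussian_clone(X, y), y)
acc = lambda w: np.mean(np.sign(X @ w) == y)
print(f"real-trained: {acc(w_real):.3f}, clone-trained: {acc(w_clone):.3f}")
```

If lower-order statistics dominate what SGD learns first, the two accuracies should track each other closely early in training.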

Data-driven emergence of convolutional structure in neural networks

no code implementations • 1 Feb 2022 • Alessandro Ingrosso, Sebastian Goldt

Here, we show how initially fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs, resulting in localised, space-tiling receptive fields.

Tensor Decomposition • Translation
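
A rough way to see this effect numerically: train a small fully-connected network to discriminate 1D translation-invariant inputs with two different correlation lengths, where a Gaussian field is passed through a pointwise nonlinearity so the inputs carry higher-order correlations, then score each hidden unit's localisation with an inverse participation ratio. The sketch below is a toy under these assumptions, not the paper's setup.

```python
# Minimal sketch: localised receptive fields emerging in a fully-connected layer.
import numpy as np

rng = np.random.default_rng(1)
D, K = 64, 8                                  # input size, hidden units

def sample(xi):
    # Translation-invariant input: circular convolution of white noise with a
    # Gaussian filter of width xi, then a sign nonlinearity (the pointwise
    # nonlinearity is what injects higher-order correlations).
    i = np.arange(D)
    g = np.exp(-0.5 * (np.minimum(i, D - i) / xi) ** 2)
    z = np.fft.ifft(np.fft.fft(g) * np.fft.fft(rng.standard_normal(D))).real
    return np.sign(z)

W = rng.standard_normal((K, D)) / np.sqrt(D)  # first-layer weights
a = rng.standard_normal(K) / np.sqrt(K)       # readout weights
lr = 0.05

for step in range(20000):
    # Class +1: short correlation length; class -1: long correlation length.
    x, y = (sample(2.0), 1.0) if rng.random() < 0.5 else (sample(8.0), -1.0)
    h = np.tanh(W @ x)
    err = a @ h - y                           # squared-loss residual
    W -= lr * np.outer(err * a * (1.0 - h**2), x)
    a -= lr * err * h

# Inverse participation ratio per row: ~1/D if delocalised, O(1) if the
# receptive field has collapsed onto a few neighbouring inputs.
ipr = (W**4).sum(1) / (W**2).sum(1) ** 2
print("IPR per hidden unit:", np.round(ipr, 3))
```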

Input correlations impede suppression of chaos and learning in balanced rate networks

no code implementations • 24 Jan 2022 • Rainer Engelken, Alessandro Ingrosso, Ramin Khajeh, Sven Goedeke, L. F. Abbott

To study this phenomenon, we develop a non-stationary dynamic mean-field theory that determines how the activity statistics and largest Lyapunov exponent depend on the frequency and amplitude of the input, the recurrent coupling strength, and the network size, for both common and independent input.
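
The mean-field predictions can be checked against direct simulation. Below is a generic sketch, not the paper's theory: a random rate network driven by a common sinusoidal input, with the largest Lyapunov exponent estimated by the standard two-trajectory (Benettin) renormalisation method; network size, gain, and drive parameters are arbitrary choices.

```python
# Sketch: largest Lyapunov exponent of a sinusoidally driven rate network,
# estimated by the two-trajectory (Benettin) method. Parameters are assumptions.
import numpy as np

rng = np.random.default_rng(2)
N, g, dt, T = 400, 2.0, 0.01, 200.0           # size, gain, step, horizon
J = g * rng.standard_normal((N, N)) / np.sqrt(N)
amp, freq = 1.0, 0.5                          # input amplitude and frequency
pattern = rng.standard_normal(N)              # fixed spatial pattern (common drive)

def drift(x, t):
    return -x + J @ np.tanh(x) + amp * np.sin(2 * np.pi * freq * t) * pattern

x = rng.standard_normal(N)
x_pert = x + 1e-8 * rng.standard_normal(N)
d0 = np.linalg.norm(x_pert - x)
log_growth = 0.0

for k in range(int(T / dt)):
    t = k * dt
    x = x + dt * drift(x, t)                  # Euler step, reference trajectory
    x_pert = x_pert + dt * drift(x_pert, t)   # Euler step, perturbed copy
    d = np.linalg.norm(x_pert - x)
    log_growth += np.log(d / d0)
    x_pert = x + (d0 / d) * (x_pert - x)      # renormalise the perturbation

print(f"largest Lyapunov exponent ≈ {log_growth / T:.3f} (>0 indicates chaos)")
```

Sweeping amp and freq in such a simulation is the empirical counterpart of the dependence the mean-field theory derives analytically.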

Optimal Learning with Excitatory and Inhibitory synapses

no code implementations • 25 May 2020 • Alessandro Ingrosso

Characterizing the relation between weight structure and input/output statistics is fundamental for understanding the computational capabilities of neural circuits.

Training dynamically balanced excitatory-inhibitory networks

no code implementations • 29 Dec 2018 • Alessandro Ingrosso, L. F. Abbott

The construction of biologically plausible models of neural circuits is crucial for understanding the computational properties of the nervous system.

Unreasonable Effectiveness of Learning Neural Networks: From Accessible States and Robust Ensembles to Basic Algorithmic Schemes

no code implementations • 20 May 2016 • Carlo Baldassi, Christian Borgs, Jennifer Chayes, Alessandro Ingrosso, Carlo Lucibello, Luca Saglietti, Riccardo Zecchina

We define a novel measure, the "robust ensemble" (RE), which suppresses trapping by isolated configurations and amplifies the role of dense regions of solutions.
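
One practical reading of the RE is an algorithm: anneal several coupled replicas of the system so that each feels both its own energy and an attraction toward the replica consensus, biasing the dynamics toward dense, robust regions. The sketch below is a heavily simplified illustration on a binary perceptron; the coupling form, schedule, and sizes are assumptions, not the paper's implementation.

```python
# Sketch: replicated Metropolis annealing on a binary perceptron.
import numpy as np

rng = np.random.default_rng(3)
N, P, n_rep = 101, 60, 5                  # weights, patterns, replicas
X = rng.choice([-1, 1], size=(P, N))      # random binary patterns
labels = rng.choice([-1, 1], size=P)

def errors(w):
    # Training error of a perceptron with +/-1 weights.
    return int(np.sum(np.sign(X @ w) != labels))

W = rng.choice([-1, 1], size=(n_rep, N))  # one configuration per replica
beta, gamma = 0.5, 0.1                    # inverse temperature, coupling

for step in range(2000):
    center = np.sign(W.sum(0) + 0.5)      # replica consensus (+0.5 breaks ties)
    for r in range(n_rep):
        i = rng.integers(N)
        w_new = W[r].copy()
        w_new[i] = -w_new[i]              # propose a single-weight flip
        # Energy change: own training error plus attraction to the consensus.
        dH = (errors(w_new) - errors(W[r])
              - gamma * (w_new[i] - W[r, i]) * center[i])
        if dH <= 0 or rng.random() < np.exp(-beta * dH):
            W[r] = w_new
    beta *= 1.002                         # anneal the temperature...
    gamma *= 1.002                        # ...and progressively tighten coupling

print("training errors per replica:", [errors(w) for w in W])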

Discovering Neuronal Cell Types and Their Gene Expression Profiles Using a Spatial Point Process Mixture Model

no code implementations • 4 Feb 2016 • Furong Huang, Animashree Anandkumar, Christian Borgs, Jennifer Chayes, Ernest Fraenkel, Michael Hawrylycz, Ed Lein, Alessandro Ingrosso, Srinivas Turaga

Single-cell RNA sequencing can now be used to measure the gene expression profiles of individual neurons and to categorize them on that basis.

Local entropy as a measure for sampling solutions in Constraint Satisfaction Problems

no code implementations • 18 Nov 2015 • Carlo Baldassi, Alessandro Ingrosso, Carlo Lucibello, Luca Saglietti, Riccardo Zecchina

We introduce a novel Entropy-driven Monte Carlo (EdMC) strategy to efficiently sample solutions of random Constraint Satisfaction Problems (CSPs).
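
The core of EdMC is to run Metropolis on the local (free) entropy of a configuration, roughly S_gamma(x) = log sum_s exp(-gamma * d(x, s)) over solutions s, rather than on the energy itself, so the walk is drawn toward regions dense in solutions. The paper estimates this quantity with Belief Propagation; the brute-force sketch below computes it by exhaustive enumeration on a toy random 3-SAT instance, which is only feasible at tiny sizes and is illustrative, not the paper's algorithm.

```python
# Sketch: local-entropy-driven Metropolis on a tiny random 3-SAT instance.
import numpy as np
from itertools import product

rng = np.random.default_rng(4)
N, M = 16, 55                             # variables, clauses
clauses = [(rng.choice(N, 3, replace=False), rng.choice([1, -1], 3))
           for _ in range(M)]

def satisfied(x):
    return all(any(x[v] == s for v, s in zip(vs, ss)) for vs, ss in clauses)

# Enumerate all solutions once (2^N assignments; tiny N only). Assumes the
# random instance is satisfiable -- re-seed if no solutions are found.
solutions = np.array([x for x in product((1, -1), repeat=N) if satisfied(x)])
print(f"{len(solutions)} solutions found")

gamma, beta = 0.8, 2.0                    # distance coupling, inverse temperature

def local_entropy(x):
    d = np.sum(solutions != x, axis=1)    # Hamming distances to all solutions
    return np.log(np.exp(-gamma * d).sum())

x = rng.choice([1, -1], size=N)
S = local_entropy(x)
for step in range(2000):
    i = rng.integers(N)
    x[i] = -x[i]                          # propose a single-variable flip
    S_new = local_entropy(x)
    if S_new >= S or rng.random() < np.exp(beta * (S_new - S)):
        S = S_new                         # accept: higher local entropy
    else:
        x[i] = -x[i]                      # reject: undo the flip

print("violated clauses at the sampled point:",
      sum(not any(x[v] == s for v, s in zip(vs, ss)) for vs, ss in clauses))
```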

Subdominant Dense Clusters Allow for Simple Learning and High Computational Performance in Neural Networks with Discrete Synapses

no code implementations • 18 Sep 2015 • Carlo Baldassi, Alessandro Ingrosso, Carlo Lucibello, Luca Saglietti, Riccardo Zecchina

We also show that the dense regions are surprisingly accessible by simple learning protocols, and that these synaptic configurations are robust to perturbations and generalize better than typical solutions.
