Correlation-invariant synaptic plasticity

21 May 2021  ·  Carlos S. N. Brito, Wulfram Gerstner

Cortical neurons develop receptive fields adapted to the statistics of the environment. Synaptic plasticity models reproduce some of these response properties, but so far require unrealistic assumptions about the statistics of the incoming sensory signals, such as decorrelated inputs with identical firing rates. Here we develop a theory for synaptic plasticity that is invariant to second-order correlations in the input. Going beyond classical Hebbian learning, we show that metaplastic long-term depression cancels the sensitivity to second-order correlation, bringing out selectivity to higher-order statistics. In contrast, alternative stabilization mechanisms, such as heterosynaptic depression, increase the sensitivity to input correlations. Our simulations demonstrate how correlation-invariant plasticity models can learn latent patterns despite perturbations in input statistics without the need for whitening. The theory advances our understanding of local unsupervised learning in cortical circuits and assigns a precise functional role to synaptic depression mechanisms in pyramidal neurons.
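The abstract's starting point — that classical Hebbian learning is dominated by second-order input correlations — can be illustrated with a minimal simulation. The sketch below is not the paper's metaplastic rule; it uses Oja's rule (a standard stabilized Hebbian variant) on correlated Gaussian inputs to show that the learned weight vector converges to the leading eigenvector of the input covariance matrix, so any change in input correlations reshapes the receptive field. The covariance matrix, learning rate, and iteration count are illustrative choices.

```python
import numpy as np

# Sketch: classical Hebbian plasticity is driven by second-order input
# correlations. With Oja's rule, the weight vector converges to the
# leading eigenvector of the input covariance matrix C, regardless of
# any higher-order structure in the signal.

rng = np.random.default_rng(0)

# Correlated Gaussian inputs with covariance C (illustrative values).
C = np.array([[1.0, 0.8],
              [0.8, 1.0]])
L = np.linalg.cholesky(C)  # x = L @ z has covariance C for z ~ N(0, I)

w = rng.normal(size=2)
w /= np.linalg.norm(w)
eta = 0.01  # learning rate

for _ in range(20000):
    x = L @ rng.normal(size=2)   # presynaptic input sample
    y = w @ x                    # postsynaptic activity (linear neuron)
    w += eta * y * (x - y * w)   # Oja's rule: Hebbian term + weight decay

# The learned weight aligns with the top eigenvector of C.
eigvals, eigvecs = np.linalg.eigh(C)
top = eigvecs[:, -1]
alignment = abs(w @ top) / np.linalg.norm(w)
print(alignment)
```

Because the fixed point is set entirely by the covariance spectrum, decorrelating or rescaling the inputs changes what such a rule learns — which is the dependence the paper's correlation-invariant plasticity is designed to cancel.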
