no code implementations • 16 Feb 2024 • Tommaso Salvatori, Beren Millidge, Yuhang Song, Rafal Bogacz, Thomas Lukasiewicz
This problem can be easily solved by computing \emph{similarities} in an embedding space instead of the pixel space.
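A minimal sketch of the idea, not the paper's implementation: comparing two images by cosine similarity in pixel space versus in an embedding space. The encoder here is a hypothetical stand-in (a fixed random projection) for whatever learned feature extractor the model would actually use.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two flattened vectors."""
    a, b = a.ravel(), b.ravel()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)

# Hypothetical encoder: a fixed random projection from 28x28 pixel space
# into a 64-dimensional embedding space (stand-in for a learned network).
W = rng.standard_normal((64, 28 * 28)) / np.sqrt(28 * 28)

def embed(image):
    return W @ image.ravel()

x = rng.random((28, 28))                      # an image
y = x + 0.1 * rng.standard_normal((28, 28))   # a slightly perturbed copy

# Similarity computed in pixel space vs. in the embedding space.
sim_pixels = cosine_similarity(x, y)
sim_embed = cosine_similarity(embed(x), embed(y))
```

The point of the comparison is that the similarity measure is the same; only the space in which it is computed changes, and a learned embedding can make semantically related inputs close even when their pixels differ.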
1 code implementation • NeurIPS 2023 • Mufeng Tang, Helen Barron, Rafal Bogacz
Forming accurate memory of sequential stimuli is a fundamental function of biological agents.
no code implementations • 16 Nov 2022 • Tommaso Salvatori, Yuhang Song, Yordan Yordanov, Beren Millidge, Zhenghua Xu, Lei Sha, Cornelius Emde, Rafal Bogacz, Thomas Lukasiewicz
Predictive coding networks are neuroscience-inspired models with roots in both Bayesian statistics and neuroscience.
1 code implementation • 21 Jul 2022 • Beren Millidge, Yuhang Song, Tommaso Salvatori, Thomas Lukasiewicz, Rafal Bogacz
In this paper, we provide a comprehensive theoretical analysis of the properties of PCNs trained with prospective configuration.
1 code implementation • 31 May 2022 • Beren Millidge, Yuhang Song, Tommaso Salvatori, Thomas Lukasiewicz, Rafal Bogacz
How the brain performs credit assignment is a fundamental unsolved problem in neuroscience.
no code implementations • 18 Feb 2022 • Beren Millidge, Tommaso Salvatori, Yuhang Song, Rafal Bogacz, Thomas Lukasiewicz
The backpropagation of error algorithm used to train deep neural networks has been fundamental to the successes of deep learning.
1 code implementation • 9 Feb 2022 • Beren Millidge, Tommaso Salvatori, Yuhang Song, Thomas Lukasiewicz, Rafal Bogacz
A large number of neural network models of associative memory have been proposed in the literature.
no code implementations • 31 Jan 2022 • Tommaso Salvatori, Luca Pinchetti, Beren Millidge, Yuhang Song, TianYi Bao, Rafal Bogacz, Thomas Lukasiewicz
Training with backpropagation (BP) in standard deep learning consists of two main steps: a forward pass that maps a data point to its prediction, and a backward pass that propagates the error of this prediction back through the network.
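The two steps described above can be sketched on a toy one-hidden-layer network with a squared-error loss. This is a generic illustration of backpropagation, not the predictive-coding alternative the paper proposes; all weights and shapes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 3)) * 0.1   # input -> hidden weights
W2 = rng.standard_normal((1, 4)) * 0.1   # hidden -> output weights

x = rng.standard_normal(3)               # a data point
t = np.array([1.0])                      # its target

# Forward pass: map the data point to its prediction.
h = np.tanh(W1 @ x)
y = W2 @ h
loss = 0.5 * np.sum((y - t) ** 2)

# Backward pass: propagate the prediction error back through the network.
dy = y - t                               # error at the output layer
dW2 = np.outer(dy, h)
dh = W2.T @ dy                           # error sent back to the hidden layer
dW1 = np.outer(dh * (1 - h ** 2), x)     # tanh'(z) = 1 - tanh(z)^2

# One gradient-descent step on both weight matrices.
lr = 0.1
W1 -= lr * dW1
W2 -= lr * dW2
```

Note that the backward pass needs the forward-pass activations (`h`, `y`) and the downstream weights (`W2`) to compute each update, which is exactly the non-local dependence that biologically plausible alternatives to BP seek to remove.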
no code implementations • NeurIPS 2021 • Tommaso Salvatori, Yuhang Song, Yujian Hong, Simon Frieder, Lei Sha, Zhenghua Xu, Rafal Bogacz, Thomas Lukasiewicz
We conclude by discussing the possible impact of this work on the neuroscience community, showing that our model provides a plausible framework for studying learning and retrieval of memories in the brain, as it closely mimics the behavior of the hippocampus as a memory index and generative model.
no code implementations • 7 Jul 2021 • Mayela Zamora, Sebastian Meller, Filip Kajin, James J Sermon, Robert Toth, Moaad Benjaber, Derk-Jan Dijk, Rafal Bogacz, Gregory A Worrell, Antonio Valentin, Benoit Duchet, Holger A Volk, Timothy Denison
Such is the case of circadian and infradian seizure patterns observed in epilepsy.
no code implementations • 8 Mar 2021 • Tommaso Salvatori, Yuhang Song, Thomas Lukasiewicz, Rafal Bogacz, Zhenghua Xu
Recent works prove that these methods can approximate BP up to a certain margin on multilayer perceptrons (MLPs), and asymptotically on any other complex model, and that zero-divergence inference learning (Z-IL), a variant of PC, is able to exactly implement BP on MLPs.
no code implementations • 5 Mar 2021 • Tommaso Salvatori, Yuhang Song, Thomas Lukasiewicz, Rafal Bogacz, Zhenghua Xu
Predictive coding networks (PCNs) are an influential model for information processing in the brain.
no code implementations • NeurIPS 2020 • Yuhang Song, Thomas Lukasiewicz, Zhenghua Xu, Rafal Bogacz
However, there are several gaps between BP and learning in biologically plausible neuronal networks of the brain (learning in the brain, or simply BL, for short). In particular: (1) it has been unclear to date whether BP can be implemented exactly via BL; (2) BP lacks local plasticity, i.e., weight updates require information that is not locally available, whereas BL utilizes only locally available information; and (3) BP lacks autonomy, i.e., some external control over the neural network is required (e.g., switching between prediction and learning stages requires changes to dynamics and synaptic plasticity rules), whereas BL works fully autonomously.