no code implementations • 19 Feb 2024 • David G. Clark, Manuel Beiran
To characterize this interaction, we develop a dynamical mean-field theory to analyze such networks in the limit where each region contains infinitely many neurons, with cross-region currents as key order parameters.
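A finite-size sketch of the setup this abstract describes: two reciprocally coupled regions of rate neurons, with the cross-region input current tracked as a finite-N stand-in for the order parameter. All sizes and coupling strengths (`N`, `g_rec`, `g_x`) are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200                  # neurons per region (finite stand-in for the infinite-N limit)
g_rec, g_x = 1.5, 0.5    # within- and cross-region coupling strengths (assumed)
dt, T = 0.05, 2000

# Random Gaussian couplings with variance ~ 1/N, within and across regions
J11 = g_rec * rng.standard_normal((N, N)) / np.sqrt(N)
J22 = g_rec * rng.standard_normal((N, N)) / np.sqrt(N)
J12 = g_x * rng.standard_normal((N, N)) / np.sqrt(N)
J21 = g_x * rng.standard_normal((N, N)) / np.sqrt(N)

x1 = rng.standard_normal(N)
x2 = rng.standard_normal(N)
for _ in range(T):
    r1, r2 = np.tanh(x1), np.tanh(x2)
    cross_12 = J12 @ r2   # current into region 1 from region 2
    cross_21 = J21 @ r1   # current into region 2 from region 1
    x1 += dt * (-x1 + J11 @ r1 + cross_12)
    x2 += dt * (-x2 + J22 @ r2 + cross_21)

# Mean-squared cross-region current: a finite-N proxy for the order parameter
print(np.mean(cross_12 ** 2))
```

The mean-field theory itself characterizes such currents self-consistently in the infinite-N limit; this simulation only measures them empirically at finite N.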
no code implementations • 17 Feb 2023 • David G. Clark, L. F. Abbott
In chaotic states with strong Hebbian plasticity, a stable fixed point of neuronal dynamics is destabilized by synaptic dynamics, allowing any neuronal state to be stored as a stable fixed point by halting the plasticity.
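The paper's plasticity rule is not reproduced here. As a minimal illustration of the underlying idea, that Hebbian weights can turn an arbitrary neuronal state into a fixed point once plasticity stops, here is the classic Hopfield outer-product construction:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 500
xi = np.sign(rng.standard_normal(N))   # an arbitrary neuronal state to store

# Hebbian outer-product weights (classic Hopfield rule, used here only as a
# stand-in for the paper's plasticity mechanism)
W = np.outer(xi, xi) / N
np.fill_diagonal(W, 0.0)

# With plasticity halted, the stored state is a fixed point of the dynamics:
# each component of W @ xi has the same sign as the corresponding xi entry
x = np.sign(W @ xi)
print(np.array_equal(x, xi))  # → True
```

This works because `(W @ xi)[i] = xi[i] * (N - 1) / N`, which shares the sign of `xi[i]`, so the update maps the stored state to itself.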
no code implementations • 25 Jul 2022 • David G. Clark, L. F. Abbott, Ashok Litwin-Kumar
Neural networks are high-dimensional nonlinear dynamical systems that process information through the coordinated activity of many connected units.
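A minimal sketch of such a system, assuming the standard random rate-network model `dx/dt = -x + J tanh(x)`; the gain `g` and network size are illustrative assumptions. Two nearby initial conditions are integrated to probe sensitivity to perturbations:

```python
import numpy as np

rng = np.random.default_rng(2)
N, g, dt = 300, 1.8, 0.05   # illustrative values; g > 1 yields rich dynamics
J = g * rng.standard_normal((N, N)) / np.sqrt(N)

def step(x):
    """One Euler step of the rate dynamics dx/dt = -x + J tanh(x)."""
    return x + dt * (-x + J @ np.tanh(x))

x = rng.standard_normal(N)
y = x + 1e-8 * rng.standard_normal(N)   # perturbed copy
for _ in range(1000):
    x, y = step(x), step(y)

# In a chaotic regime the tiny perturbation typically grows substantially
print(np.linalg.norm(x - y))
```

The saturating nonlinearity and leak keep activity bounded even when the dynamics are strongly nonlinear, which is what makes these networks useful models of coordinated neural computation.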
1 code implementation • NeurIPS 2021 • David G. Clark, L. F. Abbott, SueYeon Chung
We prove that these weight updates are matched in sign to the gradient, enabling accurate credit assignment.
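This is not the paper's learning rule. As a toy sketch of why sign agreement with the gradient suffices for descent, consider a hypothetical surrogate update on a quadratic loss: each component is the gradient rescaled by a random positive factor, so signs match but magnitudes do not.

```python
import numpy as np

rng = np.random.default_rng(3)
w = rng.standard_normal(10)
target = rng.standard_normal(10)

def loss(w):
    return 0.5 * np.sum((w - target) ** 2)

lr = 0.05
for _ in range(200):
    grad = w - target                  # exact gradient of the quadratic loss
    # Hypothetical surrogate update: random positive per-coordinate scaling,
    # so every component agrees in sign with the gradient but not in magnitude
    scale = rng.uniform(0.1, 2.0, size=w.shape)
    w -= lr * scale * grad

print(loss(w))  # close to zero: sign-matched steps still descend
```

Because each coordinate moves opposite its own gradient component, every step reduces the loss for a sufficiently small learning rate, which is the intuition behind sign-matched credit assignment.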
1 code implementation • 23 May 2019 • David G. Clark, Jesse A. Livezey, Kristofer E. Bouchard
Linear dimensionality reduction methods are commonly used to extract low-dimensional structure from high-dimensional data.
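For example, PCA, one standard linear method, can be sketched with NumPy on synthetic data that has planted low-dimensional structure (all sizes and the noise level are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)
# Synthetic data: 500 samples of 20-dimensional observations generated
# from a 2-dimensional latent signal plus small isotropic noise
latent = rng.standard_normal((500, 2))
mixing = rng.standard_normal((2, 20))
X = latent @ mixing + 0.1 * rng.standard_normal((500, 20))

# PCA via SVD of the centered data: project onto the top principal components
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:2].T          # low-dimensional representation

# The top two components capture nearly all the variance by construction
explained = (S[:2] ** 2).sum() / (S ** 2).sum()
print(Z.shape, round(explained, 3))
```

Methods like the one this entry develops go beyond PCA's variance criterion, but the project-onto-a-subspace structure shown here is common to linear dimensionality reduction generally.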
no code implementations • 22 May 2018 • David G. Clark, Jesse A. Livezey, Edward F. Chang, Kristofer E. Bouchard
Neuromorphic architectures achieve low-power operation by using many simple spiking neurons in lieu of traditional hardware.