no code implementations • 24 Sep 2024 • Lucas Shoji, Kenta Suzuki, Leo Kozachkov
This paper shows that a wide class of effective learning rules -- those that improve a scalar performance measure over a given time window -- can be rewritten as natural gradient descent with respect to a suitably defined loss function and metric.
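For orientation, the natural gradient step preconditions the ordinary gradient with the inverse of a metric tensor: θ ← θ − η M⁻¹∇L(θ). Below is a minimal NumPy sketch on a toy quadratic loss, with an assumed constant metric M standing in for the paper's suitably defined loss and metric:

```python
import numpy as np

# Minimal sketch of natural gradient descent on a toy quadratic loss.
# The loss L and metric M here are illustrative stand-ins, not the
# paper's constructions.

def loss_grad(theta, A, b):
    """Gradient of L(theta) = 0.5 * theta^T A theta - b^T theta."""
    return A @ theta - b

A = np.array([[3.0, 0.5], [0.5, 1.0]])   # Hessian of the toy loss
b = np.array([1.0, -2.0])
M = np.array([[2.0, 0.0], [0.0, 0.5]])   # assumed (constant) metric tensor

theta = np.zeros(2)
eta = 0.1
for _ in range(200):
    g = loss_grad(theta, A, b)
    # Natural gradient step: precondition the gradient by M^{-1}
    theta -= eta * np.linalg.solve(M, g)

print(theta, np.linalg.solve(A, b))  # converges to the minimizer A^{-1} b
```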
no code implementations • 14 Nov 2023 • Leo Kozachkov, Jean-Jacques Slotine, Dmitry Krotov
In their known biological implementations, the ratio of stored memories to the number of neurons remains constant even as the network grows.
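For context, the classical Hopfield network stores at most K ≈ 0.14 N patterns in N neurons, whereas Dense Associative Memories achieve super-linear capacity by steepening the interaction function. A brief sketch of the two energies (standard results from the associative-memory literature, not this paper's construction):

```latex
% Classical Hopfield energy: capacity scales linearly, K \approx 0.14\,N
E_{\mathrm{Hopfield}}(\sigma) = -\tfrac{1}{2} \sum_{\mu=1}^{K} \left( \xi^{\mu} \cdot \sigma \right)^{2}

% Dense Associative Memory: a steeper interaction function F, e.g.
% F(x) = x^{n}, lifts the capacity to K \sim N^{\,n-1}
% (Krotov & Hopfield, 2016)
E_{\mathrm{DAM}}(\sigma) = -\sum_{\mu=1}^{K} F\!\left( \xi^{\mu} \cdot \sigma \right)
```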
no code implementations • 2 Oct 2023 • Michaela Ennis, Leo Kozachkov, Jean-Jacques Slotine
To advance the emerging research field of multi-area recurrent neural networks (RNNs), we expand theoretically and empirically on the provably stable RNNs of RNNs introduced by Kozachkov et al. in "RNNs of RNNs: Recursive Construction of Stable Assemblies of Recurrent Neural Networks".
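One flavor of the underlying combination property can be illustrated with a linear toy model (a sketch under simplifying assumptions, not the paper's general nonlinear result): subsystems whose symmetric parts are negative definite remain stable under negative-feedback coupling, because the coupling cancels out of the symmetric part of the combined dynamics matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

def stable_matrix(n, rng):
    """Random matrix whose symmetric part is negative definite,
    so x' = A x is contracting in the identity metric."""
    A = rng.standard_normal((n, n))
    shift = np.max(np.linalg.eigvalsh((A + A.T) / 2)) + 1.0
    return A - shift * np.eye(n)

n = 5
A1, A2 = stable_matrix(n, rng), stable_matrix(n, rng)
B = rng.standard_normal((n, n))          # arbitrary coupling strength

# Negative-feedback ("B, -B^T") interconnection: the symmetric part of
# the full matrix is block-diagonal, so stability of the parts carries
# over to the assembly.
J = np.block([[A1, B], [-B.T, A2]])

sym = (J + J.T) / 2
print(np.max(np.linalg.eigvalsh(sym)))  # < 0: combined system is stable
```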
1 code implementation • NeurIPS 2023 • Mitchell Ostrow, Adam Eisen, Leo Kozachkov, Ila Fiete
To bridge this gap, we introduce Dynamical Similarity Analysis (DSA), a novel similarity metric that compares two systems at the level of their dynamics.
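In spirit, DSA fits a linear model to each system's dynamics and compares the fitted dynamics matrices up to a change of basis. The sketch below captures that flavor with plain least squares and a crude spectral comparison; the actual method uses delay embeddings and a Procrustes-style distance over dynamics matrices.

```python
import numpy as np

def fit_linear_dynamics(X):
    """Least-squares fit of x_{t+1} ~ A x_t from a trajectory X (T x n)."""
    X0, X1 = X[:-1], X[1:]
    W, *_ = np.linalg.lstsq(X0, X1, rcond=None)
    return W.T

def spectral_distance(A1, A2):
    """Compare sorted eigenvalue spectra -- invariant to any change of
    basis; a crude stand-in for DSA's Procrustes distance."""
    e1 = np.sort_complex(np.linalg.eigvals(A1))
    e2 = np.sort_complex(np.linalg.eigvals(A2))
    return np.linalg.norm(e1 - e2)

# Two trajectories of the same rotation, observed in different coordinates
rng = np.random.default_rng(1)
theta = 0.1
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
Q = np.linalg.qr(rng.standard_normal((2, 2)))[0]   # random orthogonal basis

x = rng.standard_normal(2)
X = np.stack([np.linalg.matrix_power(A, t) @ x for t in range(100)])
Y = X @ Q.T                                        # same dynamics, new basis

print(spectral_distance(fit_linear_dynamics(X), fit_linear_dynamics(Y)))
# ~ 0: the systems are dynamically similar despite different geometry
```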
no code implementations • 17 Jan 2022 • Leo Kozachkov, Patrick M. Wensing, Jean-Jacques Slotine
We prove that Riemannian contraction in a supervised learning setting implies generalization.
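For reference, contraction here is the standard condition of Lohmiller and Slotine: a system ẋ = f(x, t) is contracting with rate λ > 0 in a Riemannian metric M(x, t) ≻ 0 when the generalized Jacobian inequality below holds (the paper's specific theorem relates this λ and metric to generalization; the statement below is the standard one, not the paper's result):

```latex
% Contraction condition in the metric M(x,t):
\left( \frac{\partial f}{\partial x} \right)^{\!\top} M
  + M \, \frac{\partial f}{\partial x} + \dot{M} \preceq -2\lambda M

% Consequence: any two trajectories converge exponentially,
% where d_M is the geodesic distance induced by M:
d_M\!\left( x_1(t), x_2(t) \right) \le e^{-\lambda t} \, d_M\!\left( x_1(0), x_2(0) \right)
```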
1 code implementation • 16 Jun 2021 • Leo Kozachkov, Michaela Ennis, Jean-Jacques Slotine
Recurrent neural networks (RNNs) are widely used throughout neuroscience as models of local neural activity.
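The model class at issue is typically the continuous-time firing-rate RNN; a standard form (details may differ from the paper's exact variant) is

```latex
% Firing-rate RNN: time constant \tau, recurrent weights W,
% nonlinearity \phi, external input u(t)
\tau \dot{x}_i = -x_i + \sum_{j} W_{ij} \, \phi(x_j) + u_i(t)
```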
no code implementations • 13 Feb 2017 • Leo Kozachkov, Konstantinos P. Michmizos
Identifying the origin of slow and infra-slow oscillations could shed light on brain mechanisms in health and disease.