no code implementations • 14 Nov 2023 • Leo Kozachkov, Jean-Jacques Slotine, Dmitry Krotov
Multi-neuron synapses, which couple more than two neurons at once, are ubiquitous in models of Dense Associative Memory (also known as Modern Hopfield Networks) and are known to lead to superlinear memory storage capacity, which is a desirable computational feature.
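The superlinear-capacity claim can be made concrete with the standard Dense Associative Memory energy function (a sketch of the textbook formulation, not taken from this abstract; the symbols $\xi^\mu$, $\sigma$, and $F$ follow the usual conventions):

```latex
% Energy over binary state \sigma \in \{-1,+1\}^N with K stored patterns \xi^\mu:
E(\sigma) = -\sum_{\mu=1}^{K} F\!\left(\xi^{\mu} \cdot \sigma\right)
% With a polynomial interaction F(x) = x^n (an n-neuron synapse), the number of
% storable patterns scales superlinearly in the number of neurons N:
K_{\max} \sim \alpha_n \, N^{\,n-1}
% The classical Hopfield network is recovered at n = 2, where capacity is linear in N.
```

The exponent in the capacity law is what the snippet means by "superlinear": each unit increase in the synaptic order $n$ raises the storage scaling by a power of $N$.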
no code implementations • 2 Oct 2023 • Michaela Ennis, Leo Kozachkov, Jean-Jacques Slotine
To push forward the important emerging research field surrounding multi-area recurrent neural networks (RNNs), we expand theoretically and empirically on the provably stable RNNs of RNNs introduced by Kozachkov et al. in "RNNs of RNNs: Recursive Construction of Stable Assemblies of Recurrent Neural Networks".
1 code implementation • NeurIPS 2023 • Mitchell Ostrow, Adam Eisen, Leo Kozachkov, Ila Fiete
To bridge this gap, we introduce Dynamical Similarity Analysis (DSA), a novel similarity metric that compares two systems at the level of their dynamics.
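The core idea of comparing systems "at the level of their dynamics" can be illustrated with a deliberately simplified sketch: fit a linear dynamics matrix to each system's trajectory, then compare the fitted matrices through a quantity that is invariant to an orthogonal change of basis. This is a crude proxy for the full DSA metric, not the authors' implementation; the function names and the eigenvalue-based comparison are assumptions made for illustration.

```python
import numpy as np

def fit_linear_dynamics(X):
    """Least-squares fit of A in x_{t+1} ~ A x_t from a trajectory X of shape (T, n)."""
    X0, X1 = X[:-1], X[1:]
    M, *_ = np.linalg.lstsq(X0, X1, rcond=None)  # solves X0 @ M = X1
    return M.T  # transpose so that x_{t+1} = A @ x_t

def spectral_distance(A1, A2):
    """Distance between eigenvalue spectra; spectra are unchanged by an
    orthogonal change of coordinates A -> Q A Q^T, so two systems that are the
    same dynamics in rotated coordinates score (near) zero."""
    e1 = np.sort_complex(np.linalg.eigvals(A1))
    e2 = np.sort_complex(np.linalg.eigvals(A2))
    return np.linalg.norm(e1 - e2)
```

For example, a 2-D rotation observed in two different orthonormal coordinate frames yields fitted matrices with identical spectra, so `spectral_distance` returns essentially zero even though the raw trajectories look different, which is the kind of invariance a dynamics-level comparison is after.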
no code implementations • 17 Jan 2022 • Leo Kozachkov, Patrick M. Wensing, Jean-Jacques Slotine
We prove that Riemannian contraction in a supervised learning setting implies generalization.
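For context, the standard contraction condition from the Lohmiller-Slotine framework (under which results of this kind are typically stated; the metric $M$ and rate $\lambda$ below follow that standard formulation, not this abstract) is:

```latex
% A system \dot{x} = f(x, t) is contracting with rate \lambda > 0 in a
% (Riemannian) metric M(x, t) \succ 0 if, uniformly,
\left(\frac{\partial f}{\partial x}\right)^{\!T} M + M \,\frac{\partial f}{\partial x} + \dot{M} \;\preceq\; -2\lambda M
% Then any two trajectories converge to each other exponentially:
\|\delta x(t)\|_{M} \;\leq\; e^{-\lambda t}\,\|\delta x(0)\|_{M}
```

Intuitively, exponential forgetting of initial conditions is the property that the paper connects to generalization in the supervised learning setting.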
1 code implementation • 16 Jun 2021 • Leo Kozachkov, Michaela Ennis, Jean-Jacques Slotine
Recurrent neural networks (RNNs) are widely used throughout neuroscience as models of local neural activity.
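The RNNs in question are typically continuous-time rate models. A minimal sketch, assuming the standard firing-rate equation $\tau \dot{x} = -x + W\phi(x) + u$ with $\phi = \tanh$ (a common convention, not a detail quoted from this abstract):

```python
import numpy as np

def simulate_rnn(W, x0, u, dt=0.01, steps=1000, tau=1.0):
    """Euler integration of the rate model tau * dx/dt = -x + W @ tanh(x) + u."""
    x = np.array(x0, dtype=float)
    traj = [x.copy()]
    for _ in range(steps):
        x = x + (dt / tau) * (-x + W @ np.tanh(x) + u)
        traj.append(x.copy())
    return np.array(traj)
```

When the spectral norm of `W` is below 1, this network is contracting (tanh is 1-Lipschitz), so trajectories started from different initial conditions converge to each other, which is the provable-stability property that the "RNNs of RNNs" construction preserves when such modules are interconnected.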
no code implementations • 13 Feb 2017 • Leo Kozachkov, Konstantinos P. Michmizos
Identifying the origin of slow and infra-slow oscillations could reveal or explain mechanisms of brain function in health and disease.