1 code implementation • ICML 2020 • Zahra Monfared, Daniel Durstewitz
On the other hand, mathematical analysis of dynamical systems inferred from data is often more convenient, and yields additional insights, when the systems are formulated in continuous time, i.e. as systems of ordinary (or partial) differential equations (ODEs).
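To illustrate the kind of analysis a continuous-time formulation enables, here is a minimal sketch: for an ODE dx/dt = f(x), the stability of a fixed point x* can be read off directly from the eigenvalues of the Jacobian Df(x*). The system below (a damped pendulum) and all names are illustrative, not from the paper.

```python
import numpy as np

def f(x):
    # damped pendulum: x = (angle, angular velocity)
    return np.array([x[1], -np.sin(x[0]) - 0.5 * x[1]])

def jacobian(f, x, eps=1e-6):
    """Central finite-difference Jacobian of f at x."""
    n = x.size
    J = np.zeros((n, n))
    for j in range(n):
        dx = np.zeros(n)
        dx[j] = eps
        J[:, j] = (f(x + dx) - f(x - dx)) / (2 * eps)
    return J

x_star = np.zeros(2)                      # fixed point: f(0) = 0
eigvals = np.linalg.eigvals(jacobian(f, x_star))
print(np.all(eigvals.real < 0))           # all real parts negative: stable
```

Because all eigenvalues have negative real part, the fixed point is asymptotically stable — a conclusion obtained analytically, without simulating trajectories.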
no code implementations • 28 Feb 2024 • Niclas Göring, Florian Hess, Manuel Brenner, Zahra Monfared, Daniel Durstewitz
We explain why and how out-of-domain (OOD) generalization (OODG) in DSR profoundly differs from OODG considered elsewhere in machine learning.
1 code implementation • 7 Jun 2023 • Florian Hess, Zahra Monfared, Manuel Brenner, Daniel Durstewitz
Here we report that a surprisingly simple modification of teacher forcing leads to provably strictly all-time bounded gradients in training on chaotic systems. When paired with a simple architectural rearrangement of a tractable RNN design, piecewise-linear RNNs (PLRNNs), this allows for faithful reconstruction in spaces of at most the dimensionality of the observed system.
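As a rough sketch of the ingredients named above: a PLRNN latent step combines a diagonal linear part with an off-diagonal ReLU part, and (sparse) teacher forcing periodically resets the read-out latent states to observed values, which keeps orbits from diverging on chaotic data. All function and variable names here are illustrative, and this is a simplified caricature of the method, not the paper's implementation.

```python
import numpy as np

def plrnn_step(z, A, W, h):
    """One PLRNN latent step: z_{t+1} = A z_t + W relu(z_t) + h,
    with diagonal A and off-diagonal W (standard PLRNN form)."""
    return A @ z + W @ np.maximum(z, 0.0) + h

def forced_rollout(z0, A, W, h, targets, tau):
    """Roll out the PLRNN; every tau steps, reset the first latent
    coordinates to the observed targets (sparse teacher forcing)."""
    z = z0.copy()
    traj = [z.copy()]
    for t in range(1, len(targets)):
        z = plrnn_step(z, A, W, h)
        if t % tau == 0:                       # forcing interval (hyperparameter)
            z[: targets.shape[1]] = targets[t] # clamp read-out states
        traj.append(z.copy())
    return np.array(traj)

rng = np.random.default_rng(0)
d = 4
A = np.diag(rng.uniform(0.3, 0.9, d))                    # stable diagonal part
W = 0.5 * (1 - np.eye(d)) * rng.standard_normal((d, d))  # off-diagonal coupling
h = rng.standard_normal(d)
targets = rng.standard_normal((20, 2))                   # observed 2-d time series
traj = forced_rollout(rng.standard_normal(d), A, W, h, targets, tau=5)
print(traj.shape)  # (20, 4)
```

The forcing interval tau trades off gradient control against letting the model's own dynamics unfold; the paper's contribution is a principled modification of this scheme with provably bounded gradients.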
no code implementations • 15 Dec 2022 • Manuel Brenner, Florian Hess, Georgia Koppe, Daniel Durstewitz
Many, if not most, systems of interest in science are naturally described as nonlinear dynamical systems.
1 code implementation • 6 Jul 2022 • Manuel Brenner, Florian Hess, Jonas M. Mikhaeil, Leonard Bereska, Zahra Monfared, Po-Chen Kuo, Daniel Durstewitz
In many scientific disciplines, we are interested in inferring the nonlinear dynamical system underlying a set of observed time series, a challenging task in the face of chaotic behavior and noise.
1 code implementation • 4 Nov 2021 • Daniel Kramer, Philine Lou Bommer, Carlo Tombolini, Georgia Koppe, Daniel Durstewitz
Here we propose a general framework for multi-modal data integration for the purpose of nonlinear DS reconstruction and the analysis of cross-modal relations.
1 code implementation • 14 Oct 2021 • Jonas M. Mikhaeil, Zahra Monfared, Daniel Durstewitz
Here we offer a comprehensive theoretical treatment of this problem by relating the loss gradients during RNN training to the Lyapunov spectrum of RNN-generated orbits.
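The link between gradients and the Lyapunov spectrum can be made concrete numerically: the largest Lyapunov exponent of an orbit is the average log growth rate of a tangent vector propagated by the per-step Jacobians — the same Jacobian products that appear in backpropagated gradients. The sketch below estimates it for a toy ReLU RNN; the names and the toy weight matrices are illustrative, not from the paper.

```python
import numpy as np

def max_lyapunov(Wrec, z0, n_steps=500):
    """Estimate the largest Lyapunov exponent of the orbit of
    z_{t+1} = Wrec @ relu(z_t): propagate a tangent vector with the
    step Jacobian, renormalize, and average the log growth."""
    z = z0.copy()
    v = np.ones_like(z) / np.sqrt(z.size)          # tangent (perturbation) vector
    log_growth = 0.0
    for _ in range(n_steps):
        J = Wrec * (z > 0).astype(float)[None, :]  # Jacobian: ReLU gates mask columns
        v = J @ v
        norm = np.linalg.norm(v)
        log_growth += np.log(norm)
        v /= norm
        z = Wrec @ np.maximum(z, 0.0)
    return log_growth / n_steps

z0 = np.ones(3)
lam_contracting = max_lyapunov(0.5 * np.eye(3), z0)  # orbits contract
lam_expanding = max_lyapunov(2.0 * np.eye(3), z0)    # orbits expand
print(lam_contracting < 0 < lam_expanding)  # True
```

A negative largest exponent corresponds to vanishing gradients, a positive one (as in chaotic systems) to exploding gradients — which is the tension the paper analyzes.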
no code implementations • 29 Sep 2021 • Manuel Brenner, Leonard Bereska, Jonas Magdy Mikhaeil, Florian Hess, Zahra Monfared, Po-Chen Kuo, Daniel Durstewitz
In many scientific disciplines, we are interested in inferring the nonlinear dynamical system underlying a set of observed time series, a challenging task in the face of chaotic behavior and noise.
no code implementations • ICLR 2021 • Dominik Schmidt, Georgia Koppe, Zahra Monfared, Max Beutelspacher, Daniel Durstewitz
A main theoretical interest in biology and physics is to identify the nonlinear dynamical system (DS) that generated observed time series.
no code implementations • 25 Sep 2019 • Dominik Schmidt, Georgia Koppe, Max Beutelspacher, Daniel Durstewitz
Vanilla RNNs with ReLU activation have a simple structure that is amenable to systematic dynamical systems analysis and interpretation, but they suffer from the exploding-vs-vanishing-gradients problem.
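The exploding-vs-vanishing-gradients problem can be seen directly by chaining per-step Jacobians backwards through time, as in backpropagation through time (BPTT). The following is a minimal sketch for a toy ReLU RNN with illustrative names and weight matrices, not code from the paper.

```python
import numpy as np

def bptt_grad_norm(Wrec, z0, T):
    """Norm of dL/dz0 for L = sum(z_T), obtained by chaining the
    per-step Jacobians of z_{t+1} = Wrec @ relu(z_t) backwards in time."""
    # forward pass: store states to know which ReLU units were active
    states = [z0]
    for _ in range(T):
        states.append(Wrec @ np.maximum(states[-1], 0.0))
    # backward pass: g_t = J_t^T g_{t+1}, with J_t = Wrec * 1[z_t > 0]
    g = np.ones_like(z0)                           # dL/dz_T for L = sum(z_T)
    for z in reversed(states[:-1]):
        J = Wrec * (z > 0).astype(float)[None, :]
        g = J.T @ g
    return np.linalg.norm(g)

z0 = np.ones(3)
shrinking = [bptt_grad_norm(0.5 * np.eye(3), z0, T) for T in (1, 10, 20)]
growing = [bptt_grad_norm(2.0 * np.eye(3), z0, T) for T in (1, 10, 20)]
print(shrinking[-1] < 1e-3, growing[-1] > 1e3)  # True True: vanishing vs exploding
```

With contractive recurrent weights the gradient norm decays exponentially in the horizon T (vanishing), while expansive weights make it blow up exponentially (exploding) — the problem the entry above refers to.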
no code implementations • 19 Feb 2019 • Georgia Koppe, Hazem Toutounji, Peter Kirsch, Stefanie Lis, Daniel Durstewitz
A major tenet in theoretical neuroscience is that cognitive and behavioral processes are ultimately implemented in terms of the neural system dynamics.
1 code implementation • ICLR 2019 • Elke Kirschbaum, Manuel Haußmann, Steffen Wolf, Hannah Jakobi, Justus Schneider, Shehabeldin Elzoheiry, Oliver Kann, Daniel Durstewitz, Fred A. Hamprecht
Neuronal assemblies, loosely defined as subsets of neurons with reoccurring spatio-temporally coordinated activation patterns, or "motifs", are thought to be building blocks of neural representations and information processing.
1 code implementation • NeurIPS 2017 • Sven Peter, Elke Kirschbaum, Martin Both, Lee Campbell, Brandon Harvey, Conor Heins, Daniel Durstewitz, Ferran Diego, Fred A. Hamprecht
Cell assemblies, originally proposed by Donald Hebb (1949), are subsets of neurons firing in a temporally coordinated way, giving rise to repeated motifs thought to underlie neural representations and information processing.
no code implementations • 23 Dec 2016 • Daniel Durstewitz
In summary, the present work advances a semi-analytical (and thus reasonably fast) maximum-likelihood estimation framework for PLRNNs that may make it possible to recover the relevant dynamics underlying observed neuronal time series and to link them directly to computational properties.
no code implementations • 4 Jul 2016 • Eleonora Russo, Daniel Durstewitz
Hebb's idea of a cell assembly as the fundamental unit of neural information processing has dominated neuroscience like no other theoretical concept within the past 60 years.