Search Results for author: Daniel Durstewitz

Found 15 papers, 7 papers with code

Tractable Dendritic RNNs for Reconstructing Nonlinear Dynamical Systems

1 code implementation · 6 Jul 2022 · Manuel Brenner, Florian Hess, Jonas M. Mikhaeil, Leonard Bereska, Zahra Monfared, Po-Chen Kuo, Daniel Durstewitz

In many scientific disciplines, we are interested in inferring the nonlinear dynamical system underlying a set of observed time series, a challenging task in the face of chaotic behavior and noise.

Tasks: Time Series, Time Series Analysis +1
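The "dendritic" design referenced in the title augments each latent unit with several shifted ReLU branches. A minimal sketch of such an update, assuming a basis-expansion form z' = A z + W Σ_b α_b relu(z − θ_b); parameter names, shapes, and the exact form are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def dend_plrnn_step(z, A, W, alphas, thetas):
    """One update of a dendritic-PLRNN-style unit: the state is passed
    through B shifted ReLU branches, weighted by alphas, before mixing
    through W. Illustrative sketch, not the paper's model definition."""
    branches = np.maximum(z[None, :] - thetas, 0.0)   # (B, dim) shifted ReLUs
    phi = (alphas[:, None] * branches).sum(axis=0)    # (dim,) branch mixture
    return A @ z + W @ phi

rng = np.random.default_rng(2)
dim, B = 3, 4
A = 0.8 * np.eye(dim)                   # diagonal "self" dynamics
W = 0.1 * rng.standard_normal((dim, dim))
alphas = rng.standard_normal(B)         # branch weights (assumed names)
thetas = rng.standard_normal((B, dim))  # branch thresholds (assumed names)
z_next = dend_plrnn_step(np.zeros(dim), A, W, alphas, thetas)
```

With a single branch at threshold zero and unit weight, the update reduces to a plain ReLU step, so the extra branches strictly generalize the standard piecewise-linear form.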

Sparse convolutional coding for neuronal assembly detection

1 code implementation · NeurIPS 2017 · Sven Peter, Elke Kirschbaum, Martin Both, Lee Campbell, Brandon Harvey, Conor Heins, Daniel Durstewitz, Ferran Diego, Fred A. Hamprecht

Cell assemblies, originally proposed by Donald Hebb (1949), are subsets of neurons firing in a temporally coordinated way, giving rise to repeated motifs thought to underlie neural representations and information processing.

Generalized Teacher Forcing for Learning Chaotic Dynamics

1 code implementation · 7 Jun 2023 · Florian Hess, Zahra Monfared, Manuel Brenner, Daniel Durstewitz

Here we report that a surprisingly simple modification of teacher forcing leads to provably strictly all-time bounded gradients in training on chaotic systems, and, when paired with a simple architectural rearrangement of a tractable RNN design, piecewise-linear RNNs (PLRNNs), allows for faithful reconstruction in spaces of at most the dimensionality of the observed system.
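The modification can be illustrated as a linear interpolation between the network-generated state and the data state at each step. A minimal sketch, assuming a toy ReLU-based RNN and an interpolation weight alpha; the model and names are assumptions for illustration, not the paper's exact formulation:

```python
import numpy as np

def rnn_step(z, A, W, h):
    """One step of a toy piecewise-linear RNN: z' = A z + W relu(z) + h."""
    return A @ z + W @ np.maximum(z, 0.0) + h

def rollout_gtf(data, alpha, A, W, h):
    """Roll the RNN along a data sequence, replacing each generated state by
    an interpolation with the data state: z <- alpha*d + (1-alpha)*z.
    alpha=1 is classical teacher forcing; alpha=0 is free-running generation."""
    z = data[0]
    traj = [z]
    for d in data[1:]:
        z = rnn_step(z, A, W, h)
        z = alpha * d + (1.0 - alpha) * z
        traj.append(z)
    return np.stack(traj)

rng = np.random.default_rng(0)
dim = 3
A = 0.9 * np.eye(dim)
W = 0.1 * rng.standard_normal((dim, dim))
h = np.zeros(dim)
data = rng.standard_normal((20, dim))

forced = rollout_gtf(data, alpha=1.0, A=A, W=W, h=h)  # pinned to the data
free = rollout_gtf(data, alpha=0.0, A=A, W=W, h=h)    # ignores the data
```

Intermediate alpha values keep generated orbits from diverging far from the data during training, which is what tames gradient growth on chaotic systems.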

On the difficulty of learning chaotic dynamics with RNNs

1 code implementation · 14 Oct 2021 · Jonas M. Mikhaeil, Zahra Monfared, Daniel Durstewitz

Here we offer a comprehensive theoretical treatment of this problem by relating the loss gradients during RNN training to the Lyapunov spectrum of RNN-generated orbits.

Tasks: Time Series, Time Series Analysis
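The connection can be made concrete: BPTT gradients are products of per-step Jacobians, and the asymptotic growth rate of such products along an orbit is the maximal Lyapunov exponent, so a positive exponent implies exploding gradients. A minimal Benettin-style estimate for a ReLU-RNN orbit, as an illustrative sketch (not the paper's code):

```python
import numpy as np

def step(z, A, W, h):
    """ReLU-RNN map z' = A z + W relu(z) + h."""
    return A @ z + W @ np.maximum(z, 0.0) + h

def jacobian(z, A, W):
    """Jacobian of the map at z: A + W diag(z > 0)."""
    return A + W * (z > 0.0)   # scales column j of W by relu'(z_j)

def max_lyapunov(z0, A, W, h, n_steps=2000):
    """Estimate the maximal Lyapunov exponent: push a tangent vector through
    the step Jacobians, renormalizing each step and accumulating the log
    growth. Positive values mean exponential gradient growth in BPTT."""
    rng = np.random.default_rng(3)
    z, v = z0, rng.standard_normal(z0.shape)
    v /= np.linalg.norm(v)
    log_growth = 0.0
    for _ in range(n_steps):
        v = jacobian(z, A, W) @ v
        norm = np.linalg.norm(v)
        log_growth += np.log(norm)
        v /= norm
        z = step(z, A, W, h)
    return log_growth / n_steps

lam = max_lyapunov(np.ones(3), 0.5 * np.eye(3),
                   np.zeros((3, 3)), np.zeros(3), n_steps=500)
# for this purely linear contraction the estimate is log(0.5) < 0:
# no chaos, and gradients vanish rather than explode
```

For a linear system (W = 0, A = aI) the estimate matches the analytic exponent log a exactly, which is a convenient sanity check for the procedure.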

LeMoNADe: Learned Motif and Neuronal Assembly Detection in calcium imaging videos

1 code implementation · ICLR 2019 · Elke Kirschbaum, Manuel Haußmann, Steffen Wolf, Hannah Jakobi, Justus Schneider, Shehabeldin Elzoheiry, Oliver Kann, Daniel Durstewitz, Fred A. Hamprecht

Neuronal assemblies, loosely defined as subsets of neurons with reoccurring spatio-temporally coordinated activation patterns, or "motifs", are thought to be building blocks of neural representations and information processing.

Reconstructing Nonlinear Dynamical Systems from Multi-Modal Time Series

1 code implementation · 4 Nov 2021 · Daniel Kramer, Philine Lou Bommer, Carlo Tombolini, Georgia Koppe, Daniel Durstewitz

Here we propose a general framework for multi-modal data integration for the purpose of nonlinear DS reconstruction and the analysis of cross-modal relations.

Tasks: Data Integration, Time Series +2

Cell assemblies at multiple time scales with arbitrary lag constellations

no code implementations · 4 Jul 2016 · Eleonora Russo, Daniel Durstewitz

Hebb's idea of a cell assembly as the fundamental unit of neural information processing has dominated neuroscience like no other theoretical concept within the past 60 years.

A State Space Approach for Piecewise-Linear Recurrent Neural Networks for Reconstructing Nonlinear Dynamics from Neural Measurements

no code implementations · 23 Dec 2016 · Daniel Durstewitz

In summary, the present work advances a semi-analytical (and thus reasonably fast) maximum-likelihood estimation framework for PLRNNs that may enable recovery of the relevant dynamics underlying observed neuronal time series, and directly link them to computational properties.

Tasks: Time Series, Time Series Analysis
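In a state space model of this kind, latent PLRNN states z_t are linked to measurements through an observation equation, commonly x_t = B z_t + Gaussian noise, and the observation likelihood then factorizes over time. A minimal sketch of evaluating that log-likelihood under the linear-Gaussian choice (an assumption for illustration, not the paper's estimation code):

```python
import numpy as np

def gaussian_loglik(x, mean, var):
    """Log-density of vector x under an isotropic Gaussian with variance var."""
    d = x.size
    return -0.5 * (d * np.log(2.0 * np.pi * var) + np.sum((x - mean) ** 2) / var)

def observation_loglik(latents, observations, B, var):
    """Log-likelihood of observed series x_t given latent states z_t under
    the linear-Gaussian observation model x_t = B z_t + noise."""
    return sum(gaussian_loglik(x, B @ z, var)
               for z, x in zip(latents, observations))

rng = np.random.default_rng(4)
Bmat = rng.standard_normal((2, 3))
latents = [rng.standard_normal(3) for _ in range(5)]
observations = [Bmat @ z + 0.1 * rng.standard_normal(2) for z in latents]
ll = observation_loglik(latents, observations, Bmat, var=0.01)
```

Maximum-likelihood estimation alternates between inferring the latent states and updating the model parameters to raise exactly this kind of objective (plus the latent transition term).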

Identifying nonlinear dynamical systems via generative recurrent neural networks with applications to fMRI

no code implementations · 19 Feb 2019 · Georgia Koppe, Hazem Toutounji, Peter Kirsch, Stefanie Lis, Daniel Durstewitz

A major tenet in theoretical neuroscience is that cognitive and behavioral processes are ultimately implemented in terms of the neural system dynamics.

Tasks: Time Series, Time Series Analysis

Transformation of ReLU-based recurrent neural networks from discrete-time to continuous-time

1 code implementation · ICML 2020 · Zahra Monfared, Daniel Durstewitz

On the other hand, mathematical analysis of dynamical systems inferred from data is often more convenient, and enables additional insights, if these are formulated in continuous time, i.e. as systems of ordinary (or partial) differential equations (ODEs).

Tasks: BIG-bench Machine Learning, Numerical Integration +3
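One naive correspondence between the two formulations: a discrete map z_{t+1} = F(z_t) with step size Δt can be embedded as the ODE dz/dt = (F(z) − z)/Δt, which a single forward-Euler step of size Δt recovers exactly. This is only the crudest such embedding, sketched here for orientation; it is not the transformation derived in the paper, where handling the ReLU switching structure is precisely the difficulty:

```python
import numpy as np

def F(z, A, W, h):
    """Discrete-time ReLU-RNN map."""
    return A @ z + W @ np.maximum(z, 0.0) + h

def vector_field(z, A, W, h, dt):
    """Naive Euler embedding of the map: dz/dt = (F(z) - z) / dt."""
    return (F(z, A, W, h) - z) / dt

def euler_step(z, A, W, h, dt):
    """One forward-Euler step of the embedded ODE; recovers F by construction."""
    return z + dt * vector_field(z, A, W, h, dt)
```

Note that integrating this vector field with a smaller step or a better solver generally does not reproduce the discrete map's orbits, which is one reason a principled discrete-to-continuous transformation is nontrivial.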

Tractable Dendritic RNNs for Identifying Unknown Nonlinear Dynamical Systems

no code implementations · 29 Sep 2021 · Manuel Brenner, Leonard Bereska, Jonas Magdy Mikhaeil, Florian Hess, Zahra Monfared, Po-Chen Kuo, Daniel Durstewitz

In many scientific disciplines, we are interested in inferring the nonlinear dynamical system underlying a set of observed time series, a challenging task in the face of chaotic behavior and noise.

Tasks: Time Series, Time Series Analysis +1

Inferring Dynamical Systems with Long-Range Dependencies through Line Attractor Regularization

no code implementations · 25 Sep 2019 · Dominik Schmidt, Georgia Koppe, Max Beutelspacher, Daniel Durstewitz

Vanilla RNNs with ReLU activation have a simple structure that is amenable to systematic dynamical systems analysis and interpretation, but they suffer from the exploding vs. vanishing gradients problem.

Out-of-Domain Generalization in Dynamical Systems Reconstruction

no code implementations · 28 Feb 2024 · Niclas Göring, Florian Hess, Manuel Brenner, Zahra Monfared, Daniel Durstewitz

We explain why and how out-of-domain (OOD) generalization (OODG) in DSR profoundly differs from OODG considered elsewhere in machine learning.

Tasks: Domain Generalization
