no code implementations • 7 Sep 2023 • Marino Pagan, Adrian Valente, Srdjan Ostojic, Carlos D. Brody
Linearization of the dynamics of recurrent neural networks (RNNs) is often used to study their properties.
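To make the linearization idea concrete, here is a minimal numpy sketch (my own illustration, not code from the paper): a rate RNN dx/dt = -x + W tanh(x) has a fixed point at the origin, and because tanh'(0) = 1 its Jacobian there is simply J = -I + W, whose eigenvalue spectrum determines local stability.

```python
import numpy as np

rng = np.random.default_rng(0)
N, g = 50, 0.5                      # network size; coupling well below the g = 1 instability
W = g * rng.standard_normal((N, N)) / np.sqrt(N)

# Rate dynamics dx/dt = -x + W @ tanh(x): x = 0 is a fixed point.
# Since tanh'(0) = 1, the Jacobian at the origin is J = -I + W.
J = -np.eye(N) + W

# The fixed point is locally stable when every eigenvalue of J has
# negative real part; for g < 1 the spectrum of W stays inside the
# unit circle, so the leading eigenvalue of J is negative.
eigvals = np.linalg.eigvals(J)
leading = eigvals.real.max()
```

For g > 1 the same computation yields eigenvalues with positive real part, the classical transition to chaotic dynamics in random networks.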
no code implementations • 31 Aug 2023 • Srdjan Ostojic, Stefano Fusi
A major challenge in neuroscience is identifying meaningful structure in seemingly disorganized neural activity.
1 code implementation • 14 Jul 2023 • Friedrich Schuessler, Francesca Mastrogiuseppe, Srdjan Ostojic, Omri Barak
Here, we use recurrent neural networks (RNNs) to explore when and how neural dynamics and the network's output are related from a geometrical point of view.
no code implementations • 19 Oct 2021 • Adrian Valente, Srdjan Ostojic, Jonathan Pillow
We show that latent LDS models can only be converted to RNNs in specific limit cases, due to the non-Markovian property of latent LDS models.
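The non-Markovian character of partially observed latent LDS models can be illustrated with a small simulation (a hypothetical example of mine, not taken from the paper): when a 2-D latent rotation is read out through a 1-D observation, the current observation alone does not determine the next one, but two successive observations do.

```python
import numpy as np

# Latent linear dynamical system: z_{t+1} = A z_t, observed as y_t = C z_t.
theta = 0.5
A = 0.98 * np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
C = np.array([1.0, 0.0])            # 1-D readout of the 2-D latent state

z = np.array([1.0, 0.0])
y = []
for _ in range(500):
    y.append(C @ z)
    z = A @ z
y = np.array(y)

def ar_residual(p):
    """Residual of the best least-squares fit of y_t from its previous p values."""
    X = np.column_stack([y[p - k - 1 : len(y) - k - 1] for k in range(p)])
    target = y[p:]
    coef, *_ = np.linalg.lstsq(X, target, rcond=None)
    return np.linalg.norm(target - X @ coef)

# One past observation is not enough (the observed process is not Markovian),
# while two past observations recover the 2-D latent state exactly.
r1, r2 = ar_residual(1), ar_residual(2)
```

The observation sequence is an AR(2) process, so the two-lag fit is exact up to numerical precision, whereas the one-lag fit leaves a substantial residual.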
no code implementations • 8 Jul 2021 • Mehrdad Jazayeri, Srdjan Ostojic
The ongoing exponential rise in recording capacity calls for new approaches for analysing and interpreting neural data.
1 code implementation • NeurIPS 2020 • Friedrich Schuessler, Francesca Mastrogiuseppe, Alexis Dubreuil, Srdjan Ostojic, Omri Barak
Recurrent neural networks (RNNs) trained on low-dimensional tasks have been widely used to model functional biological networks.
no code implementations • NeurIPS Workshop Neuro_AI 2019 • Alexis M Dubreuil, Adrian Valente, Francesca Mastrogiuseppe, Srdjan Ostojic
In these networks, the rank of the connectivity controls the dimensionality of the dynamics, while the number of components in the Gaussian mixture corresponds to the number of cell classes.
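As a concrete illustration of the rank-dimensionality link (my own sketch, not the paper's code): a rank-R connectivity matrix W = m n^T / N confines the recurrent drive to the span of the m vectors, so once the initial condition decays the activity is at most R-dimensional.

```python
import numpy as np

rng = np.random.default_rng(1)
N, R = 200, 2                       # network size, connectivity rank

# Rank-R connectivity: a sum of R outer products m_r n_r^T.
m = rng.standard_normal((N, R))
n = rng.standard_normal((N, R))
W = m @ n.T / N

# Euler-integrate dx/dt = -x + W @ tanh(x) from a random initial state.
x = rng.standard_normal(N)
dt = 0.1
for _ in range(200):
    x = x + dt * (-x + W @ np.tanh(x))

# The recurrent input W @ tanh(x) always lies in span(m), so the
# component of x orthogonal to span(m) decays as exp(-t) and the
# remaining activity is confined to an R-dimensional subspace.
Q, _ = np.linalg.qr(m)              # orthonormal basis of span(m)
residual = np.linalg.norm(x - Q @ (Q.T @ x))
```

After the transient, the orthogonal residual is negligible while the connectivity rank equals R, matching the claim that rank controls the dimensionality of the dynamics.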