no code implementations • 7 Oct 2024 • Ayesha Vermani, Josue Nassar, Hyungju Jeon, Matthew Dowling, Il Memming Park
However, there has been limited work exploiting the shared structure in neural activity during similar tasks for learning latent dynamics from neural recordings.
no code implementations • 2 Sep 2024 • Ayesha Vermani, Matthew Dowling, Hyungju Jeon, Ian Jordan, Josue Nassar, Yves Bernaerts, Yuan Zhao, Steven Van Vaerenbergh, Il Memming Park
We emphasize the importance of large-scale integrative neuroscience initiatives and the role of meta-learning in overcoming these challenges.
1 code implementation • 21 Nov 2022 • Lyndon R. Duong, Jingyang Zhou, Josue Nassar, Jules Berman, Jeroen Olieslagers, Alex H. Williams
Quantifying similarity between neural representations -- e.g., hidden layer activation vectors -- is a perennial problem in deep learning and neuroscience research.
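As a minimal illustration of the problem, linear centered kernel alignment (CKA) is one widely used way to compare two sets of activation vectors. This is a generic sketch with random placeholder data; it is not the metric developed in the paper above.

```python
# Minimal sketch: linear CKA between two activation matrices, one common
# measure of representational similarity. Illustrative only; not the
# metric proposed in the paper above.
import numpy as np

def linear_cka(X, Y):
    """Linear CKA between activation matrices X (n, d1) and Y (n, d2)."""
    # Center each feature dimension across the n samples/stimuli.
    X = X - X.mean(axis=0, keepdims=True)
    Y = Y - Y.mean(axis=0, keepdims=True)
    # Alignment of the two Gram structures, normalized to [0, 1].
    cross = np.linalg.norm(Y.T @ X, "fro") ** 2
    return cross / (np.linalg.norm(X.T @ X, "fro") * np.linalg.norm(Y.T @ Y, "fro"))

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 64))               # placeholder "layer" activations
Q, _ = np.linalg.qr(rng.standard_normal((64, 64)))
print(linear_cka(X, X @ Q))                      # ~1.0: same representation, rotated
print(linear_cka(X, rng.standard_normal((100, 64))))  # near 0: unrelated representation
```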
no code implementations • ICLR 2022 • Josue Nassar, Jennifer Rogers Brennan, Ben Evans, Kendall Lowrey
Online learning via Bayes' theorem allows new data to be continuously integrated into an agent's current beliefs.
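As a minimal sketch of the idea, the conjugate Beta-Bernoulli model below folds each new observation into the current posterior, which then serves as the prior for the next observation. This is textbook recursive Bayesian updating with placeholder data, not the specific method of the paper above.

```python
# Minimal sketch of online learning via Bayes' theorem: each observation
# updates the posterior over a coin's bias, and that posterior becomes the
# prior for the next observation. Illustrative only.
import numpy as np

rng = np.random.default_rng(1)
alpha, beta = 1.0, 1.0          # Beta(1, 1) prior on the unknown bias
true_bias = 0.7                 # placeholder ground truth

for t in range(500):
    x = rng.random() < true_bias    # one Bernoulli observation
    alpha += x                      # conjugate update: count of successes
    beta += 1 - x                   # conjugate update: count of failures

print("posterior mean:", alpha / (alpha + beta))   # concentrates near 0.7
```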
1 code implementation • NeurIPS 2020 • Josue Nassar, Piotr Aleksander Sokol, SueYeon Chung, Kenneth D. Harris, Il Memming Park
In this work, we investigate the latter by juxtaposing experimental results regarding the covariance spectrum of neural representations in the mouse V1 (Stringer et al.) with artificial neural networks.
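For illustration, the covariance eigenspectrum of a layer's responses can be computed as below. The random-feature "network" and the log-log slope fit are stand-ins chosen for this sketch, not the models or analysis of the paper; Stringer et al. report an approximately power-law (~1/n) decay for mouse V1, which corresponds to a slope near -1 on this fit.

```python
# Minimal sketch: eigenspectrum of the covariance of hidden-layer
# activations, the quantity compared against mouse V1 data. The random
# tanh features here are a placeholder network, not the paper's models.
import numpy as np

rng = np.random.default_rng(2)
stimuli = rng.standard_normal((2000, 100))               # 2000 inputs, 100 dims
weights = rng.standard_normal((100, 500)) / np.sqrt(100)
activations = np.tanh(stimuli @ weights)                 # hidden-layer responses

# Eigenvalues of the activation covariance, in decreasing order.
cov = np.cov(activations, rowvar=False)
spectrum = np.sort(np.linalg.eigvalsh(cov))[::-1]

# Empirical decay exponent from a log-log fit over the leading eigenvalues;
# a ~1/n spectrum would give a slope close to -1.
n = np.arange(1, len(spectrum) + 1)
slope, _ = np.polyfit(np.log(n[:100]), np.log(spectrum[:100]), 1)
print("covariance spectrum decay exponent:", slope)
```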
1 code implementation • 4 Jun 2019 • Yuan Zhao, Josue Nassar, Ian Jordan, Mónica Bugallo, Il Memming Park
Nonlinear state-space models are powerful tools to describe dynamical structures in complex time series.
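As a minimal sketch of what this model class looks like, the example below pairs nonlinear latent dynamics (a Van der Pol oscillator, chosen arbitrarily here) with a linear-Gaussian observation model. It illustrates the generative structure only, not the inference approach developed in the paper above.

```python
# Minimal sketch of a nonlinear state-space model: latent states x_t evolve
# under nonlinear dynamics plus noise, and observations y_t are noisy linear
# readouts. Purely illustrative; parameters are placeholders.
import numpy as np

rng = np.random.default_rng(3)
T, latent_dim, obs_dim = 200, 2, 10
C = rng.standard_normal((obs_dim, latent_dim))   # observation (readout) matrix

def dynamics(x, dt=0.1):
    """One Euler step of Van der Pol dynamics (an arbitrary nonlinear example)."""
    dx = np.array([x[1], (1.0 - x[0] ** 2) * x[1] - x[0]])
    return x + dt * dx

x = np.array([1.0, 0.0])
latents, observations = [], []
for t in range(T):
    x = dynamics(x) + 0.01 * rng.standard_normal(latent_dim)   # state noise
    y = C @ x + 0.1 * rng.standard_normal(obs_dim)              # observation noise
    latents.append(x)
    observations.append(y)

print(np.array(latents).shape, np.array(observations).shape)   # (200, 2) (200, 10)
```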
1 code implementation • ICLR 2019 • Josue Nassar, Scott W. Linderman, Mónica Bugallo, Il Memming Park
Many real-world systems are governed by complex, nonlinear dynamics.
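One standard strategy for such systems is to stitch together simple linear regimes, as in a switching linear dynamical system; the sketch below simulates a two-regime example under assumed parameters. It is a generic illustration, not necessarily the model developed in the paper above.

```python
# Minimal sketch of a switching linear dynamical system: a discrete regime
# variable z_t picks which linear dynamics drive the continuous state x_t.
# Illustrative parameters only.
import numpy as np

rng = np.random.default_rng(4)

def rotation(theta, decay=0.99):
    """A slightly contracting 2-D rotation, used as one linear regime."""
    return decay * np.array([[np.cos(theta), -np.sin(theta)],
                             [np.sin(theta),  np.cos(theta)]])

A = [rotation(0.05), rotation(-0.3)]          # slow vs. fast rotation regimes
transition = np.array([[0.95, 0.05],          # sticky regime-switching probabilities
                       [0.05, 0.95]])

z, x = 0, np.array([1.0, 0.0])
states, regimes = [], []
for t in range(300):
    z = rng.choice(2, p=transition[z])               # sample the next regime
    x = A[z] @ x + 0.01 * rng.standard_normal(2)     # linear dynamics + noise
    states.append(x)
    regimes.append(z)

print(np.array(states).shape, np.bincount(regimes))  # trajectory shape, time in each regime
```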