no code implementations • 30 Jul 2024 • Jonathan D. McCart, Andrew R. Sedler, Christopher Versteeg, Domenick Mifsud, Mattia Rigotti-Thompson, Chethan Pandarinath
Here we propose a new approach to neural data analysis that leverages advances in conditional generative modeling to enable the unsupervised inference of disentangled behavioral variables from recorded neural activity.
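The excerpt names the approach only at a high level. As a rough, illustrative stand-in for unsupervised inference of a small set of disentangled latent variables from neural activity, the PyTorch sketch below uses a beta-VAE-style encoder/decoder; the architecture, sizes, and the beta-weighted KL term are assumptions for illustration and are not the authors' conditional generative model.

```python
# Illustrative stand-in: a beta-VAE-style model that infers a few disentangled
# latent variables from binned neural activity without behavioral labels.
# NOT the paper's method; layer sizes and names are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DisentangledLatentModel(nn.Module):
    def __init__(self, n_neurons, n_latents=4, hidden=128, beta=4.0):
        super().__init__()
        self.beta = beta
        self.encoder = nn.Sequential(nn.Linear(n_neurons, hidden), nn.ReLU(),
                                     nn.Linear(hidden, 2 * n_latents))
        self.decoder = nn.Sequential(nn.Linear(n_latents, hidden), nn.ReLU(),
                                     nn.Linear(hidden, n_neurons))

    def forward(self, x):                                       # x: (batch, neurons), binned counts
        mu, logvar = self.encoder(x).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()    # reparameterization trick
        logrates = self.decoder(z)
        recon = F.poisson_nll_loss(logrates, x, log_input=True)
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1).mean()
        # Up-weighting the KL term (beta > 1) pressures the latents toward
        # independent, more interpretable axes.
        return z, recon + self.beta * kl
```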
no code implementations • 12 Sep 2023 • Christopher Versteeg, Andrew R. Sedler, Jonathan D. McCart, Chethan Pandarinath
Overall, ODIN's accuracy in recovering ground-truth latent features and its ability to reconstruct neural activity at low dimensionality make it a promising method for distilling interpretable dynamics that can help explain neural computation.
3 code implementations • 3 Sep 2023 • Andrew R. Sedler, Chethan Pandarinath
Latent factor analysis via dynamical systems (LFADS) is an RNN-based variational sequential autoencoder that achieves state-of-the-art performance in denoising high-dimensional neural activity for downstream applications in science and engineering.
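For readers unfamiliar with the architecture named here, the block below is a minimal PyTorch sketch of an RNN-based variational sequential autoencoder in the spirit of LFADS: an encoder RNN infers a distribution over initial conditions, an autonomous generator RNN unrolls the dynamics, and low-dimensional factors are mapped to Poisson firing rates. Layer sizes, variable names, and the omission of the controller/inferred-input pathway are simplifying assumptions, not the released implementation.

```python
# Minimal sketch of an LFADS-style variational sequential autoencoder (PyTorch).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SequentialAutoencoder(nn.Module):
    def __init__(self, n_neurons, enc_dim=64, gen_dim=64, ic_dim=32, fac_dim=10):
        super().__init__()
        self.encoder = nn.GRU(n_neurons, enc_dim, batch_first=True, bidirectional=True)
        self.to_ic = nn.Linear(2 * enc_dim, 2 * ic_dim)        # posterior mean and log-variance
        self.ic_to_gen = nn.Linear(ic_dim, gen_dim)
        self.generator = nn.GRUCell(1, gen_dim)                # autonomous: driven by a dummy input
        self.to_factors = nn.Linear(gen_dim, fac_dim)
        self.to_logrates = nn.Linear(fac_dim, n_neurons)

    def forward(self, spikes):                                 # spikes: (batch, time, neurons)
        B, T, _ = spikes.shape
        _, h = self.encoder(spikes)                            # h: (2, batch, enc_dim)
        mu, logvar = self.to_ic(torch.cat([h[0], h[1]], dim=-1)).chunk(2, dim=-1)
        ic = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterized initial condition
        g = torch.tanh(self.ic_to_gen(ic))
        dummy = spikes.new_zeros(B, 1)
        factors, logrates = [], []
        for _ in range(T):                                     # unroll the generator dynamics
            g = self.generator(dummy, g)
            f = self.to_factors(g)
            factors.append(f)
            logrates.append(self.to_logrates(f))
        factors = torch.stack(factors, dim=1)                  # low-dimensional latent factors
        logrates = torch.stack(logrates, dim=1)                # per-bin log firing rates
        recon = F.poisson_nll_loss(logrates, spikes, log_input=True)
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1).mean()
        return factors, logrates, recon + kl
```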
1 code implementation • 7 Dec 2022 • Andrew R. Sedler, Christopher Versteeg, Chethan Pandarinath
Artificial neural networks that can recover latent dynamics from recorded neural activity may provide a powerful avenue for identifying and interpreting the dynamical motifs underlying biological computation.
1 code implementation • NeurIPS 2021 • Feng Zhu, Andrew R. Sedler, Harrison A. Grier, Nauman Ahad, Mark A. Davenport, Matthew T. Kaufman, Andrea Giovannucci, Chethan Pandarinath
We test SBTT applied to sequential autoencoders and demonstrate more efficient and higher-fidelity characterization of neural population dynamics in electrophysiological and calcium imaging data.
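As a rough illustration of the masking idea behind selective backpropagation through time (SBTT), the sketch below scores a sequential autoencoder's Poisson likelihood only at observed samples, so unobserved time points contribute no gradients. The zero-filling and masking conventions here are assumptions about the general idea, not the paper's exact recipe.

```python
# Illustrative masked Poisson objective in the spirit of SBTT (PyTorch).
# Assumed convention: unobserved samples are zero-filled at the input and
# excluded from the reconstruction loss via a binary mask.
import torch
import torch.nn.functional as F

def masked_poisson_nll(logrates, spikes, observed_mask):
    """logrates, spikes, observed_mask: (batch, time, neurons);
    observed_mask is 1.0 where a sample was actually recorded."""
    nll = F.poisson_nll_loss(logrates, spikes, log_input=True, reduction="none")
    # Only observed entries contribute, so gradients backpropagate through
    # time solely from the sampled data points.
    return (nll * observed_mask).sum() / observed_mask.sum().clamp(min=1)

# Hypothetical usage with the sequential autoencoder sketched above:
# inputs = spikes * observed_mask                  # zero-fill missing samples
# factors, logrates, _ = model(inputs)
# loss = masked_poisson_nll(logrates, spikes, observed_mask)
```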