no code implementations • ICML 2020 • Stephen Keeley, David Zoltowski, Jonathan Pillow, Spencer Smith, Yiyi Yu
Gaussian Process Factor Analysis (GPFA) has been broadly applied to the problem of identifying smooth, low-dimensional temporal structure underlying large-scale neural recordings. However, spike trains are non-Gaussian, which motivates combining GPFA with discrete observation models for binned spike count data. The drawback to this approach is that GPFA priors are not conjugate to count model likelihoods, which makes inference challenging. Here we address this obstacle by introducing a fast, approximate inference method for non-conjugate GPFA models.
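The non-conjugacy the abstract refers to can be made concrete. A standard count-observation GPFA generative model (the notation below — P latents x_j, loading weights c_i, baseline d_i — is assumed for illustration, not taken from the paper) is:

```latex
x_j(\cdot) \sim \mathcal{GP}\big(0,\, k_j(t, t')\big), \qquad j = 1, \dots, P,
\\
y_{i,t} \mid \mathbf{x}_t \sim \mathrm{Poiss}\!\left(\exp\big(\mathbf{c}_i^\top \mathbf{x}_t + d_i\big)\right).
```

The exponential link maps the Gaussian latents to a nonnegative firing rate, but the resulting Poisson likelihood has no closed-form posterior under the GP prior, which is why approximate inference methods such as the one proposed here are needed.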
2 code implementations • 31 Oct 2023 • Antonis Antoniades, Yiyi Yu, Joseph Canzano, William Wang, Spencer LaVere Smith
State-of-the-art systems neuroscience experiments yield large-scale multimodal data, and these data sets require new tools for analysis.
no code implementations • NeurIPS 2020 • Stephen Keeley, Mikio Aoi, Yiyi Yu, Spencer Smith, Jonathan W. Pillow
Here we address this shortcoming by proposing ``signal-noise'' Poisson-spiking Gaussian Process Factor Analysis (SNP-GPFA), a flexible latent variable model that resolves signal and noise latent structure in neural population spiking activity.
no code implementations • 13 Sep 2019 • Yuming Huang, Ashkan Panahi, Hamid Krim, Yiyi Yu, Spencer L. Smith
We present a novel adversarial framework for training deep belief networks (DBNs), which includes replacing the generator network in the methodology of generative adversarial networks (GANs) with a DBN and developing a highly parallelizable numerical algorithm for training the resulting architecture in a stochastic manner.
no code implementations • 7 Jun 2019 • Stephen L. Keeley, David M. Zoltowski, Yiyi Yu, Jacob L. Yates, Spencer L. Smith, Jonathan W. Pillow
We demonstrate that PAL estimators achieve fast and accurate extraction of latent structure from multi-neuron spike train data.