Search Results for author: Jonathan Pillow

Found 7 papers, 3 papers with code

Efficient non-conjugate Gaussian process factor models for spike count data using polynomial approximations

no code implementations ICML 2020 Stephen Keeley, David Zoltowski, Jonathan Pillow, Spencer Smith, Yiyi Yu

Gaussian Process Factor Analysis (GPFA) has been broadly applied to the problem of identifying smooth, low-dimensional temporal structure underlying large-scale neural recordings. However, spike trains are non-Gaussian, which motivates combining GPFA with discrete observation models for binned spike count data. The drawback to this approach is that GPFA priors are not conjugate to count model likelihoods, which makes inference challenging. Here we address this obstacle by introducing a fast, approximate inference method for non-conjugate GPFA models.

Variational Inference
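As a rough illustration of the idea described in the abstract above, the sketch below replaces the Poisson log-likelihood with a quadratic (second-order polynomial) surrogate in the log firing rate, which turns a binned spike count into a Gaussian-shaped pseudo-observation that is conjugate to a GP prior. This is only a minimal sketch of the general polynomial-approximation idea, not the paper's algorithm; the expansion point f0 and the count y are illustrative.

```python
# Minimal sketch (not the paper's algorithm): a quadratic approximation of
# the Poisson log-likelihood log p(y | f) = y*f - exp(f) - log(y!) around an
# expansion point f0, yielding a Gaussian pseudo-observation that is
# conjugate to a GP prior over the log rate f.
import numpy as np

def quadratic_poisson_approx(y, f0):
    """Second-order Taylor expansion of the Poisson log-likelihood in f.

    Returns (a, b, c) such that log p(y|f) ~= a + b*(f - f0) + 0.5*c*(f - f0)**2,
    i.e. a quadratic surrogate with precision -c and mode f0 - b/c.
    """
    lam = np.exp(f0)
    a = y * f0 - lam     # value at the expansion point (up to log(y!))
    b = y - lam          # first derivative
    c = -lam             # second derivative (always negative)
    return a, b, c

# Example: convert one binned spike count into a pseudo-Gaussian observation.
y, f0 = 3, 0.5
a, b, c = quadratic_poisson_approx(y, f0)
pseudo_mean = f0 - b / c   # mode of the quadratic surrogate
pseudo_var = -1.0 / c      # curvature-based variance
print(pseudo_mean, pseudo_var)
```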

System Identification for Continuous-time Linear Dynamical Systems

no code implementations 23 Aug 2023 Peter Halmos, Jonathan Pillow, David A. Knowles

This paper addresses system identification for the continuous-discrete filter, with the aim of generalizing learning for the Kalman filter by relying on a solution to a continuous-time Itô stochastic differential equation (SDE) for the latent state and covariance dynamics.
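For orientation, the sketch below shows the standard continuous-discrete Kalman filter that the abstract builds on: between observations the latent mean and covariance follow ODEs driven by the continuous-time dynamics, and at each observation a discrete measurement update is applied. The matrices A, Q, H, R and the crude Euler integrator are illustrative assumptions, not the paper's method.

```python
# Minimal sketch of one continuous-discrete Kalman filter cycle: integrate
# the moment ODEs dm/dt = A m and dP/dt = A P + P A^T + Q between
# observations (Euler steps), then apply a standard discrete update.
import numpy as np

def predict(m, P, A, Q, dt, n_steps=100):
    """Propagate mean m and covariance P over an interval of length dt."""
    h = dt / n_steps
    for _ in range(n_steps):
        m = m + h * (A @ m)
        P = P + h * (A @ P + P @ A.T + Q)
    return m, P

def update(m, P, y, H, R):
    """Standard Kalman measurement update for observation y = H x + noise."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.solve(S, np.eye(S.shape[0]))
    m = m + K @ (y - H @ m)
    P = P - K @ S @ K.T
    return m, P

# Toy usage with a 2-D latent state and 1-D observation (placeholder values).
A = np.array([[0.0, 1.0], [-1.0, -0.1]])   # continuous-time dynamics
Q = 0.01 * np.eye(2)                        # diffusion covariance
H = np.array([[1.0, 0.0]])                  # observation matrix
R = np.array([[0.1]])                       # observation noise
m, P = np.zeros(2), np.eye(2)
m, P = predict(m, P, A, Q, dt=0.5)
m, P = update(m, P, y=np.array([0.3]), H=H, R=R)
```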

Probing the relationship between linear dynamical systems and low-rank recurrent neural network models

no code implementations19 Oct 2021 Adrian Valente, Srdjan Ostojic, Jonathan Pillow

We show that latent LDS models can only be converted to RNNs in specific limit cases, due to the non-Markovian property of latent LDS models.
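For context, here is a minimal simulation sketch (with arbitrary illustrative parameters) of the two model classes compared above: a latent LDS, whose observed activity is non-Markovian because the latent state is seen only through a noisy readout, and a low-rank RNN, whose state is the observed network activity itself.

```python
# Illustrative simulation of a latent LDS versus a rank-2 RNN; all parameter
# values are made up for the sketch.
import numpy as np

rng = np.random.default_rng(0)
T, d_latent, n_units = 200, 2, 50

# Latent LDS: x_{t+1} = A x_t + w_t,  y_t = C x_t + v_t
A = np.array([[0.95, 0.05], [-0.05, 0.95]])
C = rng.standard_normal((n_units, d_latent))
x = np.zeros(d_latent)
Y_lds = np.empty((T, n_units))
for t in range(T):
    x = A @ x + 0.1 * rng.standard_normal(d_latent)
    Y_lds[t] = C @ x + 0.1 * rng.standard_normal(n_units)

# Low-rank RNN: r_{t+1} = r_t + dt * (-r_t + J tanh(r_t)), with rank-2 J = m n^T / N
m = rng.standard_normal((n_units, 2))
n = rng.standard_normal((n_units, 2))
J = m @ n.T / n_units
r = rng.standard_normal(n_units)
R_rnn = np.empty((T, n_units))
dt = 0.1
for t in range(T):
    r = r + dt * (-r + J @ np.tanh(r))
    R_rnn[t] = r
```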

Fast shared response model for fMRI data

2 code implementations 27 Sep 2019 Hugo Richard, Lucas Martin, Ana Luísa Pinho, Jonathan Pillow, Bertrand Thirion

The shared response model provides a simple but effective framework to analyse fMRI data of subjects exposed to naturalistic stimuli.
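To make the framework concrete, the sketch below fits the basic shared response model X_i ≈ W_i S (one orthonormal basis W_i per subject, one shared timecourse S) by alternating orthogonal-Procrustes updates. This illustrates the model the abstract refers to, not the paper's accelerated algorithm, and the data shapes are invented for the example.

```python
# Minimal sketch of the basic shared response model fitted by alternating
# Procrustes updates (not the paper's fast variant).
import numpy as np

def fit_srm(X_list, k, n_iter=20, seed=0):
    """X_list: list of (voxels_i, time) arrays; returns per-subject W_i and shared S."""
    rng = np.random.default_rng(seed)
    n_time = X_list[0].shape[1]
    S = rng.standard_normal((k, n_time))
    W_list = [None] * len(X_list)
    for _ in range(n_iter):
        # Update each subject basis by orthogonal Procrustes: W_i = U V^T
        for i, X in enumerate(X_list):
            U, _, Vt = np.linalg.svd(X @ S.T, full_matrices=False)
            W_list[i] = U @ Vt
        # Update the shared response as the average back-projection
        S = np.mean([W.T @ X for W, X in zip(W_list, X_list)], axis=0)
    return W_list, S

# Toy usage: three "subjects" with different voxel counts, same number of timepoints.
rng = np.random.default_rng(1)
X_list = [rng.standard_normal((v, 300)) for v in (400, 350, 500)]
W_list, S = fit_srm(X_list, k=10)
```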

Bayesian Entropy Estimation for Countable Discrete Distributions

2 code implementations 2 Feb 2013 Evan Archer, Il Memming Park, Jonathan Pillow

The Pitman-Yor process, a generalization of the Dirichlet process, provides a tractable prior distribution over the space of countably infinite discrete distributions, and has found major applications in Bayesian non-parametric statistics and machine learning.

Information Theory
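The sketch below draws weights from the Pitman-Yor prior mentioned in the abstract via a truncated stick-breaking construction and computes the entropy of a single prior draw. The truncation level and hyperparameters are illustrative assumptions, and this is only a picture of the prior, not the paper's entropy estimator.

```python
# Minimal sketch: truncated stick-breaking draw from PY(d, alpha) and the
# entropy (in nats) of that draw; not the paper's Bayesian estimator.
import numpy as np

def pitman_yor_stick_breaking(d, alpha, n_sticks=10_000, seed=0):
    """Truncated weights from PY(d, alpha): v_k ~ Beta(1 - d, alpha + k*d)."""
    rng = np.random.default_rng(seed)
    k = np.arange(1, n_sticks + 1)
    v = rng.beta(1.0 - d, alpha + k * d)
    # pi_k = v_k * prod_{j<k} (1 - v_j), computed in log space for stability
    log_remaining = np.concatenate(([0.0], np.cumsum(np.log1p(-v))[:-1]))
    return v * np.exp(log_remaining)

weights = pitman_yor_stick_breaking(d=0.3, alpha=1.0)
weights = weights / weights.sum()                       # renormalize the truncation
entropy = -np.sum(weights * np.log(weights + 1e-300))   # entropy of this draw
print(entropy)
```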
