Search Results for author: Liam Paninski

Found 43 papers, 20 papers with code

Information Rates and Optimal Decoding in Large Neural Populations

no code implementations NeurIPS 2011 Kamiar R. Rad, Liam Paninski

Many fundamental questions in theoretical neuroscience involve optimal decoding and the computation of Shannon information rates in populations of spiking neurons.

Exact Hamiltonian Monte Carlo for Truncated Multivariate Gaussians

1 code implementation 20 Aug 2012 Ari Pakman, Liam Paninski

We present a Hamiltonian Monte Carlo algorithm to sample from multivariate Gaussian distributions in which the target space is constrained by linear and quadratic inequalities or products thereof.

Computation Applications
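The key idea is that Hamiltonian dynamics under a Gaussian potential are exactly solvable (harmonic motion), so the trajectory can be followed analytically between reflections off the constraint walls. Below is a condensed sketch of that bounce mechanism for the linear-constraint case in whitened coordinates; the paper also covers quadratic constraints and general covariances, and this sketch omits the numerical safeguards a reference implementation would include.

```python
# Condensed sketch of exact HMC for a standard Gaussian truncated to
# {y : F y + g >= 0} (whitened coordinates, linear constraints only).
# Trajectories are exact harmonic motion, y(t) = y cos t + v sin t, and the
# particle reflects its momentum whenever it reaches a constraint wall.
import numpy as np

def exact_hmc_tmg(F, g, y0, n_samples, T=np.pi / 2, rng=None, eps=1e-10):
    """F: (m, d) constraint matrix, g: (m,) offsets, y0: strictly feasible start."""
    rng = np.random.default_rng() if rng is None else rng
    y = np.array(y0, dtype=float)
    samples = []
    for _ in range(n_samples):
        v = rng.standard_normal(y.shape)          # resample momentum
        t_left = T
        while t_left > 0:
            # Wall j satisfies f_j.y(t) + g_j = R_j cos(t - phi_j) + g_j;
            # find the earliest positive root over all walls.
            a, b = F @ y, F @ v
            R, phi = np.sqrt(a**2 + b**2), np.arctan2(b, a)
            t_hit, j_hit = np.inf, -1
            for j in range(len(g)):
                if R[j] <= abs(g[j]):
                    continue                       # this wall is never reached
                acos = np.arccos(-g[j] / R[j])
                for t in ((phi[j] + acos) % (2 * np.pi),
                          (phi[j] - acos) % (2 * np.pi)):
                    if eps < t < t_hit:
                        t_hit, j_hit = t, j
            dt = min(t_hit, t_left)
            y, v = (y * np.cos(dt) + v * np.sin(dt),
                    -y * np.sin(dt) + v * np.cos(dt))
            t_left -= dt
            if t_left > 0:                         # hit a wall before time ran out
                f = F[j_hit]
                v = v - 2 * (f @ v) / (f @ f) * f  # reflect the momentum
        samples.append(y.copy())
    return np.array(samples)

# Example: standard 2-D Gaussian truncated to the positive orthant.
samples = exact_hmc_tmg(np.eye(2), np.zeros(2), y0=np.ones(2), n_samples=1000)
```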

Bayesian spike inference from calcium imaging data

5 code implementations 27 Nov 2013 Eftychios A. Pnevmatikakis, Josh Merel, Ari Pakman, Liam Paninski

We present efficient Bayesian methods for extracting neuronal spiking information from calcium imaging data.

Neurons and Cognition Quantitative Methods Applications
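The observation model these methods invert is an autoregressive calcium trace driven by spikes. A minimal generative sketch with illustrative parameter values (not the paper's inference machinery):

```python
# Toy generative model underlying calcium-imaging spike inference: binary
# spikes drive an AR(1) calcium trace observed with additive Gaussian noise.
# All parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
T, gamma, rate, sigma = 1000, 0.95, 0.02, 0.2

spikes = (rng.random(T) < rate).astype(float)            # s_t
calcium = np.zeros(T)
for t in range(1, T):
    calcium[t] = gamma * calcium[t - 1] + spikes[t]      # c_t = gamma * c_{t-1} + s_t
fluorescence = calcium + sigma * rng.standard_normal(T)  # y_t = c_t + noise
```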

A multi-agent control framework for co-adaptation in brain-computer interfaces

no code implementations NeurIPS 2013 Josh S. Merel, Roy Fox, Tony Jebara, Liam Paninski

In a closed-loop brain-computer interface (BCI), adaptive decoders are used to learn parameters suited to decoding the user's neural response.

Brain Computer Interface

Sparse nonnegative deconvolution for compressive calcium imaging: algorithms and phase transitions

no code implementations NeurIPS 2013 Eftychios A. Pnevmatikakis, Liam Paninski

We propose a compressed sensing (CS) calcium imaging framework for monitoring large neuronal populations, where we image randomized projections of the spatial calcium concentration at each timestep, instead of measuring the concentration at individual locations.

Time Series Time Series Analysis
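As a toy illustration of the measurement model (not the paper's recovery algorithm or phase-transition analysis), one can form random projections of a sparse nonnegative signal and recover it with a nonnegative lasso:

```python
# Toy compressive measurement model: random projections y = Phi @ x of a
# sparse, nonnegative activity vector x, recovered with a nonnegative lasso.
# Dimensions and the penalty strength are illustrative assumptions.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
n_pixels, n_meas, n_active = 500, 100, 10

x_true = np.zeros(n_pixels)
x_true[rng.choice(n_pixels, n_active, replace=False)] = rng.uniform(1, 3, n_active)
Phi = rng.standard_normal((n_meas, n_pixels)) / np.sqrt(n_meas)
y = Phi @ x_true

x_hat = Lasso(alpha=0.01, positive=True, max_iter=10000).fit(Phi, y).coef_
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```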

Robust learning of low-dimensional dynamics from large neural ensembles

no code implementations NeurIPS 2013 David Pfau, Eftychios A. Pnevmatikakis, Liam Paninski

We show on model data that the parameters of latent linear dynamical systems can be recovered, and that even if the dynamics are not stationary we can still recover the true latent subspace.

Dimensionality Reduction

A structured matrix factorization framework for large scale calcium imaging data analysis

11 code implementations 9 Sep 2014 Eftychios A. Pnevmatikakis, Yuanjun Gao, Daniel Soudry, David Pfau, Clay Lacefield, Kira Poskanzer, Randy Bruno, Rafael Yuste, Liam Paninski

We present a structured matrix factorization approach to analyzing calcium imaging recordings of large neuronal ensembles.

Neurons and Cognition Quantitative Methods Applications
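At its simplest the decomposition writes the movie as Y ≈ A C, with A holding spatial footprints and C temporal traces. The sketch below uses plain NMF as a stand-in; the structured version in the paper additionally enforces spatial locality, sparsity, background terms, and calcium dynamics.

```python
# Plain NMF as a simplified stand-in for the structured factorization Y ~ A C
# (pixels x time movie into spatial footprints A and temporal traces C).
# The movie here is random placeholder data.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(2)
Y = rng.random((64 * 64, 2000))        # placeholder movie: pixels x frames

model = NMF(n_components=30, init="nndsvd", max_iter=300)
A = model.fit_transform(Y)             # (pixels, K) spatial components
C = model.components_                  # (K, frames) temporal traces
```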

Clustered factor analysis of multineuronal spike data

no code implementations NeurIPS 2014 Lars Buesing, Timothy A. Machado, John P. Cunningham, Liam Paninski

High-dimensional, simultaneous recordings of neural spiking activity are often explored, analyzed and visualized with the help of latent variable or factor models.

Clustering Variational Inference

Neuroprosthetic decoder training as imitation learning

no code implementations 13 Nov 2015 Josh Merel, David Carlson, Liam Paninski, John P. Cunningham

We describe how training a decoder in this way is a novel variant of an imitation learning problem, where an oracle or expert is employed for supervised training in lieu of direct observations, which are not available.

Brain Computer Interface Imitation Learning

Black box variational inference for state space models

no code implementations 23 Nov 2015 Evan Archer, Il Memming Park, Lars Buesing, John Cunningham, Liam Paninski

These models have the advantage of learning latent structure both from noisy observations and from the temporal ordering in the data, where it is assumed that meaningful correlation structure exists across time.

Time Series Time Series Analysis +1

Partition Functions from Rao-Blackwellized Tempered Sampling

no code implementations 7 Mar 2016 David Carlson, Patrick Stinson, Ari Pakman, Liam Paninski

Partition functions of probability distributions are important quantities for model evaluation and comparisons.

Efficient and accurate extraction of in vivo calcium signals from microendoscopic video data

8 code implementations 24 May 2016 Pengcheng Zhou, Shanna L. Resendez, Jose Rodriguez-Romaguera, Jessica C. Jimenez, Shay Q. Neufeld, Garret D. Stuber, Rene Hen, Mazen A. Kheirbek, Bernardo L. Sabatini, Robert E. Kass, Liam Paninski

In vivo calcium imaging through microscopes has enabled deep brain imaging of previously inaccessible neuronal populations within the brains of freely moving subjects.

Linear dynamical neural population models through nonlinear embeddings

no code implementations NeurIPS 2016 Yuanjun Gao, Evan Archer, Liam Paninski, John P. Cunningham

A body of recent work in modeling neural activity focuses on recovering low-dimensional latent features that capture the statistical structure of large-scale neural populations.

Variational Inference

Robust and scalable Bayesian analysis of spatial neural tuning function data

no code implementations 24 Jun 2016 Kamiar Rahnama Rad, Timothy A. Machado, Liam Paninski

On the other hand, sharing information between adjacent neurons can errantly degrade estimates of tuning functions across space if there are sharp discontinuities in tuning between nearby neurons.

Stochastic Bouncy Particle Sampler

1 code implementation ICML 2017 Ari Pakman, Dar Gilboa, David Carlson, Liam Paninski

We introduce a novel stochastic version of the non-reversible, rejection-free Bouncy Particle Sampler (BPS), a Markov process whose sample trajectories are piecewise linear.

Recurrent switching linear dynamical systems

1 code implementation 26 Oct 2016 Scott W. Linderman, Andrew C. Miller, Ryan P. Adams, David M. Blei, Liam Paninski, Matthew J. Johnson

Many natural systems, such as neurons firing in the brain or basketball teams traversing a court, give rise to time series data with complex, nonlinear dynamics.

Bayesian Inference Time Series +1
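In a recurrent switching LDS, the discrete state's transition probabilities depend on the preceding continuous state. A generative sketch with arbitrary illustrative parameters:

```python
# Generative sketch of a recurrent switching LDS: the discrete state z_t is
# drawn from a softmax over the previous continuous state x_{t-1}, then x_t
# follows that state's linear dynamics. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(3)
K, D, T = 3, 2, 500

def rotation(theta):
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

A = np.stack([0.95 * rotation(th) for th in (0.1, 0.3, 0.6)])  # stable per-state dynamics
b = 0.1 * rng.standard_normal((K, D))
R = rng.standard_normal((K, D))                                # recurrent weights for z | x
r = np.zeros(K)

def softmax(u):
    e = np.exp(u - u.max())
    return e / e.sum()

x, z = np.zeros((T, D)), np.zeros(T, dtype=int)
for t in range(1, T):
    z[t] = rng.choice(K, p=softmax(R @ x[t - 1] + r))          # recurrent switching
    x[t] = A[z[t]] @ x[t - 1] + b[z[t]] + 0.05 * rng.standard_normal(D)
```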

Fast Active Set Methods for Online Spike Inference from Calcium Imaging

no code implementations NeurIPS 2016 Johannes Friedrich, Liam Paninski

Fluorescent calcium indicators are a popular means for observing the spiking activity of large neuronal populations.

Time Series Time Series Analysis
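The underlying problem is nonnegative deconvolution under an AR(1) calcium kernel. The sketch below is a brute-force batch baseline using nonnegative least squares, not the paper's fast online active-set algorithm:

```python
# Brute-force batch baseline for the deconvolution problem solved online by
# the paper: min_s ||y - K s||^2 subject to s >= 0, with K[t, j] = gamma^(t-j)
# encoding an AR(1) calcium kernel. O(T^2) memory, unlike the paper's method.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(4)
T, gamma = 300, 0.95
s_true = (rng.random(T) < 0.03).astype(float)
K = np.tril(gamma ** np.subtract.outer(np.arange(T), np.arange(T)))
y = K @ s_true + 0.1 * rng.standard_normal(T)

s_hat, _ = nnls(K, y)                  # nonnegative least-squares spike estimate
```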

Reparameterizing the Birkhoff Polytope for Variational Permutation Inference

no code implementations 26 Oct 2017 Scott W. Linderman, Gonzalo E. Mena, Hal Cooper, Liam Paninski, John P. Cunningham

Many matching, tracking, sorting, and ranking problems require probabilistic reasoning about possible permutations, a set that grows factorially with dimension.

Bayesian Inference Combinatorial Optimization +1

Quantifying the behavioral dynamics of C. elegans with autoregressive hidden Markov models

1 code implementation 1 Dec 2017 E. Kelly Buchanan, Akiva Lipshitz, Scott Linderman, Liam Paninski

In order to fully understand the neural activity of Caenorhabditis elegans, we need a rich, quantitative description of the behavioral outputs it gives rise to.

Neural Networks for Efficient Bayesian Decoding of Natural Images from Retinal Neurons

1 code implementation NeurIPS 2017 Nikhil Parthasarathy, Eleanor Batty, William Falcon, Thomas Rutten, Mohit Rajpal, E.J. Chichilnisky, Liam Paninski

Decoding sensory stimuli from neural signals can be used to reveal how we sense our physical environment, and is valuable for the design of brain-machine interfaces.

Bayesian Inference

Scalable approximate Bayesian inference for particle tracking data

1 code implementation ICML 2018 Ruoxi Sun, Liam Paninski

This approach is therefore highly flexible and improves on the state of the art in terms of accuracy; provides uncertainty estimates about the particle locations and identities; and has a test run-time that scales linearly as a function of the data length and number of particles, thus enabling Bayesian inference in arbitrarily large particle tracking datasets.

Bayesian Inference

Nonlinear Evolution via Spatially-Dependent Linear Dynamics for Electrophysiology and Calcium Data

no code implementations 6 Nov 2018 Daniel Hernandez, Antonio Khalil Moretti, Ziqiang Wei, Shreya Saxena, John Cunningham, Liam Paninski

We present Variational Inference for Nonlinear Dynamics (VIND), a variational inference framework that is able to uncover nonlinear, smooth latent dynamics from sequential data.

Time Series Time Series Analysis +1

Amortized Bayesian inference for clustering models

1 code implementation 24 Nov 2018 Ari Pakman, Liam Paninski

We develop methods for efficient amortized approximate Bayesian inference over posterior distributions of probabilistic clustering models, such as Dirichlet process mixture models.

Bayesian Inference Clustering
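In the Dirichlet process mixture case, the generative prior being amortized is the Chinese restaurant process over cluster labels. The sampler below draws from that prior only; the paper's contribution is a neural network that amortizes posterior inference over such labelings.

```python
# Chinese restaurant process prior over cluster labels, the prior underlying
# Dirichlet process mixture models. Prior simulation only; it does not attempt
# the paper's amortized posterior inference.
import numpy as np

def sample_crp(n_points, alpha, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    labels, counts = [0], [1]
    for _ in range(1, n_points):
        probs = np.array(counts + [alpha], dtype=float)
        k = rng.choice(len(probs), p=probs / probs.sum())
        if k == len(counts):
            counts.append(1)           # open a new cluster
        else:
            counts[k] += 1
        labels.append(k)
    return np.array(labels)

print(sample_crp(20, alpha=1.0))
```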

Neural Clustering Processes

5 code implementations ICML 2020 Ari Pakman, Yueqi Wang, Catalin Mitelut, JinHyung Lee, Liam Paninski

Probabilistic clustering models (or equivalently, mixture models) are basic building blocks in countless statistical models and involve latent random variables over discrete spaces.

Bayesian Inference Clustering +1

Spike Sorting using the Neural Clustering Process

1 code implementation NeurIPS Workshop Neuro_AI 2019 Yueqi Wang, Ari Pakman, Catalin Mitelut, JinHyung Lee, Liam Paninski

We present a novel approach to spike sorting for high-density multielectrode probes using the Neural Clustering Process (NCP), a recently introduced neural architecture that performs scalable amortized approximate Bayesian inference for efficient probabilistic clustering.

Bayesian Inference Clustering +1

Sinkhorn Permutation Variational Marginal Inference

no code implementations AABI Symposium 2019 Gonzalo Mena, Erdem Varol, Amin Nejatbakhsh, Eviatar Yemini, Liam Paninski

This problem is known to quickly become intractable as the size of the permutation increases, since it involves the computation of the permanent of a matrix, a #P-hard problem.
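The Sinkhorn operator in the title provides the tractable surrogate: alternately normalizing the rows and columns of an exponentiated score matrix yields an approximately doubly stochastic relaxation of the permutation marginals. A sketch with illustrative temperature and iteration count:

```python
# Sinkhorn normalization: alternately normalize rows and columns (in log space)
# of a score matrix to obtain an approximately doubly stochastic matrix, a
# tractable relaxation of permutation marginals.
import numpy as np
from scipy.special import logsumexp

def sinkhorn(log_scores, n_iters=50, temperature=1.0):
    log_p = log_scores / temperature
    for _ in range(n_iters):
        log_p = log_p - logsumexp(log_p, axis=1, keepdims=True)  # rows sum to 1
        log_p = log_p - logsumexp(log_p, axis=0, keepdims=True)  # cols sum to 1
    return np.exp(log_p)

P = sinkhorn(np.random.default_rng(6).standard_normal((5, 5)))
print(P.sum(axis=0), P.sum(axis=1))    # both close to all-ones
```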

Neural Permutation Processes

no code implementations AABI Symposium 2019 Ari Pakman, Yueqi Wang, Liam Paninski

We introduce a neural architecture to perform amortized approximate Bayesian inference over latent random permutations of two sets of objects.

Bayesian Inference

Scalable Bayesian inference of dendritic voltage via spatiotemporal recurrent state space models

no code implementations NeurIPS 2019 Ruoxi Sun, Ian Kinsella, Scott Linderman, Liam Paninski

However, current sensors and imaging approaches still face significant limitations in SNR and sampling frequency; therefore statistical denoising and interpolation methods remain critical for understanding single-trial spatiotemporal dendritic voltage dynamics.

Bayesian Inference Denoising

Linear-time inference for Gaussian Processes on one dimension

no code implementations 11 Mar 2020 Jackson Loper, David Blei, John P. Cunningham, Liam Paninski

Gaussian Processes (GPs) provide powerful probabilistic frameworks for interpolation, forecasting, and smoothing, but have been hampered by computational scaling issues.

Gaussian Processes Time Series +1
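The classic route to linear-time 1D GP inference, which the paper's model family generalizes, is to rewrite the kernel as a linear-Gaussian state space model and run a Kalman filter. A sketch for the Matern-1/2 (Ornstein-Uhlenbeck) kernel with illustrative hyperparameters:

```python
# Textbook special case of linear-time 1-D GP inference: the Matern-1/2 (OU)
# kernel k(t,t') = s2 * exp(-|t - t'| / ell) is equivalent to a scalar
# linear-Gaussian state space model, so a Kalman filter gives O(n) inference.
import numpy as np

def ou_kalman_filter(t, y, ell=1.0, s2=1.0, noise_var=0.1):
    m, P = 0.0, s2                         # stationary prior on the latent state
    means, variances = [], []
    for i in range(len(t)):
        if i > 0:                          # predict forward by dt = t[i] - t[i-1]
            a = np.exp(-(t[i] - t[i - 1]) / ell)
            m, P = a * m, a * a * P + s2 * (1 - a * a)
        S = P + noise_var                  # measurement update with y[i]
        gain = P / S
        m, P = m + gain * (y[i] - m), (1 - gain) * P
        means.append(m); variances.append(P)
    return np.array(means), np.array(variances)

rng = np.random.default_rng(7)
t = np.sort(rng.uniform(0, 10, 200))
y = np.sin(t) + 0.3 * rng.standard_normal(200)
post_mean, post_var = ou_kalman_filter(t, y)
```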

Disentangled Sticky Hierarchical Dirichlet Process Hidden Markov Model

1 code implementation 6 Apr 2020 Ding Zhou, Yuanjun Gao, Liam Paninski

The Hierarchical Dirichlet Process Hidden Markov Model (HDP-HMM) has been used widely as a natural Bayesian nonparametric extension of the classical Hidden Markov Model for learning from sequential and time-series data.

Time Series Time Series Analysis

A zero-inflated gamma model for deconvolved calcium imaging traces

1 code implementation 5 Jun 2020 Xue-Xin Wei, Ding Zhou, Andres Grosmark, Zaki Ajabi, Fraser Sparks, Pengcheng Zhou, Mark Brandon, Attila Losonczy, Liam Paninski

However, statistical modeling of deconvolved calcium signals (i.e., the estimated activity extracted by a pre-processing pipeline) is just as critical for interpreting calcium measurements, and for incorporating these observations into downstream probabilistic encoding and decoding models.

Denoising
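A minimal two-part version of such a fit: estimate the zero probability from the fraction of exact zeros and fit a gamma to the positive values; the paper develops a fuller statistical treatment. The synthetic data below are placeholders.

```python
# Minimal two-part zero-inflated gamma fit for deconvolved calcium values:
# the zero probability is the fraction of exact zeros, and a gamma is fit to
# the positive part.
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
values = np.where(rng.random(5000) < 0.7, 0.0, rng.gamma(2.0, 0.5, 5000))

p_zero = np.mean(values == 0)
shape, _, scale = stats.gamma.fit(values[values > 0], floc=0)
print(f"P(zero)={p_zero:.2f}, shape={shape:.2f}, scale={scale:.2f}")
```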

Amortized Probabilistic Detection of Communities in Graphs

2 code implementations 29 Oct 2020 Yueqi Wang, Yoonho Lee, Pallab Basu, Juho Lee, Yee Whye Teh, Liam Paninski, Ari Pakman

While graph neural networks (GNNs) have been successful in encoding graph structures, existing GNN-based methods for community detection are limited by requiring knowledge of the number of communities in advance, in addition to lacking a proper probabilistic formulation to handle uncertainty.

Clustering Community Detection

Three-dimensional spike localization and improved motion correction for Neuropixels recordings

no code implementations NeurIPS 2021 Julien Boussard, Erdem Varol, Hyun Dong Lee, Nishchal Dethe, Liam Paninski

Neuropixels (NP) probes are dense linear multi-electrode arrays that have rapidly become essential tools for studying the electrophysiology of large neural populations.

Denoising Spike Sorting

SemiMultiPose: A Semi-supervised Multi-animal Pose Estimation Framework

no code implementations 14 Apr 2022 Ari Blau, Christoph Gebhardt, Andres Bendesky, Liam Paninski, Anqi Wu

Multi-animal pose estimation is essential for studying animals' social behaviors in neuroscience and neuroethology.

Animal Pose Estimation
