Search Results for author: Scott Linderman

Found 13 papers, 6 papers with code

Inferring Inference

1 code implementation · 4 Oct 2023 · Rajkumar Vasudeva Raju, Zhe Li, Scott Linderman, Xaq Pitkow

Given a time series of neural activity during a perceptual inference task, our framework finds (i) the neural representation of relevant latent variables, (ii) interactions between these variables that define the brain's internal model of the world, and (iii) message-functions specifying the inference algorithm.

Experimental Design

SIXO: Smoothing Inference with Twisted Objectives

1 code implementation · 13 Jun 2022 · Dieterich Lawson, Allan Raventós, Andrew Warrington, Scott Linderman

Sequential Monte Carlo (SMC) is an inference algorithm for state space models that approximates the posterior by sampling from a sequence of target distributions.

Density Ratio Estimation
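
The SIXO abstract describes SMC as sampling from a sequence of target distributions. As a purely illustrative sketch (not the paper's twisted-objective smoothing method), a minimal bootstrap particle filter for a hypothetical 1-D linear-Gaussian state space model might look like:

```python
import numpy as np

def bootstrap_particle_filter(ys, num_particles=1000, a=0.9, q=1.0, r=0.5, seed=0):
    """Bootstrap SMC for x_t = a*x_{t-1} + N(0, q), y_t = x_t + N(0, r).

    Illustrative model parameters (a, q, r) are assumptions, not from the paper.
    Returns filtered posterior means and a log marginal-likelihood estimate.
    """
    rng = np.random.default_rng(seed)
    particles = rng.normal(0.0, 1.0, num_particles)  # draw from a standard normal prior
    log_evidence = 0.0
    means = []
    for y in ys:
        # propagate particles through the transition (proposal = prior dynamics)
        particles = a * particles + rng.normal(0.0, np.sqrt(q), num_particles)
        # weight each particle by the Gaussian observation likelihood
        log_w = -0.5 * (y - particles) ** 2 / r - 0.5 * np.log(2 * np.pi * r)
        m = log_w.max()
        w = np.exp(log_w - m)
        log_evidence += m + np.log(w.mean())  # accumulate the evidence estimate
        w /= w.sum()
        means.append(float(w @ particles))
        # multinomial resampling defines the next target's particle set
        idx = rng.choice(num_particles, size=num_particles, p=w)
        particles = particles[idx]
    return np.array(means), log_evidence
```

The bootstrap filter only targets the filtering distributions; SIXO's contribution is precisely that twisting the targets lets SMC approximate the smoothing distributions instead.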

Streaming Inference for Infinite Non-Stationary Clustering

no code implementations · 2 May 2022 · Rylan Schaeffer, Gabrielle Kaili-May Liu, Yilun Du, Scott Linderman, Ila Rani Fiete

Learning from a continuous stream of non-stationary data in an unsupervised manner is arguably one of the most common and most challenging settings facing intelligent agents.

Clustering · Variational Inference

Bayesian recurrent state space model for rs-fMRI

no code implementations · 14 Nov 2020 · Arunesh Mittal, Scott Linderman, John Paisley, Paul Sajda

We evaluate our method on the ADNI2 dataset by inferring latent state patterns corresponding to altered neural circuits in individuals with Mild Cognitive Impairment (MCI).

Mutually Regressive Point Processes

1 code implementation · NeurIPS 2019 · Ifigeneia Apostolopoulou, Scott Linderman, Kyle Miller, Artur Dubrawski

Despite many potential applications, existing point process models are limited in their ability to capture complex patterns of interaction.

Bayesian Inference · Point Processes

Scalable Bayesian inference of dendritic voltage via spatiotemporal recurrent state space models

no code implementations · NeurIPS 2019 · Ruoxi Sun, Ian Kinsella, Scott Linderman, Liam Paninski

However, current sensors and imaging approaches still face significant limitations in SNR and sampling frequency; therefore statistical denoising and interpolation methods remain critical for understanding single-trial spatiotemporal dendritic voltage dynamics.

Bayesian Inference · Denoising

Point process latent variable models of larval zebrafish behavior

no code implementations · NeurIPS 2018 · Anuj Sharma, Robert Johnson, Florian Engert, Scott Linderman

However, these sequences of swim bouts belie a set of discrete and continuous internal states, latent variables that are not captured by standard point process models.

Variational Inference

Learning Latent Permutations with Gumbel-Sinkhorn Networks

2 code implementations · ICLR 2018 · Gonzalo Mena, David Belanger, Scott Linderman, Jasper Snoek

Permutations and matchings are core building blocks in a variety of latent variable models, as they allow us to align, canonicalize, and sort data.
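
The Sinkhorn operator in the title refers to alternating row and column normalization, which maps a square matrix toward a doubly stochastic one (a relaxation of a permutation matrix). A minimal log-space sketch, assuming the standard operator rather than the paper's full Gumbel-Sinkhorn network:

```python
import numpy as np

def sinkhorn(log_alpha, n_iters=100):
    """Apply the Sinkhorn operator to a square matrix of log-potentials.

    Alternately normalizes rows and columns in log space for numerical
    stability; the result approaches a doubly stochastic matrix.
    """
    for _ in range(n_iters):
        # subtract log-sum-exp over columns so each row sums to 1
        log_alpha = log_alpha - np.logaddexp.reduce(log_alpha, axis=1, keepdims=True)
        # subtract log-sum-exp over rows so each column sums to 1
        log_alpha = log_alpha - np.logaddexp.reduce(log_alpha, axis=0, keepdims=True)
    return np.exp(log_alpha)
```

Dividing the log-potentials by a temperature before normalizing pushes the output toward a hard permutation matrix, which is the relaxation the paper exploits for gradient-based learning.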

Quantifying the behavioral dynamics of C. elegans with autoregressive hidden Markov models

1 code implementation · 1 Dec 2017 · E. Kelly Buchanan, Akiva Lipshitz, Scott Linderman, Liam Paninski

In order to fully understand the neural activity of Caenorhabditis elegans, we need a rich, quantitative description of the behavioral outputs it gives rise to.

Dependent Multinomial Models Made Easy: Stick-Breaking with the Polya-gamma Augmentation

no code implementations · NeurIPS 2015 · Scott Linderman, Matthew Johnson, Ryan P. Adams

For example, nucleotides in a DNA sequence, children's names in a given state and year, and text documents are all commonly modeled with multinomial distributions.

Bayesian Inference
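
The stick-breaking construction in the title above maps K-1 real numbers to a point on the K-simplex, which is what lets the Pólya-gamma augmentation apply to multinomial models. A minimal sketch of the logistic stick-breaking transform alone (the augmentation itself is not shown, and this helper is illustrative rather than the paper's code):

```python
import numpy as np

def stick_breaking(psi):
    """Map K-1 unconstrained reals to a length-K probability vector.

    Each sigmoid(psi_k) gives the fraction of the remaining "stick"
    assigned to outcome k; the leftover stick is the final probability.
    """
    beta = 1.0 / (1.0 + np.exp(-psi))  # sigmoid of each real-valued coordinate
    pi = []
    remaining = 1.0
    for b in beta:
        pi.append(b * remaining)       # break off a fraction of the stick
        remaining *= (1.0 - b)         # shrink what remains
    pi.append(remaining)               # last outcome gets the leftover mass
    return np.array(pi)
```

Because each coordinate enters through a logistic function, a Pólya-gamma auxiliary variable per coordinate renders the conditional posterior conjugate, which is the paper's key trick.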
