Search Results for author: Scott W. Linderman

Found 24 papers, 17 papers with code

Towards a theory of learning dynamics in deep state space models

no code implementations • 10 Jul 2024 • Jakub Smékal, Jimmy T. H. Smith, Michael Kleinman, Dan Biderman, Scott W. Linderman

State space models (SSMs) have shown remarkable empirical performance on many long sequence modeling tasks, but a theoretical understanding of these models is still lacking.

Switching Autoregressive Low-rank Tensor Models

1 code implementation • NeurIPS 2023 • Hyun Dong Lee, Andrew Warrington, Joshua I. Glaser, Scott W. Linderman

In contrast, SLDSs can capture long-range dependencies in a parameter-efficient way through Markovian latent dynamics, but they present an intractable likelihood and a challenging parameter estimation task.

Time Series Analysis
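The Markovian latent dynamics mentioned in the abstract can be illustrated with a minimal switching linear dynamical system (SLDS) generative sketch — a toy simulation, not the paper's implementation, with all parameter values made up:

```python
import numpy as np

rng = np.random.default_rng(0)

K, D, T = 2, 2, 100                       # discrete states, latent dim, time steps
P = np.array([[0.95, 0.05],               # Markov transition matrix over discrete states
              [0.10, 0.90]])
A = np.stack([0.99 * np.eye(D),           # one linear dynamics matrix per discrete state
              np.array([[0.9, -0.3],
                        [0.3,  0.9]])])
Q = 0.01 * np.eye(D)                      # latent noise covariance

z = np.zeros(T, dtype=int)                # discrete state sequence
x = np.zeros((T, D))                      # continuous latent trajectory
for t in range(1, T):
    z[t] = rng.choice(K, p=P[z[t - 1]])   # Markovian switch between dynamics regimes
    x[t] = A[z[t]] @ x[t - 1] + rng.multivariate_normal(np.zeros(D), Q)
```

Each discrete state indexes its own linear dynamics, which is how the model stays parameter-efficient while the Markov chain carries long-range temporal structure.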

Revisiting Structured Variational Autoencoders

no code implementations • 25 May 2023 • Yixiu Zhao, Scott W. Linderman

Here, we revisit SVAEs using modern machine learning tools and demonstrate their advantages over more general alternatives in terms of both accuracy and efficiency.

Simplified State Space Layers for Sequence Modeling

3 code implementations • 9 Aug 2022 • Jimmy T. H. Smith, Andrew Warrington, Scott W. Linderman

Models using structured state space sequence (S4) layers have achieved state-of-the-art performance on long-range sequence modeling tasks.

Computational Efficiency ListOps +4
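A structured state space layer applies a discretized linear SSM, x_k = Ā x_{k-1} + B̄ u_k, y_k = C x_k, along the sequence. Below is a minimal sequential reference sketch of that recurrence (toy dimensions and parameters of my own choosing; the paper's S5 layer computes this with a parallel scan, which this loop does not attempt):

```python
import numpy as np

def ssm_layer(u, A_bar, B_bar, C):
    """Run the discretized linear SSM: x_k = A_bar x_{k-1} + B_bar u_k, y_k = C x_k."""
    L, _ = u.shape
    x = np.zeros(A_bar.shape[0])
    ys = []
    for k in range(L):
        x = A_bar @ x + B_bar @ u[k]   # linear state update driven by the input
        ys.append(C @ x)               # linear readout at each step
    return np.stack(ys)

rng = np.random.default_rng(0)
N, H, L = 4, 3, 16                     # state dim, feature dim, sequence length
A_bar = 0.9 * np.eye(N)                # stable (diagonal) state matrix
B_bar = rng.normal(size=(N, H))
C = rng.normal(size=(H, N))
y = ssm_layer(rng.normal(size=(L, H)), A_bar, B_bar, C)   # (L, H) output sequence
```

Because the recurrence is linear in the state, the whole sequence can also be computed in O(log L) parallel depth with an associative scan — the key to these layers' long-range efficiency.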

Spatiotemporal Clustering with Neyman-Scott Processes via Connections to Bayesian Nonparametric Mixture Models

1 code implementation • 13 Jan 2022 • Yixin Wang, Anthony Degleris, Alex H. Williams, Scott W. Linderman

This construction is similar to Bayesian nonparametric mixture models like the Dirichlet process mixture model (DPMM) in that the number of latent events (i.e., clusters) is a random variable, but the point process formulation makes the NSP especially well suited to modeling spatiotemporal data.

Bayesian Inference Clustering +1
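The generative idea — a random number of latent events, each spawning a cluster of observed points — can be sketched with a tiny Neyman-Scott sampler on the unit square (an illustration only; the rates and spatial scale are arbitrary, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

lam_parents = 5.0      # expected number of latent events (clusters)
mean_children = 20.0   # expected observed points per latent event
sigma = 0.02           # spatial spread of each cluster

n_parents = rng.poisson(lam_parents)            # cluster count is itself random
parents = rng.uniform(0.0, 1.0, size=(n_parents, 2))
points = []
for c in parents:
    n_child = rng.poisson(mean_children)        # offspring count per latent event
    points.append(c + sigma * rng.normal(size=(n_child, 2)))
points = np.concatenate(points) if points else np.empty((0, 2))
```

As in a DPMM, the number of clusters is not fixed in advance — but here it comes from a Poisson process over latent events, which is what ties the model to spatiotemporal point data.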

Reverse engineering recurrent neural networks with Jacobian switching linear dynamical systems

1 code implementation • NeurIPS 2021 • Jimmy T. H. Smith, Scott W. Linderman, David Sussillo

The results are a trained SLDS variant that closely approximates the RNN, an auxiliary function that can produce a fixed point for each point in state space, and a trained nonlinear RNN whose dynamics have been regularized so that, where possible, its first-order terms perform the computation.

Time Series Analysis
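The "first-order terms" idea — locally approximating a nonlinear RNN by its Jacobian at a fixed point — can be sketched numerically. This is a hypothetical toy RNN with made-up weights, not the paper's code:

```python
import numpy as np

W = np.array([[0.5, -0.4],
              [0.4,  0.5]])            # toy recurrent weights (made up)

def rnn_step(x):
    return np.tanh(W @ x)              # simple autonomous nonlinear RNN

x_star = np.zeros(2)                   # x* = 0 is a fixed point: tanh(0) = 0

def jacobian(f, x, eps=1e-6):
    """Central finite-difference Jacobian of f at x."""
    n = x.size
    J = np.zeros((n, n))
    for i in range(n):
        e = np.zeros(n); e[i] = eps
        J[:, i] = (f(x + e) - f(x - e)) / (2 * eps)
    return J

J = jacobian(rnn_step, x_star)         # ≈ W here, since tanh'(0) = 1
x0 = np.array([0.01, -0.02])
linear_pred = x_star + J @ (x0 - x_star)   # first-order prediction of one step
assert np.allclose(linear_pred, rnn_step(x0), atol=1e-5)
```

Near the fixed point, the linearized dynamics reproduce the nonlinear step — the sense in which a switching collection of such linearizations can approximate the full RNN.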

Generalized Shape Metrics on Neural Representations

2 code implementations • NeurIPS 2021 • Alex H. Williams, Erin Kunz, Simon Kornblith, Scott W. Linderman

In doing so, we identify relationships between neural representations that are interpretable in terms of anatomical features and model performance.

Statistical Neuroscience in the Single Trial Limit

no code implementations • 8 Mar 2021 • Alex H. Williams, Scott W. Linderman

Individual neurons often produce highly variable responses over nominally identical trials, reflecting a mixture of intrinsic "noise" and systematic changes in the animal's cognitive and behavioral state.

Unifying and generalizing models of neural dynamics during decision-making

1 code implementation • 13 Jan 2020 • David M. Zoltowski, Jonathan W. Pillow, Scott W. Linderman

An open question in systems and computational neuroscience is how neural circuits accumulate evidence towards a decision.

Decision Making Open-Ended Question Answering
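A canonical instance of evidence accumulation is the drift–diffusion model, in which a decision variable integrates noisy evidence until it reaches a bound. The simulation below is illustrative only — arbitrary parameters, and not the unified model developed in the paper:

```python
import numpy as np

def simulate_ddm(drift=0.5, noise=1.0, bound=1.0, dt=0.01, max_t=10.0, seed=0):
    """Integrate noisy evidence until the decision variable hits +bound or -bound."""
    rng = np.random.default_rng(seed)
    x, t = 0.0, 0.0
    while abs(x) < bound and t < max_t:
        x += drift * dt + noise * np.sqrt(dt) * rng.normal()  # Euler–Maruyama step
        t += dt
    return np.sign(x), t               # choice (+1 / -1) and decision time

choice, rt = simulate_ddm()
```

Accumulator models like this, ramping models, and discrete-state models can all be viewed as special cases of latent dynamical systems — the perspective the paper uses to unify them.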

Poisson-Randomized Gamma Dynamical Systems

1 code implementation • NeurIPS 2019 • Aaron Schein, Scott W. Linderman, Mingyuan Zhou, David M. Blei, Hanna Wallach

This paper presents the Poisson-randomized gamma dynamical system (PRGDS), a model for sequentially observed count tensors that encodes a strong inductive bias toward sparsity and burstiness.

Inductive Bias

Dose-response modeling in high-throughput cancer drug screenings: An end-to-end approach

1 code implementation • 13 Dec 2018 • Wesley Tansey, Kathy Li, Haoran Zhang, Scott W. Linderman, Raul Rabadan, David M. Blei, Chris H. Wiggins

Personalized cancer treatments based on the molecular profile of a patient's tumor are an emerging and exciting class of treatments in oncology.

Reparameterizing the Birkhoff Polytope for Variational Permutation Inference

no code implementations • 26 Oct 2017 • Scott W. Linderman, Gonzalo E. Mena, Hal Cooper, Liam Paninski, John P. Cunningham

Many matching, tracking, sorting, and ranking problems require probabilistic reasoning about possible permutations, a set that grows factorially with dimension.

Bayesian Inference Combinatorial Optimization +1

Variational Sequential Monte Carlo

1 code implementation • 31 May 2017 • Christian A. Naesseth, Scott W. Linderman, Rajesh Ranganath, David M. Blei

The success of variational approaches depends on (i) formulating a flexible parametric family of distributions, and (ii) optimizing the parameters to find the member of this family that most closely approximates the exact posterior.

Bayesian Inference Variational Inference

Bayesian latent structure discovery from multi-neuron recordings

2 code implementations • NeurIPS 2016 • Scott W. Linderman, Ryan P. Adams, Jonathan W. Pillow

Neural circuits contain heterogeneous groups of neurons that differ in type, location, connectivity, and basic response properties.

Bayesian Inference Clustering +1

Recurrent switching linear dynamical systems

1 code implementation • 26 Oct 2016 • Scott W. Linderman, Andrew C. Miller, Ryan P. Adams, David M. Blei, Liam Paninski, Matthew J. Johnson

Many natural systems, such as neurons firing in the brain or basketball teams traversing a court, give rise to time series data with complex, nonlinear dynamics.

Bayesian Inference Time Series +1

Reparameterization Gradients through Acceptance-Rejection Sampling Algorithms

2 code implementations • 18 Oct 2016 • Christian A. Naesseth, Francisco J. R. Ruiz, Scott W. Linderman, David M. Blei

Variational inference using the reparameterization trick has enabled large-scale approximate Bayesian inference in complex probabilistic models, leveraging stochastic optimization to sidestep intractable expectations.

Bayesian Inference Stochastic Optimization +1
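The reparameterization trick rewrites a sample from q(z; θ) as a deterministic transform of parameter-free noise — for a Gaussian, z = μ + σε with ε ~ N(0, 1) — so the gradient of an expectation can be estimated by differentiating through the sample. The paper extends this to distributions drawn by acceptance–rejection (e.g. gammas); the sketch below shows only the basic Gaussian case:

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_mu_estimate(mu, sigma, n=100_000):
    """Reparameterization gradient of E_{z ~ N(mu, sigma^2)}[z**2] w.r.t. mu.

    With z = mu + sigma * eps and eps ~ N(0, 1), dz/dmu = 1, so
    d(z**2)/dmu = 2 * z; averaging that over noise samples gives the estimator.
    """
    eps = rng.normal(size=n)      # parameter-free base noise
    z = mu + sigma * eps          # reparameterized sample
    return np.mean(2 * z)

g = grad_mu_estimate(mu=1.5, sigma=0.3)
# Analytic check: E[z^2] = mu^2 + sigma^2, so the true gradient is 2 * mu = 3.0
```

Because the randomness is isolated in ε, this estimator typically has far lower variance than score-function (REINFORCE) gradients — the property the paper carries over to rejection-sampled distributions.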

Scalable Bayesian Inference for Excitatory Point Process Networks

1 code implementation • 12 Jul 2015 • Scott W. Linderman, Ryan P. Adams

We build on previous work that has taken a Bayesian approach to this problem, specifying prior distributions over the latent network structure and a likelihood of observed activity given this network.

Bayesian Inference Variational Inference

Dependent Multinomial Models Made Easy: Stick Breaking with the Pólya-Gamma Augmentation

1 code implementation • 18 Jun 2015 • Scott W. Linderman, Matthew J. Johnson, Ryan P. Adams

Many practical modeling problems involve discrete data that are best represented as draws from multinomial or categorical distributions.

Bayesian Inference Position
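The stick-breaking construction in the title maps K−1 real-valued logits to a point on the K-simplex: each coordinate is a logistic function of one logit times the stick mass that remains, which is the logistic form that admits the Pólya-Gamma augmentation. A minimal sketch of the transform:

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def stick_breaking(psi):
    """Map K-1 logits psi to a K-dim probability vector via stick breaking."""
    K = psi.size + 1
    pi = np.zeros(K)
    stick = 1.0
    for k in range(K - 1):
        pi[k] = sigmoid(psi[k]) * stick   # take a logistic fraction of the remaining stick
        stick -= pi[k]
    pi[-1] = stick                        # leftover mass goes to the last category
    return pi

pi = stick_breaking(np.array([0.0, 0.0, 0.0]))
# psi = 0 takes half of the remaining stick each time: [0.5, 0.25, 0.125, 0.125]
```

Writing the multinomial this way turns each coordinate into a logistic-Bernoulli term, so Gaussian priors on the logits become conditionally conjugate under Pólya-Gamma augmentation.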

Discovering Latent Network Structure in Point Process Data

no code implementations • 4 Feb 2014 • Scott W. Linderman, Ryan P. Adams

Networks play a central role in modern data analysis, enabling us to reason about systems by studying the relationships between their parts.

Point Processes
