no code implementations • 10 Jul 2024 • Jakub Smékal, Jimmy T. H. Smith, Michael Kleinman, Dan Biderman, Scott W. Linderman

State space models (SSMs) have shown remarkable empirical performance on many long sequence modeling tasks, but a theoretical understanding of these models is still lacking.

1 code implementation • NeurIPS 2023 • Hyun Dong Lee, Andrew Warrington, Joshua I. Glaser, Scott W. Linderman

In contrast, SLDSs can capture long-range dependencies in a parameter-efficient way through Markovian latent dynamics, but they present an intractable likelihood and a challenging parameter estimation task.

no code implementations • 25 May 2023 • Yixiu Zhao, Scott W. Linderman

Here, we revisit SVAEs using modern machine learning tools and demonstrate their advantages over more general alternatives in terms of both accuracy and efficiency.

3 code implementations • 9 Aug 2022 • Jimmy T. H. Smith, Andrew Warrington, Scott W. Linderman

Models using structured state space sequence (S4) layers have achieved state-of-the-art performance on long-range sequence modeling tasks.
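At the core of an S4-style layer is a discretized linear state space recurrence. A minimal sketch of that recurrence (the function name and the scalar-input convention are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def ssm_scan(A, B, C, u):
    """Run a discretized linear SSM: x_{k+1} = A x_k + B u_k, y_k = C x_k.

    A: (n, n) state matrix, B: (n,) input vector, C: (n,) output vector,
    u: sequence of scalar inputs. Returns the sequence of scalar outputs.
    """
    n = A.shape[0]
    x = np.zeros(n)
    ys = []
    for u_k in u:
        x = A @ x + B * u_k   # linear state update driven by the input
        ys.append(C @ x)      # linear readout of the state
    return np.array(ys)
```

In practice S4/S5 layers compute this recurrence in parallel (via convolution or a parallel scan) rather than with a Python loop, but the underlying map from inputs to outputs is the same.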

Ranked #3 on long-range modeling on the Long Range Arena (LRA) benchmark.

1 code implementation • 13 Jan 2022 • Yixin Wang, Anthony Degleris, Alex H. Williams, Scott W. Linderman

This construction is similar to Bayesian nonparametric mixture models such as the Dirichlet process mixture model (DPMM) in that the number of latent events (i.e., clusters) is a random variable, but the point process formulation makes the NSP especially well suited to modeling spatiotemporal data.
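A Neyman-Scott process can be sampled generatively: draw latent events from a Poisson process, then scatter observed points around each event. A minimal 1-D sketch (the function name, rates, and Gaussian jitter kernel are illustrative assumptions):

```python
import numpy as np

def sample_nsp_1d(rate_events, mean_pts, sigma, T, rng):
    """Sample a 1-D Neyman-Scott process on [0, T].

    Latent events follow a homogeneous Poisson process with intensity
    rate_events; each event spawns Poisson(mean_pts) observed points,
    Gaussian-jittered with std sigma around the event time.
    """
    n_events = rng.poisson(rate_events * T)       # random number of latent events (clusters)
    events = rng.uniform(0.0, T, size=n_events)   # latent event times
    points = []
    for e in events:
        n_pts = rng.poisson(mean_pts)             # points induced by this event
        points.extend(e + sigma * rng.standard_normal(n_pts))
    return events, np.sort(points)
```

The key contrast with a DPMM is visible here: the number of clusters is itself a Poisson draw, tied to the volume of the observation window.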

1 code implementation • NeurIPS 2021 • Jimmy T. H. Smith, Scott W. Linderman, David Sussillo

The results are a trained SLDS variant that closely approximates the RNN, an auxiliary function that can produce a fixed point for each point in state space, and a trained nonlinear RNN whose dynamics have been regularized such that its first-order terms perform the computation, if possible.

2 code implementations • NeurIPS 2021 • Alex H. Williams, Erin Kunz, Simon Kornblith, Scott W. Linderman

In doing so, we identify relationships between neural representations that are interpretable in terms of anatomical features and model performance.

no code implementations • 8 Mar 2021 • Alex H. Williams, Scott W. Linderman

Individual neurons often produce highly variable responses over nominally identical trials, reflecting a mixture of intrinsic "noise" and systematic changes in the animal's cognitive and behavioral state.

1 code implementation • 20 Jan 2021 • Xinwei Yu, Matthew S. Creamer, Francesco Randi, Anuj K. Sharma, Scott W. Linderman, Andrew M. Leifer

The same pre-trained model both tracks neurons across time and identifies corresponding neurons across individuals.

1 code implementation • NeurIPS 2020 • Alex H. Williams, Anthony Degleris, Yixin Wang, Scott W. Linderman

Sparse sequences of neural spikes are posited to underlie aspects of working memory, motor production, and learning.

1 code implementation • 13 Jan 2020 • David M. Zoltowski, Jonathan W. Pillow, Scott W. Linderman

An open question in systems and computational neuroscience is how neural circuits accumulate evidence towards a decision.
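A classic phenomenological model of evidence accumulation is the drift-diffusion model, in which a decision variable integrates noisy evidence until it reaches a bound. A minimal simulation sketch (parameter values and the function name are illustrative assumptions):

```python
import numpy as np

def drift_diffusion(drift, noise, bound, dt=0.001, max_t=5.0, seed=0):
    """Simulate a 1-D drift-diffusion accumulator until it hits +/- bound.

    Returns (choice, decision_time); choice is +1 or -1 for the bound hit,
    or 0 if no bound is reached before max_t.
    """
    rng = np.random.default_rng(seed)
    x, t = 0.0, 0.0
    while t < max_t:
        # Euler-Maruyama step: deterministic drift plus diffusion noise
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
        if abs(x) >= bound:
            return int(np.sign(x)), t
    return 0, t
```

State space approaches like the one above generalize this picture by inferring the latent accumulation dynamics directly from neural recordings rather than assuming them.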

1 code implementation • NeurIPS 2019 • Aaron Schein, Scott W. Linderman, Mingyuan Zhou, David M. Blei, Hanna Wallach

This paper presents the Poisson-randomized gamma dynamical system (PRGDS), a model for sequentially observed count tensors that encodes a strong inductive bias toward sparsity and burstiness.

1 code implementation • 13 Dec 2018 • Wesley Tansey, Kathy Li, Haoran Zhang, Scott W. Linderman, Raul Rabadan, David M. Blei, Chris H. Wiggins

Personalized cancer treatments based on the molecular profile of a patient's tumor are an emerging and exciting class of treatments in oncology.


1 code implementation • ICLR 2019 • Josue Nassar, Scott W. Linderman, Monica Bugallo, Il Memming Park

Many real-world systems are governed by complex, nonlinear dynamics.

no code implementations • 26 Oct 2017 • Scott W. Linderman, Gonzalo E. Mena, Hal Cooper, Liam Paninski, John P. Cunningham

Many matching, tracking, sorting, and ranking problems require probabilistic reasoning about possible permutations, a set that grows factorially with dimension.
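One common way to make inference over permutations tractable is to relax them to doubly stochastic matrices (the Birkhoff polytope) via Sinkhorn normalization, which alternately rescales rows and columns of a positive matrix. A sketch of that normalization (a standalone illustration, not the paper's full method):

```python
import numpy as np

def sinkhorn(logits, n_iters=200):
    """Map a real matrix toward the Birkhoff polytope (doubly stochastic
    matrices) by alternately normalizing rows and columns of exp(logits)."""
    M = np.exp(logits - logits.max())  # subtract max for numerical stability
    for _ in range(n_iters):
        M = M / M.sum(axis=1, keepdims=True)  # rows sum to 1
        M = M / M.sum(axis=0, keepdims=True)  # columns sum to 1
    return M
```

The resulting matrix is a continuous relaxation of a permutation, which is what makes gradient-based variational inference over the factorially large set of permutations feasible.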

1 code implementation • 31 May 2017 • Christian A. Naesseth, Scott W. Linderman, Rajesh Ranganath, David M. Blei

The success of variational approaches depends on (i) formulating a flexible parametric family of distributions, and (ii) optimizing the parameters to find the member of this family that most closely approximates the exact posterior.

2 code implementations • NeurIPS 2016 • Scott W. Linderman, Ryan P. Adams, Jonathan W. Pillow

Neural circuits contain heterogeneous groups of neurons that differ in type, location, connectivity, and basic response properties.

1 code implementation • 26 Oct 2016 • Scott W. Linderman, Andrew C. Miller, Ryan P. Adams, David M. Blei, Liam Paninski, Matthew J. Johnson

Many natural systems, such as neurons firing in the brain or basketball teams traversing a court, give rise to time series data with complex, nonlinear dynamics.

2 code implementations • 18 Oct 2016 • Christian A. Naesseth, Francisco J. R. Ruiz, Scott W. Linderman, David M. Blei

Variational inference using the reparameterization trick has enabled large-scale approximate Bayesian inference in complex probabilistic models, leveraging stochastic optimization to sidestep intractable expectations.
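The reparameterization trick rewrites a sample z ~ N(mu, sigma^2) as z = mu + sigma * eps with eps ~ N(0, 1), so Monte Carlo estimates of E[f(z)] become differentiable in mu and sigma. A NumPy sketch for f(z) = z^2, where the true gradients are known in closed form (the function name is an illustrative assumption):

```python
import numpy as np

def grad_expected_square(mu, sigma, n=100_000, seed=0):
    """Reparameterization-trick gradient estimate of E[z^2], z ~ N(mu, sigma^2).

    With z = mu + sigma * eps: d(z^2)/dmu = 2z and d(z^2)/dsigma = 2z * eps.
    Since E[z^2] = mu^2 + sigma^2, the true gradients are (2*mu, 2*sigma).
    """
    eps = np.random.default_rng(seed).standard_normal(n)
    z = mu + sigma * eps                     # reparameterized samples
    return (2 * z).mean(), (2 * z * eps).mean()
```

The paper above is about extending this idea beyond distributions like the Gaussian that admit such a simple location-scale form.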

1 code implementation • 12 Jul 2015 • Scott W. Linderman, Ryan P. Adams

We build on previous work that has taken a Bayesian approach to this problem, specifying prior distributions over the latent network structure and a likelihood of observed activity given this network.

1 code implementation • 18 Jun 2015 • Scott W. Linderman, Matthew J. Johnson, Ryan P. Adams

Many practical modeling problems involve discrete data that are best represented as draws from multinomial or categorical distributions.
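A stick-breaking transform is one way to link multinomial probabilities to unconstrained real parameters: K-1 real values are squashed through sigmoids and used to break off successive pieces of a unit-length "stick". A minimal sketch of the transform:

```python
import numpy as np

def stick_breaking(psi):
    """Map K-1 real values to a point on the K-simplex:
    pi_k = sigmoid(psi_k) * prod_{j<k} (1 - sigmoid(psi_j))."""
    v = 1.0 / (1.0 + np.exp(-np.asarray(psi, dtype=float)))  # sigmoid of each psi
    remaining = 1.0
    pi = []
    for v_k in v:
        pi.append(v_k * remaining)   # break off a fraction v_k of what remains
        remaining *= 1.0 - v_k
    pi.append(remaining)             # the last piece takes what is left
    return np.array(pi)
```

Because each coordinate enters through a logistic function, this representation pairs naturally with Pólya-gamma augmentation for conditionally conjugate inference in dependent multinomial models.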

no code implementations • 27 Nov 2014 • Scott W. Linderman, Matthew J. Johnson, Matthew A. Wilson, Zhe Chen

Rodent hippocampal population codes represent important spatial information about the environment during navigation.

no code implementations • NeurIPS 2014 • Scott W. Linderman, Christopher H. Stock, Ryan P. Adams

Learning and memory in the brain are implemented by complex, time-varying changes in neural circuitry.

no code implementations • 4 Feb 2014 • Scott W. Linderman, Ryan P. Adams

Networks play a central role in modern data analysis, enabling us to reason about systems by studying the relationships between their parts.

Papers With Code is a free resource with all data licensed under CC-BY-SA.