Search Results for author: John P. Cunningham

Found 46 papers, 20 papers with code

Posterior and Computational Uncertainty in Gaussian Processes

1 code implementation 30 May 2022 Jonathan Wenger, Geoff Pleiss, Marvin Pförtner, Philipp Hennig, John P. Cunningham

For any method in this class, we prove (i) convergence of its posterior mean in the associated RKHS, (ii) decomposability of its combined posterior covariance into mathematical and computational covariances, and (iii) that the combined variance is a tight worst-case bound for the squared error between the method's posterior mean and the latent function.

Gaussian Processes
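
The decomposition in (ii) can be sketched as follows, in assumed notation (my reconstruction, not necessarily the paper's): k is the prior kernel, K = k(X, X) the kernel matrix at the training inputs X, and C_i the approximation to K^{-1} built by the iterative method after i steps:

```latex
\underbrace{k(x,x') - k(x,X)\,C_i\,k(X,x')}_{\text{combined covariance}}
= \underbrace{k(x,x') - k(x,X)\,K^{-1}k(X,x')}_{\text{mathematical}}
\;+\; \underbrace{k(x,X)\bigl(K^{-1} - C_i\bigr)k(X,x')}_{\text{computational}}
```

The two terms on the right sum to the left-hand side, since the K^{-1} terms cancel.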

Data Augmentation for Compositional Data: Advancing Predictive Models of the Microbiome

1 code implementation 20 May 2022 Elliott Gordon-Rodriguez, Thomas P. Quinn, John P. Cunningham

Our work extends the success of data augmentation to compositional data, i.e., simplex-valued data, which is of particular interest in the context of the human microbiome.

Contrastive Learning, Data Augmentation, +3
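
One generic way to augment simplex-valued data is to interpolate two compositions geometrically and renormalize; this is an illustrative sketch of the idea, not necessarily the augmentation scheme the paper proposes:

```python
import numpy as np

def simplex_mixup(x1, x2, lam):
    # Geometric interpolation of two compositions, renormalized back
    # onto the simplex (illustrative; not necessarily the paper's method).
    m = x1**lam * x2**(1 - lam)
    return m / m.sum()
```

With lam = 1 the output is x1 itself; intermediate values of lam produce compositions "between" the two inputs.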

On the Normalizing Constant of the Continuous Categorical Distribution

1 code implementation 28 Apr 2022 Elliott Gordon-Rodriguez, Gabriel Loaiza-Ganem, Andres Potapczynski, John P. Cunningham

This family enjoys remarkable mathematical simplicity; its density function resembles that of the Dirichlet distribution, but with a normalizing constant that can be written in closed form using elementary functions only.
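
For context, the continuous categorical density has the exponential-family form below (a sketch from the abstract's description; Δ^{K-1} denotes the probability simplex):

```latex
p(x; \lambda) \;\propto\; \prod_{i=1}^{K} \lambda_i^{x_i}
= \exp\Bigl(\sum_{i=1}^{K} x_i \log \lambda_i\Bigr),
\qquad x \in \Delta^{K-1},
```

with the normalizing constant, per the abstract, expressible in closed form using elementary functions only.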

Deep Ensembles Work, But Are They Necessary?

no code implementations 14 Feb 2022 Taiga Abe, E. Kelly Buchanan, Geoff Pleiss, Richard Zemel, John P. Cunningham

First, we show that ensemble diversity, by any metric, does not meaningfully contribute to an ensemble's ability to detect out-of-distribution (OOD) data, and that one can estimate ensemble diversity by measuring the relative improvement of a single larger model.

Scaling Structured Inference with Randomization

no code implementations 7 Dec 2021 Yao Fu, John P. Cunningham, Mirella Lapata

Here, we propose a family of randomized dynamic programming (RDP) algorithms for scaling structured models to tens of thousands of latent states.

Posterior Collapse and Latent Variable Non-identifiability

no code implementations NeurIPS 2021 Yixin Wang, David Blei, John P. Cunningham

Existing approaches to posterior collapse often attribute it to the use of neural networks or optimization issues due to variational approximation.

Preconditioning for Scalable Gaussian Process Hyperparameter Optimization

no code implementations 1 Jul 2021 Jonathan Wenger, Geoff Pleiss, Philipp Hennig, John P. Cunningham, Jacob R. Gardner

While preconditioning is well understood in the context of CG, we demonstrate that it can also accelerate convergence and reduce variance of the estimates for the log-determinant and its derivative.

Gaussian Processes, Hyperparameter Optimization
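
Stochastic estimates of the log-determinant, as mentioned above, typically build on Hutchinson-style trace estimation, since log det K = tr(log K). A minimal sketch of the trace estimator itself, using only matrix-vector products (illustrative; not the paper's implementation):

```python
import numpy as np

def hutchinson_trace(matvec, n, n_probes=1000, seed=0):
    # Estimate tr(A) as the average of z^T (A z) over Rademacher probes z,
    # touching A only through matrix-vector products.
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_probes):
        z = rng.choice([-1.0, 1.0], size=n)
        total += z @ matvec(z)
    return total / n_probes
```

For a diagonal matrix the estimator is exact for every probe; in general its variance depends on the off-diagonal mass of A.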

The Limitations of Large Width in Neural Networks: A Deep Gaussian Process Perspective

1 code implementation NeurIPS 2021 Geoff Pleiss, John P. Cunningham

Our analysis in this paper decouples capacity and width via the generalization of neural networks to Deep Gaussian Processes (Deep GP), a class of nonparametric hierarchical models that subsume neural nets.

Gaussian Processes, L2 Regularization

Rectangular Flows for Manifold Learning

1 code implementation NeurIPS 2021 Anthony L. Caterini, Gabriel Loaiza-Ganem, Geoff Pleiss, John P. Cunningham

Normalizing flows are invertible neural networks with tractable change-of-volume terms, which allow optimization of their parameters to be efficiently performed via maximum likelihood.

Density Estimation, Out-of-Distribution Detection
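
The change-of-variables objective described above, in its simplest possible instance: a one-dimensional affine flow with a standard normal base distribution (an illustrative sketch, not the paper's rectangular flow):

```python
import numpy as np

def affine_flow_logpdf(x, a, b):
    # Flow x = a * z + b with base density N(0, 1). Change of variables:
    #   log p(x) = log N(z; 0, 1) - log|a|,  where z = (x - b) / a.
    z = (x - b) / a
    return -0.5 * (z**2 + np.log(2.0 * np.pi)) - np.log(abs(a))
```

Maximum-likelihood training of a flow amounts to maximizing this quantity (summed over data) with respect to the flow parameters, here a and b.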

Simulating time to event prediction with spatiotemporal echocardiography deep learning

no code implementations 3 Mar 2021 Rohan Shad, Nicolas Quach, Robyn Fong, Patpilai Kasinpila, Cayley Bowles, Kate M. Callon, Michelle C. Li, Jeffrey Teuteberg, John P. Cunningham, Curtis P. Langlotz, William Hiesinger

Integrating methods for time-to-event prediction with diagnostic imaging modalities is of considerable interest, as accurate estimates of survival require accounting for censoring of individuals within the observation period.

Time-to-Event Prediction

Medical Imaging and Machine Learning

no code implementations 2 Mar 2021 Rohan Shad, John P. Cunningham, Euan A. Ashley, Curtis P. Langlotz, William Hiesinger

Advances in computing power, deep learning architectures, and expert labelled datasets have spurred the development of medical imaging artificial intelligence systems that rival clinical experts in a variety of scenarios.

Bias-Free Scalable Gaussian Processes via Randomized Truncations

1 code implementation 12 Feb 2021 Andres Potapczynski, Luhuan Wu, Dan Biderman, Geoff Pleiss, John P. Cunningham

In the case of RFF, we show that the bias-to-variance conversion is indeed a trade-off: the additional variance proves detrimental to optimization.

Gaussian Processes

Linear-time inference for Gaussian Processes on one dimension

no code implementations 11 Mar 2020 Jackson Loper, David Blei, John P. Cunningham, Liam Paninski

Gaussian Processes (GPs) provide powerful probabilistic frameworks for interpolation, forecasting, and smoothing, but have been hampered by computational scaling issues.

Gaussian Processes, Time Series

The continuous categorical: a novel simplex-valued exponential family

2 code implementations ICML 2020 Elliott Gordon-Rodriguez, Gabriel Loaiza-Ganem, John P. Cunningham

Simplex-valued data appear throughout statistics and machine learning, for example in the context of transfer learning and compression of deep networks.

Neural Network Compression, Transfer Learning

Paraphrase Generation with Latent Bag of Words

2 code implementations NeurIPS 2019 Yao Fu, Yansong Feng, John P. Cunningham

Inspired by variational autoencoders with discrete latent structures, in this work, we propose a latent bag of words (BOW) model for paraphrase generation.

Natural Language Processing, Paraphrase Generation, +1

Invertible Gaussian Reparameterization: Revisiting the Gumbel-Softmax

1 code implementation NeurIPS 2020 Andres Potapczynski, Gabriel Loaiza-Ganem, John P. Cunningham

The Gumbel-Softmax is a continuous distribution over the simplex that is often used as a relaxation of discrete distributions.
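
Sampling from the Gumbel-Softmax is standard: perturb the logits with Gumbel(0, 1) noise and apply a tempered softmax. A minimal NumPy sketch:

```python
import numpy as np

def gumbel_softmax_sample(logits, tau=1.0, seed=0):
    # Gumbel(0, 1) noise via the inverse-CDF trick: g = -log(-log(U)).
    rng = np.random.default_rng(seed)
    u = rng.uniform(size=logits.shape)
    g = -np.log(-np.log(u))
    # Tempered softmax over the perturbed logits.
    y = (logits + g) / tau
    e = np.exp(y - y.max())
    return e / e.sum()  # a point on the simplex
```

As the temperature tau approaches 0, samples concentrate on the vertices of the simplex, recovering discrete categorical samples.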

The continuous Bernoulli: fixing a pervasive error in variational autoencoders

2 code implementations NeurIPS 2019 Gabriel Loaiza-Ganem, John P. Cunningham

Variational autoencoders (VAE) have quickly become a central tool in machine learning, applicable to a broad range of data types and latent variable models.
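
For reference, the continuous Bernoulli density on [0, 1] and its closed-form normalizing constant, sketched from the paper (worth verifying against the published version):

```latex
p(x \mid \lambda) = C(\lambda)\,\lambda^{x}(1-\lambda)^{1-x}, \quad x \in [0,1],
\qquad
C(\lambda) =
\begin{cases}
\dfrac{2\tanh^{-1}(1-2\lambda)}{1-2\lambda} & \lambda \neq \tfrac{1}{2},\\[1ex]
2 & \lambda = \tfrac{1}{2}.
\end{cases}
```

Omitting C(λ), as a Bernoulli likelihood applied to [0, 1]-valued data implicitly does, is exactly the pervasive error the title refers to.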

Deep Random Splines for Point Process Intensity Estimation

no code implementations ICLR Workshop DeepGenStruct 2019 Gabriel Loaiza-Ganem, John P. Cunningham

Gaussian processes are the leading class of distributions on random functions, but they suffer from well known issues including difficulty scaling and inflexibility with respect to certain shape constraints (such as nonnegativity).

Gaussian Processes Point Processes

Approximating exponential family models (not single distributions) with a two-network architecture

no code implementations 18 Mar 2019 Sean R. Bittner, John P. Cunningham

Recently much attention has been paid to deep generative models, since they have been used to great success for variational inference, generation of complex data types, and more.

General Classification, Variational Inference

Deep Random Splines for Point Process Intensity Estimation of Neural Population Data

2 code implementations NeurIPS 2019 Gabriel Loaiza-Ganem, Sean M. Perkins, Karen E. Schroeder, Mark M. Churchland, John P. Cunningham

Gaussian processes are the leading class of distributions on random functions, but they suffer from well known issues including difficulty scaling and inflexibility with respect to certain shape constraints (such as nonnegativity).

Dimensionality Reduction, Gaussian Processes, +1

A Probabilistic Model of Cardiac Physiology and Electrocardiograms

no code implementations 1 Dec 2018 Andrew C. Miller, Ziad Obermeyer, David M. Blei, John P. Cunningham, Sendhil Mullainathan

An electrocardiogram (EKG) is a common, non-invasive test that measures the electrical activity of a patient's heart.

Calibrating Deep Convolutional Gaussian Processes

1 code implementation 26 May 2018 Gia-Lac Tran, Edwin V. Bonilla, John P. Cunningham, Pietro Michiardi, Maurizio Filippone

The wide adoption of Convolutional Neural Networks (CNNs) in applications where decision-making under uncertainty is fundamental has brought a great deal of attention to the ability of these models to accurately quantify the uncertainty in their predictions.

Decision Making, Decision Making Under Uncertainty, +3

Bayesian estimation for large scale multivariate Ornstein-Uhlenbeck model of brain connectivity

no code implementations 25 May 2018 Andrea Insabato, John P. Cunningham, Matthieu Gilson

Estimation of reliable whole-brain connectivity is a crucial step towards the use of connectivity information in quantitative approaches to the study of neuropsychiatric disorders.

Reparameterizing the Birkhoff Polytope for Variational Permutation Inference

no code implementations 26 Oct 2017 Scott W. Linderman, Gonzalo E. Mena, Hal Cooper, Liam Paninski, John P. Cunningham

Many matching, tracking, sorting, and ranking problems require probabilistic reasoning about possible permutations, a set that grows factorially with dimension.

Bayesian Inference, Combinatorial Optimization, +1
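
A common way to produce points in the Birkhoff polytope (the set of doubly stochastic matrices, whose vertices are the permutation matrices) is Sinkhorn normalization; this sketch illustrates the relaxation idea, not the paper's specific reparameterization:

```python
import numpy as np

def _logsumexp(a, axis):
    m = a.max(axis=axis, keepdims=True)
    return m + np.log(np.exp(a - m).sum(axis=axis, keepdims=True))

def sinkhorn(log_alpha, n_iters=100):
    # Alternately normalize rows and columns in log space; the iterates
    # converge toward a doubly stochastic matrix, i.e. a point in the
    # Birkhoff polytope.
    for _ in range(n_iters):
        log_alpha = log_alpha - _logsumexp(log_alpha, axis=1)
        log_alpha = log_alpha - _logsumexp(log_alpha, axis=0)
    return np.exp(log_alpha)
```

Scaling the input logits up (a low "temperature") pushes the result toward a hard permutation matrix.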

Maximum Entropy Flow Networks

no code implementations 12 Jan 2017 Gabriel Loaiza-Ganem, Yuanjun Gao, John P. Cunningham

Maximum entropy modeling is a flexible and popular framework for formulating statistical models given partial knowledge.

Computer Vision, Stochastic Optimization
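
The classical maximum entropy problem the abstract refers to: choose the highest-entropy distribution consistent with known expectations, whose solution is an exponential family (standard material, not specific to this paper):

```latex
\max_{p} \; H(p) = -\int p(x)\log p(x)\,dx
\quad \text{s.t.} \quad \mathbb{E}_{p}[T(x)] = \mu
\qquad \Longrightarrow \qquad
p^{\star}(x) \propto \exp\bigl(\eta^{\top} T(x)\bigr),
```

where the natural parameters η are Lagrange multipliers chosen so the moment constraints hold.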

Linear dynamical neural population models through nonlinear embeddings

1 code implementation NeurIPS 2016 Yuanjun Gao, Evan Archer, Liam Paninski, John P. Cunningham

A body of recent work in modeling neural activity focuses on recovering low-dimensional latent features that capture the statistical structure of large-scale neural populations.

Variational Inference

Bayesian Learning of Kernel Embeddings

no code implementations 7 Mar 2016 Seth Flaxman, Dino Sejdinovic, John P. Cunningham, Sarah Filippi

The posterior mean of our model is closely related to recently proposed shrinkage estimators for kernel mean embeddings, while the posterior uncertainty is a new, interesting feature with various possible applications.

Bayesian Inference

Preconditioning Kernel Matrices

1 code implementation 22 Feb 2016 Kurt Cutajar, Michael A. Osborne, John P. Cunningham, Maurizio Filippone

Preconditioning is a common approach to alleviating this issue.
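
For illustration, preconditioned conjugate gradients with a simple diagonal (Jacobi) preconditioner applied to a kernel matrix; a generic sketch, not the preconditioners studied in the paper:

```python
import numpy as np

def pcg(A, b, M_inv_diag, tol=1e-8, max_iter=200):
    # Preconditioned conjugate gradients for A x = b, with the
    # preconditioner given as the diagonal of M^{-1} (Jacobi).
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv_diag * r
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv_diag * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x
```

A good preconditioner clusters the eigenvalues of the preconditioned system, so CG needs far fewer iterations than on the raw kernel matrix.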

High-dimensional neural spike train analysis with generalized count linear dynamical systems

1 code implementation NeurIPS 2015 Yuanjun Gao, Lars Busing, Krishna V. Shenoy, John P. Cunningham

Latent factor models have been widely used to analyze simultaneous recordings of spike trains from large, heterogeneous neural populations.

Variational Inference

Bayesian Active Model Selection with an Application to Automated Audiometry

no code implementations NeurIPS 2015 Jacob Gardner, Gustavo Malkomes, Roman Garnett, Kilian Q. Weinberger, Dennis Barbour, John P. Cunningham

Using this and a previously published model for healthy responses, the proposed method is shown to be capable of diagnosing the presence or absence of NIHL with drastically fewer samples than existing approaches.

Model Selection

Neuroprosthetic decoder training as imitation learning

no code implementations 13 Nov 2015 Josh Merel, David Carlson, Liam Paninski, John P. Cunningham

We describe how training a decoder in this way is a novel variant of an imitation learning problem, where an oracle or expert is employed for supervised training in lieu of direct observations, which are not available.

Imitation Learning

Sparse Probit Linear Mixed Model

no code implementations 16 Jul 2015 Stephan Mandt, Florian Wenzel, Shinichi Nakajima, John P. Cunningham, Christoph Lippert, Marius Kloft

Formulated as models for linear regression, LMMs have been restricted to continuous phenotypes.

Feature Selection

Expectation propagation as a way of life: A framework for Bayesian inference on partitioned data

2 code implementations 16 Dec 2014 Aki Vehtari, Andrew Gelman, Tuomas Sivula, Pasi Jylänki, Dustin Tran, Swupnil Sahai, Paul Blomstedt, John P. Cunningham, David Schiminovich, Christian Robert

A common divide-and-conquer approach for Bayesian computation with big data is to partition the data, perform local inference for each piece separately, and combine the results to obtain a global posterior approximation.

Bayesian Inference
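
In the idealized case where the prior and every local posterior are exactly Gaussian, the combination step reduces to adding natural parameters and removing the extra copies of the prior. A scalar sketch (illustrative only; the paper's EP framework handles the general approximate case):

```python
def combine_gaussians(prior, local_posteriors):
    # prior and each local posterior are given as (mean, precision).
    # In natural parameters (precision lam, eta = precision * mean):
    #   global = prior + sum over pieces of (local - prior),
    # so each piece contributes its likelihood information once and the
    # prior is counted exactly once overall.
    lam0 = prior[1]
    eta0 = prior[1] * prior[0]
    lam = lam0 + sum(l[1] - lam0 for l in local_posteriors)
    eta = eta0 + sum(l[1] * l[0] - eta0 for l in local_posteriors)
    return eta / lam, lam
```

With conjugate Gaussian likelihoods this recovers the full-data posterior exactly, which is what makes it a useful sanity check for the approximate case.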

Fast Kernel Learning for Multidimensional Pattern Extrapolation

no code implementations NeurIPS 2014 Andrew G. Wilson, Elad Gilboa, Arye Nehorai, John P. Cunningham

This difficulty is compounded by the fact that Gaussian processes are typically only tractable for small datasets, and scaling an expressive kernel learning approach poses different challenges than scaling a standard Gaussian process model.

Gaussian Processes
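
Scalability for pattern extrapolation on grids typically exploits Kronecker structure in the kernel matrix; the key primitive is a Kronecker matrix-vector product that never forms the full matrix (a generic sketch under that assumption, not necessarily this paper's exact algorithm):

```python
import numpy as np

def kron_mvm(A, B, x):
    # Compute (A ⊗ B) x without materializing the Kronecker product,
    # via the identity (A ⊗ B) vec(X) = vec(B X A^T) with column-major vec.
    X = x.reshape(B.shape[1], A.shape[1], order='F')
    return (B @ X @ A.T).reshape(-1, order='F')
```

For a kernel matrix of size (mn) x (mn) built from factors of sizes m x m and n x n, this turns an O(m^2 n^2) product into O(mn(m + n)).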

Clustered factor analysis of multineuronal spike data

no code implementations NeurIPS 2014 Lars Buesing, Timothy A. Machado, John P. Cunningham, Liam Paninski

High-dimensional, simultaneous recordings of neural spiking activity are often explored, analyzed and visualized with the help of latent variable or factor models.

Variational Inference

Linear Dimensionality Reduction: Survey, Insights, and Generalizations

1 code implementation 3 Jun 2014 John P. Cunningham, Zoubin Ghahramani

Modern techniques for optimization over matrix manifolds enable a generic linear dimensionality reduction solver, which accepts as input data and an objective to be optimized, and returns, as output, an optimal low-dimensional projection of the data.

Dimensionality Reduction, Metric Learning
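
PCA is the simplest instance of this recipe: the objective tr(M^T S M), maximized over orthonormal M (a matrix-manifold constraint), has a closed-form solution via eigenvectors. A minimal sketch:

```python
import numpy as np

def pca_projection(X, d):
    # Maximize tr(M^T S M) over orthonormal M (the Stiefel manifold);
    # for PCA the optimum is the top-d eigenvectors of the covariance S.
    Xc = X - X.mean(axis=0)
    S = Xc.T @ Xc / len(X)
    vals, vecs = np.linalg.eigh(S)
    M = vecs[:, np.argsort(vals)[::-1][:d]]
    return Xc @ M, M
```

Other linear dimensionality reduction methods swap in a different objective but keep the same orthonormal-projection constraint, which is what the generic manifold-optimization solver exploits.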

GPatt: Fast Multidimensional Pattern Extrapolation with Gaussian Processes

no code implementations 20 Oct 2013 Andrew Gordon Wilson, Elad Gilboa, Arye Nehorai, John P. Cunningham

We introduce a new Bayesian nonparametric framework -- GPatt -- enabling automatic pattern extrapolation with Gaussian processes on large multidimensional datasets.

Gaussian Processes

Dynamical segmentation of single trials from population neural data

no code implementations NeurIPS 2011 Biljana Petreska, Byron M. Yu, John P. Cunningham, Gopal Santhanam, Stephen I. Ryu, Krishna V. Shenoy, Maneesh Sahani

Simultaneous recordings of many neurons embedded within a recurrently-connected cortical network may provide concurrent views into the dynamical processes of that network, and thus its computational function.

Gaussian Probabilities and Expectation Propagation

no code implementations 29 Nov 2011 John P. Cunningham, Philipp Hennig, Simon Lacoste-Julien

We consider these unexpected results empirically and theoretically, both for the problem of Gaussian probabilities and for EP more generally.
