no code implementations • 1 Feb 2023 • Taiga Abe, E. Kelly Buchanan, Geoff Pleiss, John P. Cunningham
Here we demonstrate that these intuitions do not apply to high-capacity neural network ensembles (deep ensembles), and in fact the opposite is often true.
no code implementations • NeurIPS 2021 • Yixin Wang, David M. Blei, John P. Cunningham
Unfortunately, variational autoencoders often suffer from posterior collapse: the posterior of the latent variables is equal to its prior, rendering the variational autoencoder useless as a means to produce meaningful representations.
1 code implementation • 30 Nov 2022 • Gabriel Loaiza-Ganem, Brendan Leigh Ross, Luhuan Wu, John P. Cunningham, Jesse C. Cresswell, Anthony L. Caterini
Likelihood-based deep generative models have recently been shown to exhibit pathological behaviour under the manifold hypothesis as a consequence of using high-dimensional densities to model data with low-dimensional structure.
1 code implementation • 30 May 2022 • Jonathan Wenger, Geoff Pleiss, Marvin Pförtner, Philipp Hennig, John P. Cunningham
For any method in this class, we prove (i) convergence of its posterior mean in the associated RKHS, (ii) decomposability of its combined posterior covariance into mathematical and computational covariances, and (iii) that the combined variance is a tight worst-case bound for the squared error between the method's posterior mean and the latent function.
1 code implementation • 20 May 2022 • Elliott Gordon-Rodriguez, Thomas P. Quinn, John P. Cunningham
Our work extends the success of data augmentation to compositional data, i.e., simplex-valued data, which is of particular interest in the context of the human microbiome.
2 code implementations • 28 Apr 2022 • Elliott Gordon-Rodriguez, Gabriel Loaiza-Ganem, Andres Potapczynski, John P. Cunningham
This family enjoys remarkable mathematical simplicity; its density function resembles that of the Dirichlet distribution, but with a normalizing constant that can be written in closed form using elementary functions only.
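As an illustration (not taken from the paper itself): the simplest one-dimensional relative of this kind of family is the continuous Bernoulli, whose density is proportional to lam**x * (1-lam)**(1-x) on [0, 1] and whose normalizing constant is likewise elementary. A minimal sketch, verifying the closed form numerically:

```python
import math

def cb_norm_const(lam):
    """Closed-form normalizing constant of the continuous Bernoulli:
    p(x) = C(lam) * lam**x * (1-lam)**(1-x) for x in [0, 1]."""
    if abs(lam - 0.5) < 1e-9:
        return 2.0
    return 2.0 * math.atanh(1.0 - 2.0 * lam) / (1.0 - 2.0 * lam)

def cb_density(x, lam):
    return cb_norm_const(lam) * lam**x * (1.0 - lam)**(1.0 - x)

# Check that the density integrates to 1 with a midpoint Riemann sum.
lam = 0.3
n = 100000
total = sum(cb_density((i + 0.5) / n, lam) for i in range(n)) / n
print(round(total, 4))  # ≈ 1.0
```

The elementary constant here (an atanh ratio rather than gamma functions) is the kind of simplicity the abstract refers to.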
1 code implementation • 14 Feb 2022 • Taiga Abe, E. Kelly Buchanan, Geoff Pleiss, Richard Zemel, John P. Cunningham
While deep ensembles are a practical way to achieve improvements to predictive power, uncertainty quantification, and robustness, our results show that these improvements can be replicated by a (larger) single model.
1 code implementation • 7 Dec 2021 • Yao Fu, John P. Cunningham, Mirella Lapata
Here, we propose a family of randomized dynamic programming (RDP) algorithms for scaling structured models to tens of thousands of latent states.
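For context (a sketch under the assumption of an HMM-style structured model, not the paper's RDP estimator): the bottleneck such methods target is the O(T·K²) sum over latent states in exact dynamic programming, e.g. the forward pass, which becomes prohibitive when K reaches tens of thousands.

```python
import numpy as np

def forward_loglik(log_pi, log_A, log_obs):
    """Exact HMM forward pass in log space: O(T * K^2) per sequence.
    This K^2 inner sum is what randomized DP methods approximate."""
    T, K = log_obs.shape
    alpha = log_pi + log_obs[0]
    for t in range(1, T):
        # log-sum-exp over the previous states -- the K^2 bottleneck
        m = alpha.max()
        alpha = np.log(np.exp(alpha - m) @ np.exp(log_A)) + m + log_obs[t]
    m = alpha.max()
    return m + np.log(np.exp(alpha - m).sum())
```

A randomized variant would subsample or importance-weight the states inside that inner sum rather than enumerating all K of them.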
no code implementations • 1 Jul 2021 • Jonathan Wenger, Geoff Pleiss, Philipp Hennig, John P. Cunningham, Jacob R. Gardner
While preconditioning is well understood in the context of CG, we demonstrate that it can also accelerate convergence and reduce variance of the estimates for the log-determinant and its derivative.
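The log-determinant estimates in question are typically stochastic trace estimates, using log det A = tr(log A) with random probe vectors. A small-scale sketch using an eigendecomposition as a stand-in for the Lanczos/CG machinery used at scale (the Hutchinson estimator itself is standard; the specific matrix here is illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)

# Symmetric positive-definite test matrix.
B = rng.standard_normal((50, 50))
A = B @ B.T + 50 * np.eye(50)

# Matrix logarithm via eigendecomposition (small-scale stand-in).
w, V = np.linalg.eigh(A)
logA = V @ np.diag(np.log(w)) @ V.T

# Hutchinson estimator: E[z^T log(A) z] = tr(log A) = log det A
# for Rademacher probes z.
n_probes = 2000
z = rng.choice([-1.0, 1.0], size=(50, n_probes))
est = np.mean(np.sum(z * (logA @ z), axis=0))

exact = np.linalg.slogdet(A)[1]
print(abs(est - exact) / abs(exact))  # small relative error
```

Preconditioning enters by transforming A so that the probe-to-probe variance of z^T log(A) z shrinks, which is the variance-reduction effect the abstract describes.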
1 code implementation • NeurIPS 2021 • Geoff Pleiss, John P. Cunningham
Our analysis in this paper decouples capacity and width via the generalization of neural networks to Deep Gaussian Processes (Deep GP), a class of nonparametric hierarchical models that subsume neural nets.
1 code implementation • NeurIPS 2021 • Anthony L. Caterini, Gabriel Loaiza-Ganem, Geoff Pleiss, John P. Cunningham
Normalizing flows are invertible neural networks with tractable change-of-volume terms, which allow optimization of their parameters to be efficiently performed via maximum likelihood.
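The change-of-volume term is the log-Jacobian in the change-of-variables formula, log p_x(x) = log p_z(f⁻¹(x)) + log |det ∂f⁻¹/∂x|. A minimal one-dimensional sketch, where an affine flow recovers the Gaussian density exactly:

```python
import math

def std_normal_logpdf(z):
    return -0.5 * z * z - 0.5 * math.log(2 * math.pi)

def affine_flow_logpdf(x, mu, sigma):
    """log p_x(x) under the flow x = mu + sigma * z, z ~ N(0, 1):
    base log-density at f_inv(x) = (x - mu)/sigma plus the
    change-of-volume term log |d f_inv / dx| = -log sigma."""
    z = (x - mu) / sigma
    log_det_jac = -math.log(sigma)
    return std_normal_logpdf(z) + log_det_jac

# Matches the closed-form N(mu, sigma^2) log-density.
print(round(affine_flow_logpdf(1.0, 2.0, 0.5), 6))
```

Because both terms are tractable, the expression can be maximized directly over (mu, sigma), which is the maximum-likelihood training the abstract refers to; real flows stack many such invertible layers.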
no code implementations • 3 Mar 2021 • Rohan Shad, Nicolas Quach, Robyn Fong, Patpilai Kasinpila, Cayley Bowles, Kate M. Callon, Michelle C. Li, Jeffrey Teuteberg, John P. Cunningham, Curtis P. Langlotz, William Hiesinger
Integrating methods for time-to-event prediction with diagnostic imaging modalities is of considerable interest, as accurate estimates of survival require accounting for censoring of individuals within the observation period.
no code implementations • 2 Mar 2021 • Rohan Shad, John P. Cunningham, Euan A. Ashley, Curtis P. Langlotz, William Hiesinger
Advances in computing power, deep learning architectures, and expert-labelled datasets have spurred the development of medical imaging artificial intelligence systems that rival clinical experts in a variety of scenarios.
no code implementations • 28 Feb 2021 • Rohan Shad, Nicolas Quach, Robyn Fong, Patpilai Kasinpila, Cayley Bowles, Miguel Castro, Ashrith Guha, Eddie Suarez, Stefan Jovinge, Sangjin Lee, Theodore Boeve, Myriam Amsallem, Xiu Tang, Francois Haddad, Yasuhiro Shudo, Y. Joseph Woo, Jeffrey Teuteberg, John P. Cunningham, Curt P. Langlotz, William Hiesinger
Non-invasive and cost-effective, the echocardiogram allows for a comprehensive assessment of the cardiac musculature and valves.
1 code implementation • 12 Feb 2021 • Andres Potapczynski, Luhuan Wu, Dan Biderman, Geoff Pleiss, John P. Cunningham
In the case of RFF, we show that the bias-to-variance conversion is indeed a trade-off: the additional variance proves detrimental to optimization.
1 code implementation • NeurIPS 2020 • Anqi Wu, E. Kelly Buchanan, Matthew Whiteway, Michael Schartner, Guido Meijer, Jean-Paul Noel, Erica Rodriguez, Claire Everett, Amy Norovich, Evan Schaffer, Neeli Mishra, C. Daniel Salzman, Dora Angelaki, Andrés Bendesky, The International Brain Laboratory, John P. Cunningham, Liam Paninski
Noninvasive behavioral tracking of animals is crucial for many scientific investigations.
no code implementations • NeurIPS 2020 • Joshua Glaser, Matthew Whiteway, John P. Cunningham, Liam Paninski, Scott Linderman
We allow the nature of these interactions to change over time by using a discrete set of dynamical states.
2 code implementations • NeurIPS Workshop ICBINB 2020 • Elliott Gordon-Rodriguez, Gabriel Loaiza-Ganem, Geoff Pleiss, John P. Cunningham
Modern deep learning is primarily an experimental science, in which empirical advances occasionally come at the expense of probabilistic rigor.
no code implementations • 11 Mar 2020 • Jackson Loper, David Blei, John P. Cunningham, Liam Paninski
Gaussian Processes (GPs) provide powerful probabilistic frameworks for interpolation, forecasting, and smoothing, but have been hampered by computational scaling issues.
2 code implementations • ICML 2020 • Elliott Gordon-Rodriguez, Gabriel Loaiza-Ganem, John P. Cunningham
Simplex-valued data appear throughout statistics and machine learning, for example in the context of transfer learning and compression of deep networks.
2 code implementations • NeurIPS 2019 • Yao Fu, Yansong Feng, John P. Cunningham
Inspired by variational autoencoders with discrete latent structures, in this work, we propose a latent bag of words (BOW) model for paraphrase generation.
1 code implementation • NeurIPS 2020 • Andres Potapczynski, Gabriel Loaiza-Ganem, John P. Cunningham
The Gumbel-Softmax is a continuous distribution over the simplex that is often used as a relaxation of discrete distributions.
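The standard sampler for this relaxation is short enough to show in full: add Gumbel noise to the logits, divide by a temperature, and apply a softmax (a generic sketch of the well-known construction, not code from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def gumbel_softmax(logits, tau, rng):
    """Draw a relaxed one-hot sample: softmax((logits + Gumbel noise)/tau).
    As tau -> 0 samples approach one-hot vectors; larger tau smooths them."""
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))  # Gumbel(0, 1)
    y = (logits + g) / tau
    y = y - y.max()                # for numerical stability
    e = np.exp(y)
    return e / e.sum()

sample = gumbel_softmax(np.log([0.2, 0.3, 0.5]), tau=0.5, rng=rng)
print(sample.sum())  # sums to 1 (up to float error)
```

Because the softmax preserves the argmax, the most likely coordinate of each sample follows the underlying categorical distribution exactly, while the sample itself stays on the interior of the simplex and admits gradients.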
1 code implementation • NeurIPS 2019 • Eleanor Batty, Matthew Whiteway, Shreya Saxena, Dan Biderman, Taiga Abe, Simon Musall, Winthrop Gillis, Jeffrey Markowitz, Anne Churchland, John P. Cunningham, Sandeep R. Datta, Scott Linderman, Liam Paninski
Here we introduce a probabilistic framework for the analysis of behavioral video and neural activity.
2 code implementations • NeurIPS 2019 • Gabriel Loaiza-Ganem, John P. Cunningham
Variational autoencoders (VAE) have quickly become a central tool in machine learning, applicable to a broad range of data types and latent variable models.
no code implementations • ICLR Workshop DeepGenStruct 2019 • Gabriel Loaiza-Ganem, John P. Cunningham
Gaussian processes are the leading class of distributions on random functions, but they suffer from well known issues including difficulty scaling and inflexibility with respect to certain shape constraints (such as nonnegativity).
no code implementations • 18 Mar 2019 • Sean R. Bittner, John P. Cunningham
Much attention has recently been paid to deep generative models, which have been used with great success for variational inference, generation of complex data types, and more.
2 code implementations • NeurIPS 2019 • Gabriel Loaiza-Ganem, Sean M. Perkins, Karen E. Schroeder, Mark M. Churchland, John P. Cunningham
Gaussian processes are the leading class of distributions on random functions, but they suffer from well known issues including difficulty scaling and inflexibility with respect to certain shape constraints (such as nonnegativity).
no code implementations • 1 Dec 2018 • Andrew C. Miller, Ziad Obermeyer, David M. Blei, John P. Cunningham, Sendhil Mullainathan
An electrocardiogram (EKG) is a common, non-invasive test that measures the electrical activity of a patient's heart.
1 code implementation • 26 May 2018 • Gia-Lac Tran, Edwin V. Bonilla, John P. Cunningham, Pietro Michiardi, Maurizio Filippone
The wide adoption of Convolutional Neural Networks (CNNs) in applications where decision-making under uncertainty is fundamental has brought a great deal of attention to the ability of these models to accurately quantify the uncertainty in their predictions.
no code implementations • 25 May 2018 • Andrea Insabato, John P. Cunningham, Matthieu Gilson
Estimation of reliable whole-brain connectivity is a crucial step towards the use of connectivity information in quantitative approaches to the study of neuropsychiatric disorders.
no code implementations • 26 Oct 2017 • Scott W. Linderman, Gonzalo E. Mena, Hal Cooper, Liam Paninski, John P. Cunningham
Many matching, tracking, sorting, and ranking problems require probabilistic reasoning about possible permutations, a set that grows factorially with dimension.
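One standard device for reasoning over this factorially large set (used in relaxations that map permutation matrices into the Birkhoff polytope of doubly stochastic matrices) is Sinkhorn normalization: alternately rescale rows and columns of a positive matrix until both sum to one. A sketch of that iteration, offered as illustration rather than the paper's construction:

```python
import numpy as np

def sinkhorn(M, n_iters=200):
    """Alternately normalize rows and columns of a positive matrix;
    the iterates converge to a doubly stochastic matrix."""
    M = np.array(M, dtype=float)
    for _ in range(n_iters):
        M = M / M.sum(axis=1, keepdims=True)  # rows sum to 1
        M = M / M.sum(axis=0, keepdims=True)  # columns sum to 1
    return M

rng = np.random.default_rng(0)
D = sinkhorn(rng.uniform(0.1, 1.0, size=(5, 5)))
print(np.allclose(D.sum(axis=0), 1), np.allclose(D.sum(axis=1), 1))
```

Doubly stochastic matrices are convex combinations of permutation matrices, which is what makes this a continuous surrogate for the discrete set.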
no code implementations • 12 Jan 2017 • Gabriel Loaiza-Ganem, Yuanjun Gao, John P. Cunningham
Maximum entropy modeling is a flexible and popular framework for formulating statistical models given partial knowledge.
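A textbook instance of the framework (not from the paper): among distributions on {1,...,6} with a fixed mean, the maximum entropy distribution is an exponential tilting p_i ∝ exp(λ·i), with λ chosen so the moment constraint holds. Since the tilted mean is monotone in λ, bisection suffices:

```python
import math

def tilted_mean(lam, support):
    w = [math.exp(lam * s) for s in support]
    Z = sum(w)
    return sum(s * wi for s, wi in zip(support, w)) / Z

def maxent_die(target_mean, lo=-10.0, hi=10.0):
    """Maximum-entropy distribution on {1..6} with E[X] = target_mean:
    exponential family p_i proportional to exp(lam * i), lam by bisection."""
    support = range(1, 7)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if tilted_mean(mid, support) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(lam * s) for s in support]
    Z = sum(w)
    return [wi / Z for wi in w]

p = maxent_die(4.5)
print(round(sum(s * ps for s, ps in zip(range(1, 7), p)), 6))  # 4.5
```

The exponential-family form of the solution is what makes maxent modeling tractable given only partial (moment) knowledge.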
no code implementations • NeurIPS 2016 • Yuanjun Gao, Evan Archer, Liam Paninski, John P. Cunningham
A body of recent work in modeling neural activity focuses on recovering low-dimensional latent features that capture the statistical structure of large-scale neural populations.
no code implementations • 7 Mar 2016 • Seth Flaxman, Dino Sejdinovic, John P. Cunningham, Sarah Filippi
The posterior mean of our model is closely related to recently proposed shrinkage estimators for kernel mean embeddings, while the posterior uncertainty is a new, interesting feature with various possible applications.
1 code implementation • 22 Feb 2016 • Kurt Cutajar, Michael A. Osborne, John P. Cunningham, Maurizio Filippone
Preconditioning is a common approach to alleviating this issue.
no code implementations • NeurIPS 2015 • Jacob Gardner, Gustavo Malkomes, Roman Garnett, Kilian Q. Weinberger, Dennis Barbour, John P. Cunningham
Using this and a previously published model for healthy responses, the proposed method is shown to be capable of diagnosing the presence or absence of NIHL with drastically fewer samples than existing approaches.
1 code implementation • NeurIPS 2015 • Yuanjun Gao, Lars Busing, Krishna V. Shenoy, John P. Cunningham
Latent factor models have been widely used to analyze simultaneous recordings of spike trains from large, heterogeneous neural populations.
no code implementations • 13 Nov 2015 • Josh Merel, David Carlson, Liam Paninski, John P. Cunningham
We describe how training a decoder in this way is a novel variant of an imitation learning problem, where an oracle or expert is employed for supervised training in lieu of direct observations, which are not available.
no code implementations • 16 Jul 2015 • Stephan Mandt, Florian Wenzel, Shinichi Nakajima, John P. Cunningham, Christoph Lippert, Marius Kloft
Formulated as models for linear regression, LMMs have been restricted to continuous phenotypes.
2 code implementations • 16 Dec 2014 • Aki Vehtari, Andrew Gelman, Tuomas Sivula, Pasi Jylänki, Dustin Tran, Swupnil Sahai, Paul Blomstedt, John P. Cunningham, David Schiminovich, Christian Robert
A common divide-and-conquer approach for Bayesian computation with big data is to partition the data, perform local inference for each piece separately, and combine the results to obtain a global posterior approximation.
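Under Gaussian approximations to the local subposteriors (and a flat prior, so the prior is not double-counted), the combination step is precision-weighted averaging: precisions add, and the combined mean weights each shard mean by its precision. A sketch verifying this against the full-data posterior for a known-variance Gaussian mean (an illustrative special case, not the paper's EP algorithm):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 4.0                       # known observation variance
data = rng.normal(1.0, np.sqrt(sigma2), size=1200)
shards = np.split(data, 4)         # partition the data

# Local subposterior for the mean on each shard under a flat prior:
# N(mean = shard mean, var = sigma2 / n_shard).
sub_means = np.array([s.mean() for s in shards])
sub_precs = np.array([len(s) / sigma2 for s in shards])

# Combine by multiplying the Gaussian densities:
# precisions add, means combine precision-weighted.
comb_prec = sub_precs.sum()
comb_mean = (sub_precs * sub_means).sum() / comb_prec

# Full-data posterior for comparison.
full_mean, full_prec = data.mean(), len(data) / sigma2
print(np.isclose(comb_mean, full_mean), np.isclose(comb_prec, full_prec))
```

In this conjugate case the combination is exact; the interest of EP-style schemes is in iterating such local approximations when the subposteriors are not Gaussian.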
no code implementations • NeurIPS 2014 • Andrew G. Wilson, Elad Gilboa, Arye Nehorai, John P. Cunningham
This difficulty is compounded by the fact that Gaussian processes are typically only tractable for small datasets, and scaling an expressive kernel learning approach poses different challenges than scaling a standard Gaussian process model.
no code implementations • NeurIPS 2014 • Lars Buesing, Timothy A. Machado, John P. Cunningham, Liam Paninski
High-dimensional, simultaneous recordings of neural spiking activity are often explored, analyzed and visualized with the help of latent variable or factor models.
1 code implementation • 3 Jun 2014 • John P. Cunningham, Zoubin Ghahramani
Modern techniques for optimization over matrix manifolds enable a generic linear dimensionality reduction solver, which accepts as input data and an objective to be optimized, and returns, as output, an optimal low-dimensional projection of the data.
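The best-known instance of such a solver is PCA, where the optimizer over orthonormal projections has a closed form: the top right singular vectors of the centered data. A minimal sketch of that special case (the generic manifold solver in the paper handles objectives without closed forms):

```python
import numpy as np

def pca_projection(X, k):
    """Optimal rank-k linear projection under the retained-variance
    objective: top-k right singular vectors of the centered data."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:k].T                # d x k orthonormal projection matrix

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 5))
W = pca_projection(X, 2)
print(np.allclose(W.T @ W, np.eye(2)))  # columns are orthonormal
```

Other objectives (e.g. for LDA or CCA variants) generally lack such closed forms, which is why optimizing directly over the manifold of orthonormal matrices is attractive.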
no code implementations • 20 Oct 2013 • Andrew Gordon Wilson, Elad Gilboa, Arye Nehorai, John P. Cunningham
We introduce a new Bayesian nonparametric framework -- GPatt -- enabling automatic pattern extrapolation with Gaussian processes on large multidimensional datasets.
no code implementations • NeurIPS 2011 • Biljana Petreska, Byron M. Yu, John P. Cunningham, Gopal Santhanam, Stephen I. Ryu, Krishna V. Shenoy, Maneesh Sahani
Simultaneous recordings of many neurons embedded within a recurrently connected cortical network may provide concurrent views into the dynamical processes of that network, and thus its computational function.
no code implementations • NeurIPS 2011 • Jakob H. Macke, Lars Buesing, John P. Cunningham, Byron M. Yu, Krishna V. Shenoy, Maneesh Sahani
Neurons in the neocortex code and compute as part of a locally interconnected population.
no code implementations • 29 Nov 2011 • John P. Cunningham, Philipp Hennig, Simon Lacoste-Julien
We consider these unexpected results empirically and theoretically, both for the problem of Gaussian probabilities and for EP more generally.
no code implementations • NeurIPS 2008 • Byron M. Yu, John P. Cunningham, Gopal Santhanam, Stephen I. Ryu, Krishna V. Shenoy, Maneesh Sahani
We applied these methods to the activity of 61 neurons recorded simultaneously in macaque premotor and motor cortices during reach planning and execution.