no code implementations • 7 Feb 2025 • Kevin Han Huang, Ni Zhan, Elif Ertekin, Peter Orbanz, Ryan P. Adams
Incorporating group symmetries into neural networks has been a cornerstone of success in many AI-for-science applications.
no code implementations • 29 Jun 2024 • Olga Solodova, Nick Richardson, Deniz Oktay, Ryan P. Adams
Message passing graph neural networks (GNNs) would appear to be powerful tools for learning distributed algorithms via gradient descent, but they generate catastrophically incorrect predictions when nodes update asynchronously during inference.
1 code implementation • 19 Oct 2023 • Sulin Liu, Peter J. Ramadge, Ryan P. Adams
We introduce marginalization models (MAMs), a new family of generative models for high-dimensional discrete data.
no code implementations • 8 Jun 2023 • Ryan P. Adams, Peter Orbanz
The linear representation generalizes the Fourier basis to crystallographically invariant basis functions.
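One plausible reading of such a basis (a sketch of my own, assuming a space group $G$ acting on positions $x$; not necessarily the paper's exact construction) is to symmetrize plane waves over the group:

$$\phi_k(x) = \frac{1}{|G|} \sum_{g \in G} e^{\,i k \cdot g(x)},$$

so that each $\phi_k$ is invariant under every crystallographic symmetry $g \in G$, and the construction reduces to the ordinary Fourier basis when $G$ contains only lattice translations and $k$ ranges over the reciprocal lattice.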
no code implementations • 31 Jan 2023 • Deniz Oktay, Mehran Mirramezani, Eder Medina, Ryan P. Adams
In this work, we seek to develop machine learning analogs of this process, in which we jointly learn the morphology of complex nonlinear elastic solids along with a deep neural network to control it.
no code implementations • 3 Nov 2022 • Tian Qin, Alex Beatson, Deniz Oktay, Nick McGreivy, Ryan P. Adams
Partial differential equations (PDEs) are often computationally challenging to solve, and in many settings a family of related PDEs must be solved either at every timestep or for a variety of candidate boundary conditions, parameters, or geometric domains.
no code implementations • 4 Oct 2022 • Diana Cai, Ryan P. Adams
A key challenge in applying MCMC to scientific domains is computation: the target density of interest is often a function of expensive computations, such as a high-fidelity physical simulation, an intractable integral, or a slowly-converging iterative algorithm.
1 code implementation • 22 Dec 2021 • Athindran Ramesh Kumar, Sulin Liu, Jaime F. Fisac, Ryan P. Adams, Peter J. Ramadge
In practice, we have inaccurate knowledge of the system dynamics, which can lead to unsafe behaviors due to unmodeled residual dynamics.
no code implementations • NeurIPS 2021 • David Zoltowski, Diana Cai, Ryan P. Adams
Slice sampling is a Markov chain Monte Carlo algorithm for simulating samples from probability distributions; it only requires a density function that can be evaluated point-wise up to a normalization constant, making it applicable to a variety of inference problems and unnormalized models.
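The update described here can be sketched in a few lines. Below is a minimal univariate slice sampler with stepping-out and shrinkage — a generic textbook version for illustration, not this paper's contribution, which concerns more elaborate variants:

```python
import numpy as np

def slice_sample(log_density, x0, n_samples, width=1.0, rng=None):
    """Minimal univariate slice sampler with stepping-out and shrinkage.

    Needs only a log density evaluable up to an additive constant.
    """
    rng = np.random.default_rng(rng)
    samples = np.empty(n_samples)
    x = x0
    for i in range(n_samples):
        # Auxiliary height: uniform under the (unnormalized) density at x.
        log_y = log_density(x) + np.log(rng.uniform())
        # Step out until the interval brackets the slice.
        left = x - width * rng.uniform()
        right = left + width
        while log_density(left) > log_y:
            left -= width
        while log_density(right) > log_y:
            right += width
        # Shrink: rejected proposals tighten the bracket around x.
        while True:
            x_new = rng.uniform(left, right)
            if log_density(x_new) > log_y:
                x = x_new
                break
            if x_new < x:
                left = x_new
            else:
                right = x_new
        samples[i] = x
    return samples

# Unnormalized standard normal as the target density.
samples = slice_sample(lambda x: -0.5 * x**2, x0=0.0, n_samples=5000, rng=0)
```

Note that the sampler never needs the normalizing constant, which is exactly what makes it applicable to unnormalized models.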
1 code implementation • ICLR 2022 • Ari Seff, Wenda Zhou, Nick Richardson, Ryan P. Adams
Parametric computer-aided design (CAD) tools are the predominant way that engineers specify physical structures, from bicycle pedals to airplanes to printed circuit boards.
no code implementations • NeurIPS 2021 • Dibya Ghosh, Jad Rahme, Aviral Kumar, Amy Zhang, Ryan P. Adams, Sergey Levine
Generalization is a central challenge for the deployment of reinforcement learning (RL) systems in the real world.
1 code implementation • NeurIPS 2021 • Xingyuan Sun, Tianju Xue, Szymon Rusinkiewicz, Ryan P. Adams
We compare our approach to direct optimization of the design using the learned surrogate, and to supervised learning of the synthesis problem.
1 code implementation • 26 Mar 2021 • Gregory W. Gundersen, Diana Cai, Chuteng Zhou, Barbara E. Engelhardt, Ryan P. Adams
We propose a multi-fidelity approach that makes cost-sensitive decisions about which data fidelity to collect based on maximizing information gain with respect to changepoints.
1 code implementation • NeurIPS 2020 • Sulin Liu, Xingyuan Sun, Peter J. Ramadge, Ryan P. Adams
One of the appeals of the GP framework is that the marginal likelihood of the kernel hyperparameters is often available in closed form, enabling optimization and sampling procedures to fit these hyperparameters to data.
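As a concrete illustration of that closed form, here is a minimal sketch with an RBF kernel on hypothetical toy data (the kernel choice and hyperparameters are assumptions of this sketch):

```python
import numpy as np

def gp_log_marginal_likelihood(X, y, lengthscale, signal_var, noise_var):
    """Closed-form GP log marginal likelihood with an RBF kernel:

    log p(y | X) = -1/2 y^T K^-1 y - 1/2 log|K| - n/2 log(2 pi),
    where K = K_rbf(X, X) + noise_var * I.
    """
    sq_dists = (X[:, None] - X[None, :]) ** 2
    K = signal_var * np.exp(-0.5 * sq_dists / lengthscale**2)
    K += noise_var * np.eye(len(X))
    L = np.linalg.cholesky(K)  # stable factorization of K
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))      # -1/2 log|K|
            - 0.5 * len(X) * np.log(2 * np.pi))

X = np.linspace(0, 1, 20)
y = np.sin(2 * np.pi * X)
# A well-matched lengthscale should score higher than a mismatched one.
good = gp_log_marginal_likelihood(X, y, lengthscale=0.3, signal_var=1.0, noise_var=0.01)
bad = gp_log_marginal_likelihood(X, y, lengthscale=0.001, signal_var=1.0, noise_var=0.01)
```

Because this quantity is differentiable in the hyperparameters, it can be handed directly to a gradient-based optimizer or used as a target for sampling.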
1 code implementation • ICLR 2021 • Deniz Oktay, Nick McGreivy, Joshua Aduol, Alex Beatson, Ryan P. Adams
The successes of deep learning, variational inference, and many other fields have been aided by specialized implementations of reverse-mode automatic differentiation (AD) to compute gradients of mega-dimensional objectives.
1 code implementation • 16 Jul 2020 • Ari Seff, Yaniv Ovadia, Wenda Zhou, Ryan P. Adams
Parametric computer-aided design (CAD) is the dominant paradigm in mechanical engineering for physical design.
no code implementations • NeurIPS 2020 • Alex Beatson, Jordan T. Ash, Geoffrey Roeder, Tianju Xue, Ryan P. Adams
We use a neural network to model the stored potential energy in a component given boundary conditions.
no code implementations • ICLR 2020 • Yucen Luo, Alex Beatson, Mohammad Norouzi, Jun Zhu, David Duvenaud, Ryan P. Adams, Ricky T. Q. Chen
Standard variational lower bounds used to train latent variable models produce biased estimates of most quantities of interest.
1 code implementation • NeurIPS 2020 • Jordan T. Ash, Ryan P. Adams
We would like each of these models in the sequence to be performant and take advantage of all the data that are available to that point.
no code implementations • 25 Sep 2019 • Jordan T. Ash, Ryan P. Adams
We would like each of these models in the sequence to be performant and take advantage of all the data that are available to that point.
1 code implementation • NeurIPS 2019 • Ari Seff, Wenda Zhou, Farhan Damani, Abigail Doyle, Ryan P. Adams
The success of generative modeling in continuous domains has led to a surge of interest in generating discrete data such as molecules, source code, and graphs.
no code implementations • 24 Jun 2019 • Jad Rahme, Ryan P. Adams
The central object in the statistical physics abstraction is the partition function $\mathcal{Z}$; here we construct a partition function from the ensemble of possible trajectories that an agent might take in a Markov decision process.
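One way to read this setup (my sketch of the abstract's construction, not the paper's exact definitions): for trajectories $\tau$ with return $R(\tau)$ and an inverse temperature $\beta$,

$$\mathcal{Z}(\beta) = \sum_{\tau} e^{\beta R(\tau)}, \qquad p(\tau) = \frac{e^{\beta R(\tau)}}{\mathcal{Z}(\beta)},$$

so that $\partial_\beta \log \mathcal{Z}(\beta) = \mathbb{E}_{p(\tau)}[R(\tau)]$ recovers the expected return, mirroring how energy statistics follow from the free energy in statistical physics.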
no code implementations • NeurIPS 2019 • Igor Fedorov, Ryan P. Adams, Matthew Mattina, Paul N. Whatmough
The vast majority of processors in the world are actually microcontroller units (MCUs), which find widespread use performing simple control tasks in applications ranging from automobiles to medical devices and office equipment.
1 code implementation • 16 May 2019 • Alex Beatson, Ryan P. Adams
We consider optimization problems in which the objective requires an inner loop with many steps or is the limit of a sequence of increasingly costly approximations.
no code implementations • NeurIPS 2018 • Diana Cai, Michael Mitzenmacher, Ryan P. Adams
The count-min sketch is a time- and memory-efficient randomized data structure that provides a point estimate of the number of times an item has appeared in a data stream.
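A minimal sketch of the data structure itself (the per-row salted hashes here are a stand-in assumption for the pairwise-independent hash families usually specified, and this is the classical structure rather than the paper's learned variant):

```python
import numpy as np

class CountMinSketch:
    """Count-min sketch: point estimates of stream counts in fixed memory.

    Estimates only overcount: estimate(x) >= true_count(x), with the
    excess controlled by the table width and depth.
    """
    def __init__(self, width=1024, depth=4, seed=0):
        self.width = width
        self.depth = depth
        self.table = np.zeros((depth, width), dtype=np.int64)
        rng = np.random.default_rng(seed)
        self.salts = rng.integers(0, 2**31, size=depth)  # one salt per row

    def _index(self, row, item):
        # Salted hash stands in for an independent hash function per row.
        return hash((int(self.salts[row]), item)) % self.width

    def add(self, item, count=1):
        for row in range(self.depth):
            self.table[row, self._index(row, item)] += count

    def estimate(self, item):
        # Taking the min over rows mitigates collisions in any single row.
        return min(self.table[row, self._index(row, item)]
                   for row in range(self.depth))

cms = CountMinSketch()
for word in ["a"] * 100 + ["b"] * 5:
    cms.add(word)
```

The one-sided error is the key property: a query can collide with other items and read high, but it can never read below the true count.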
no code implementations • 21 Nov 2018 • Jennifer N. Wei, David Belanger, Ryan P. Adams, D. Sculley
When confronted with a substance of unknown identity, researchers often perform mass spectrometry on the sample and compare the observed spectrum to a library of previously-collected spectra to identify the molecule.
no code implementations • 18 Jul 2018 • Justin Gilmer, Ryan P. Adams, Ian Goodfellow, David Andersen, George E. Dahl
Advances in machine learning have led to broad deployment of systems with impressive performance on important problems.
1 code implementation • ICLR 2019 • Wenda Zhou, Victor Veitch, Morgane Austern, Ryan P. Adams, Peter Orbanz
Our main technical result is a generalization bound for compressed networks based on the compressed size.
1 code implementation • 28 Feb 2018 • Jeffrey Regier, Andrew C. Miller, David Schlegel, Ryan P. Adams, Jon D. McAuliffe, Prabhat
We present a new, fully generative model for constructing astronomical catalogs from optical telescope image sets.
no code implementations • 9 Feb 2018 • Ryan P. Adams, Jeffrey Pennington, Matthew J. Johnson, Jamie Smith, Yaniv Ovadia, Brian Patton, James Saunderson
However, naive eigenvalue estimation is computationally expensive even when the matrix can be represented; in many of these situations the matrix is so large as to only be available implicitly via products with vectors.
1 code implementation • NeurIPS 2017 • Jonathan H. Huggins, Ryan P. Adams, Tamara Broderick
We provide theoretical guarantees on the quality of point (MAP) estimates, the approximate posterior, and posterior mean and uncertainty estimates.
1 code implementation • NeurIPS 2017 • Andrew C. Miller, Nicholas J. Foti, Alexander D'Amour, Ryan P. Adams
Optimization with noisy gradients has become ubiquitous in statistics and machine learning.
no code implementations • 17 Apr 2017 • Ardavan Saeedi, Matthew D. Hoffman, Stephen J. DiVerdi, Asma Ghandeharioun, Matthew J. Johnson, Ryan P. Adams
Professional-grade software applications are powerful but complicated: expert users can achieve impressive results, but novices often struggle to complete even basic tasks.
1 code implementation • ICML 2017 • Andrew C. Miller, Nicholas Foti, Ryan P. Adams
We propose a black-box variational inference method to approximate intractable distributions with an increasingly rich approximating class.
2 code implementations • NeurIPS 2016 • Scott W. Linderman, Ryan P. Adams, Jonathan W. Pillow
Neural circuits contain heterogeneous groups of neurons that differ in type, location, connectivity, and basic response properties.
1 code implementation • 26 Oct 2016 • Scott W. Linderman, Andrew C. Miller, Ryan P. Adams, David M. Blei, Liam Paninski, Matthew J. Johnson
Many natural systems, such as neurons firing in the brain or basketball teams traversing a court, give rise to time series data with complex, nonlinear dynamics.
11 code implementations • 7 Oct 2016 • Rafael Gómez-Bombarelli, Jennifer N. Wei, David Duvenaud, José Miguel Hernández-Lobato, Benjamín Sánchez-Lengeling, Dennis Sheberla, Jorge Aguilera-Iparraguirre, Timothy D. Hirzel, Ryan P. Adams, Alán Aspuru-Guzik
We report a method to convert discrete representations of molecules to and from a multidimensional continuous representation.
no code implementations • 19 Jun 2016 • Akash Srivastava, James Zou, Ryan P. Adams, Charles Sutton
A good clustering can help a data analyst to explore and understand a data set, but what constitutes a good clustering may depend on domain-specific and application-specific criteria.
3 code implementations • NeurIPS 2016 • Matthew J. Johnson, David Duvenaud, Alexander B. Wiltschko, Sandeep R. Datta, Ryan P. Adams
We propose a general modeling and inference framework that composes probabilistic graphical models with deep learning methods and combines their respective strengths.
no code implementations • 16 Feb 2016 • Elaine Angelino, Matthew James Johnson, Ryan P. Adams
Bayesian methods are an excellent fit for this demand, but scaling Bayesian inference is a challenge.
no code implementations • NeurIPS 2015 • Andrew Miller, Albert Wu, Jeff Regier, Jon McAuliffe, Dustin Lang, Prabhat, David Schlegel, Ryan P. Adams
We propose a method for combining two sources of astronomical data, spectroscopy and photometry, that carry information about sources of light (e.g., stars, galaxies, and quasars) at extremely different spectral resolutions.
no code implementations • NeurIPS 2015 • Scott Linderman, Matthew Johnson, Ryan P. Adams
For example, nucleotides in a DNA sequence, children's names in a given state and year, and text documents are all commonly modeled with multinomial distributions.
1 code implementation • 30 Nov 2015 • José Miguel Hernández-Lobato, Michael A. Gelbart, Ryan P. Adams, Matthew W. Hoffman, Zoubin Ghahramani
Of particular interest to us is to efficiently solve problems with decoupled constraints, in which subsets of the objective and constraint functions may be evaluated independently.
no code implementations • 17 Nov 2015 • Daniel Hernández-Lobato, José Miguel Hernández-Lobato, Amar Shah, Ryan P. Adams
The results show that PESMO produces better recommendations with a smaller number of evaluations of the objectives, and that a decoupled evaluation can lead to improvements in performance, particularly when the number of objectives is large.
no code implementations • 8 Nov 2015 • Roger B. Grosse, Zoubin Ghahramani, Ryan P. Adams
Using the ground truth log-ML estimates obtained from our method, we quantitatively evaluate a wide variety of existing ML estimators on several latent variable models: clustering, a low rank approximation, and a binary attributes model.
8 code implementations • NeurIPS 2015 • David Duvenaud, Dougal Maclaurin, Jorge Aguilera-Iparraguirre, Rafael Gómez-Bombarelli, Timothy Hirzel, Alán Aspuru-Guzik, Ryan P. Adams
We introduce a convolutional neural network that operates directly on graphs.
Ranked #2 on Drug Discovery on HIV dataset
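The idea of convolving directly on a graph can be sketched as neighbor aggregation with sum pooling. This is a deliberately simplified, hypothetical form for illustration, not the paper's exact neural fingerprint architecture:

```python
import numpy as np

def neural_fingerprint(adjacency, features, weights):
    """Toy graph convolution: at each layer, every node sums its own and
    its neighbors' features, applies a learned linear map and a
    nonlinearity; the final fingerprint sum-pools over all nodes.
    """
    h = features
    n = len(adjacency)
    for W in weights:
        h = np.tanh((adjacency + np.eye(n)) @ h @ W)
    return h.sum(axis=0)  # graph-level vector, independent of node order

rng = np.random.default_rng(0)
# Toy 4-atom "molecule": a path graph with 3 input features per atom.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = rng.normal(size=(4, 3))
W = [rng.normal(size=(3, 3)) * 0.5 for _ in range(2)]
fp = neural_fingerprint(A, X, W)

# Sum pooling makes the fingerprint invariant to atom reordering.
perm = np.array([2, 0, 3, 1])
fp_permuted = neural_fingerprint(A[perm][:, perm], X[perm], W)
```

Permutation invariance is what lets a single network handle molecules with arbitrary atom orderings, unlike fixed-length fingerprints computed from a canonical ordering.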
1 code implementation • 12 Jul 2015 • Scott W. Linderman, Ryan P. Adams
We build on previous work that has taken a Bayesian approach to this problem, specifying prior distributions over the latent network structure and a likelihood of observed activity given this network.
1 code implementation • 18 Jun 2015 • Scott W. Linderman, Matthew J. Johnson, Ryan P. Adams
Many practical modeling problems involve discrete data that are best represented as draws from multinomial or categorical distributions.
no code implementations • NeurIPS 2015 • Oren Rippel, Jasper Snoek, Ryan P. Adams
In this work, we demonstrate that, beyond its advantages for efficient computation, the spectral domain also provides a powerful representation in which to model and train convolutional neural networks (CNNs).
Ranked #177 on Image Classification on CIFAR-100
1 code implementation • 6 Apr 2015 • Dougal Maclaurin, David Duvenaud, Ryan P. Adams
By tracking the change in entropy over this sequence of transformations during optimization, we form a scalable, unbiased estimate of the variational lower bound on the log marginal likelihood.
4 code implementations • 19 Feb 2015 • Jasper Snoek, Oren Rippel, Kevin Swersky, Ryan Kiros, Nadathur Satish, Narayanan Sundaram, Md. Mostofa Ali Patwary, Prabhat, Ryan P. Adams
Bayesian optimization is an effective methodology for the global optimization of functions with expensive evaluations.
Ranked #164 on Image Classification on CIFAR-100 (using extra training data)
1 code implementation • 18 Feb 2015 • José Miguel Hernández-Lobato, Michael A. Gelbart, Matthew W. Hoffman, Ryan P. Adams, Zoubin Ghahramani
Unknown constraints arise in many types of expensive black-box optimization problems.
3 code implementations • 18 Feb 2015 • José Miguel Hernández-Lobato, Ryan P. Adams
In principle, the Bayesian approach to learning neural networks does not have these problems.
2 code implementations • 11 Feb 2015 • Dougal Maclaurin, David Duvenaud, Ryan P. Adams
Tuning hyperparameters of learning algorithms is hard because gradients are usually unavailable.
no code implementations • NeurIPS 2014 • Scott W. Linderman, Christopher H. Stock, Ryan P. Adams
Learning and memory in the brain are implemented by complex, time-varying changes in neural circuitry.
no code implementations • 28 Mar 2014 • Elaine Angelino, Eddie Kohler, Amos Waterland, Margo Seltzer, Ryan P. Adams
We present a general framework for accelerating a large class of widely used Markov chain Monte Carlo (MCMC) algorithms.
no code implementations • 22 Mar 2014 • Dougal Maclaurin, Ryan P. Adams
Markov chain Monte Carlo (MCMC) is a popular and successful general-purpose tool for Bayesian inference.
1 code implementation • 22 Mar 2014 • Michael A. Gelbart, Jasper Snoek, Ryan P. Adams
Recent work on Bayesian optimization has shown its effectiveness in global optimization of difficult black-box objective functions.
2 code implementations • 24 Feb 2014 • David Duvenaud, Oren Rippel, Ryan P. Adams, Zoubin Ghahramani
Choosing appropriate architectures and regularization strategies for deep networks is crucial to good predictive performance.
no code implementations • 20 Feb 2014 • Raja Hafiz Affandi, Emily B. Fox, Ryan P. Adams, Ben Taskar
Determinantal point processes (DPPs) are well-suited for modeling repulsion and have proven useful in many applications where diversity is desired.
1 code implementation • 5 Feb 2014 • Oren Rippel, Michael A. Gelbart, Ryan P. Adams
To learn these representations we introduce nested dropout, a procedure for stochastically removing coherent nested sets of hidden units in a neural network.
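The "coherent nested sets" can be sketched as prefix masks: sample a cut-off index, keep units up to it, and drop the rest as a block, so earlier units are forced to carry more information. A minimal sketch (the geometric distribution over indices is an assumption of this sketch, standing in for the paper's choice):

```python
import numpy as np

def nested_dropout_mask(n_units, geom_p=0.1, rng=None):
    """Sample a nested dropout mask: keep units 0..b-1, drop the
    coherent tail b..n-1, where b is drawn from a (truncated)
    geometric distribution over unit indices.
    """
    rng = np.random.default_rng(rng)
    b = min(int(rng.geometric(geom_p)), n_units)  # prefix length to keep
    mask = np.zeros(n_units)
    mask[:b] = 1.0
    return mask

mask = nested_dropout_mask(10, rng=0)
```

Because unit i survives only when all units before it do, repeated masking induces an importance ordering on the hidden units, analogous to how PCA orders components by explained variance.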
1 code implementation • 5 Feb 2014 • Jasper Snoek, Kevin Swersky, Richard S. Zemel, Ryan P. Adams
Bayesian optimization has proven to be a highly effective methodology for the global optimization of unknown, expensive and multimodal functions.
no code implementations • 4 Feb 2014 • Scott W. Linderman, Ryan P. Adams
Networks play a central role in modern data analysis, enabling us to reason about systems by studying the relationships between their parts.
no code implementations • NeurIPS 2013 • Nils E. Napp, Ryan P. Adams
We show algebraically that the steady state concentration of these species correspond to the marginal distributions of the random variables in the graph and validate the results in simulations.
no code implementations • NeurIPS 2013 • Jasper Snoek, Richard Zemel, Ryan P. Adams
Point processes are popular models of neural spiking behavior as they provide a statistical distribution over temporal sequences of spikes and help to reveal the complexities underlying a series of recorded action potentials.
1 code implementation • NeurIPS 2013 • Kevin Swersky, Jasper Snoek, Ryan P. Adams
We demonstrate the utility of this new acquisition function by utilizing a small dataset in order to explore hyperparameter settings for a large dataset.
Ranked #92 on Image Classification on STL-10
no code implementations • NeurIPS 2013 • James Y. Zou, Daniel J. Hsu, David C. Parkes, Ryan P. Adams
In many natural settings, the analysis goal is not to characterize a single data set in isolation, but rather to understand the difference between one set of observations and another.
no code implementations • 8 Apr 2013 • Dan Lovell, Jonathan Malmaud, Ryan P. Adams, Vikash K. Mansinghka
Applied to mixture modeling, our approach enables the Dirichlet process to simultaneously learn clusters that describe the data and superclusters that define the granularity of parallelization.
no code implementations • NeurIPS 2012 • Kevin Swersky, Ilya Sutskever, Daniel Tarlow, Richard S. Zemel, Ruslan R. Salakhutdinov, Ryan P. Adams
The Restricted Boltzmann Machine (RBM) is a popular density model that is also good for extracting features.
no code implementations • NeurIPS 2012 • James T. Kwok, Ryan P. Adams
We show how to perform MAP inference with DPP priors in latent Dirichlet allocation and in mixture models, leading to better intuition for the latent variable representation and quantitatively improved unsupervised feature extraction, without compromising the generative aspects of the model.
no code implementations • 28 Oct 2012 • Robert Nishihara, Iain Murray, Ryan P. Adams
Probabilistic models are conceptually powerful tools for finding structure in data, but their practical effectiveness is often limited by our ability to perform inference in them.
4 code implementations • NeurIPS 2012 • Jasper Snoek, Hugo Larochelle, Ryan P. Adams
In this work, we consider the automatic tuning problem within the framework of Bayesian optimization, in which a learning algorithm's generalization performance is modeled as a sample from a Gaussian process (GP).
Ranked #202 on Image Classification on CIFAR-10
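The GP-based loop this describes can be sketched end to end in one dimension: fit a GP to the evaluations so far, maximize an expected improvement acquisition over a grid, evaluate, and repeat. This is a simplified stand-in with fixed, assumed hyperparameters, not the paper's full method:

```python
import numpy as np
from math import erf, sqrt, pi

def rbf(a, b, ls=0.2):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls**2)

def gp_posterior(X, y, Xs, noise=1e-5):
    """GP posterior mean/std at test points Xs given observations (X, y)."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.clip(1.0 - np.sum(v**2, axis=0), 1e-12, None)
    return mu, np.sqrt(var)

def expected_improvement(mu, sigma, best):
    """EI acquisition for maximization."""
    z = (mu - best) / sigma
    Phi = 0.5 * (1 + np.vectorize(erf)(z / sqrt(2)))  # normal CDF
    phi = np.exp(-0.5 * z**2) / sqrt(2 * pi)          # normal PDF
    return (mu - best) * Phi + sigma * phi

f = lambda x: -(x - 0.3) ** 2        # hidden objective, maximum at x = 0.3
X = np.array([0.0, 1.0])             # initial design
y = f(X)
grid = np.linspace(0, 1, 201)
for _ in range(10):
    mu, sigma = gp_posterior(X, y, grid)
    x_next = grid[np.argmax(expected_improvement(mu, sigma, y.max()))]
    X = np.append(X, x_next)
    y = np.append(y, f(x_next))
best_x = X[np.argmax(y)]
```

Each iteration trades off exploitation (high posterior mean) against exploration (high posterior uncertainty), which is why so few evaluations of the expensive objective are needed.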
no code implementations • NeurIPS 2010 • Zoubin Ghahramani, Michael I. Jordan, Ryan P. Adams
Many data are naturally modeled by an unobserved hierarchical structure.
no code implementations • NeurIPS 2010 • Iain Murray, Ryan P. Adams
The Gaussian process (GP) is a popular way to specify dependencies between random variables in a probabilistic model.
no code implementations • NeurIPS 2008 • Iain Murray, David Mackay, Ryan P. Adams
Samples drawn from the GPDS are consistent with exact, independent samples from a fixed density function that is a transformation of a function drawn from a Gaussian process prior.