Search Results for author: Wesley J. Maddox

Found 14 papers, 13 papers with code

Similarity of Neural Networks with Gradients

3 code implementations 25 Mar 2020 Shuai Tang, Wesley J. Maddox, Charlie Dickens, Tom Diethe, Andreas Damianou

A suitable similarity index for comparing learnt neural networks plays an important role in understanding the behaviour of these highly non-linear functions, and can provide insights for further theoretical analysis and empirical studies.

Network Pruning
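
The paper's similarity index is built from network gradients. As a rough illustration of that family of measures (not necessarily the paper's exact construction), one can treat per-example gradients as feature vectors and compare two networks with linear CKA; the gradient matrices below are synthetic stand-ins.

```python
import numpy as np

def linear_cka(X, Y):
    """Linear CKA between two feature matrices of shape (n_examples, n_features)."""
    # Center each feature dimension before comparing.
    X = X - X.mean(axis=0, keepdims=True)
    Y = Y - Y.mean(axis=0, keepdims=True)
    # ||X^T Y||_F^2 normalized by ||X^T X||_F and ||Y^T Y||_F.
    cross = np.linalg.norm(X.T @ Y, "fro") ** 2
    return cross / (np.linalg.norm(X.T @ X, "fro") * np.linalg.norm(Y.T @ Y, "fro"))

# Toy usage: rows play the role of per-example gradient features from two networks.
rng = np.random.default_rng(0)
grads_a = rng.standard_normal((128, 50))
grads_b = grads_a @ rng.standard_normal((50, 50))  # a correlated second network
print(linear_cka(grads_a, grads_b))
```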

Fast Adaptation with Linearized Neural Networks

1 code implementation 2 Mar 2021 Wesley J. Maddox, Shuai Tang, Pablo Garcia Moreno, Andrew Gordon Wilson, Andreas Damianou

The inductive biases of trained neural networks are difficult to understand and, consequently, to adapt to new settings.

Domain Adaptation, Gaussian Processes +2
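
Here "linearized" conventionally refers to a first-order Taylor expansion of the network around its trained weights \theta_0, giving a model that is linear in the parameters and whose Jacobian features can drive transfer to new settings (the snippet does not spell out the exact variant used):

```latex
f_{\mathrm{lin}}(x; \theta) = f(x; \theta_0) + \nabla_\theta f(x; \theta_0)^\top (\theta - \theta_0)
```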

Loss Surface Simplexes for Mode Connecting Volumes and Fast Ensembling

1 code implementation 25 Feb 2021 Gregory W. Benton, Wesley J. Maddox, Sanae Lotfi, Andrew Gordon Wilson

In this paper, we show that there are mode-connecting simplicial complexes that form multi-dimensional manifolds of low loss, connecting many independently trained models.
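
To make the claim concrete: if every convex combination of a set of trained "vertex" models stays in the low-loss region, cheap ensemble members can be drawn by sampling simplex weights. A minimal sketch with random placeholder vertices (the paper's actual procedure trains the vertices so that the whole simplex has low loss):

```python
import numpy as np

def sample_simplex_model(vertex_params, rng):
    """Draw one parameter vector from the simplex spanned by trained vertices.

    vertex_params: array of shape (k, p), each row the flattened weights of a model.
    """
    weights = rng.dirichlet(np.ones(vertex_params.shape[0]))  # uniform over the simplex
    return weights @ vertex_params                            # convex combination

# Toy usage: three "trained" models with ten parameters each.
rng = np.random.default_rng(0)
vertices = rng.standard_normal((3, 10))
theta = sample_simplex_model(vertices, rng)
```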

Subspace Inference for Bayesian Deep Learning

1 code implementation 17 Jul 2019 Pavel Izmailov, Wesley J. Maddox, Polina Kirichenko, Timur Garipov, Dmitry Vetrov, Andrew Gordon Wilson

Bayesian inference was once a gold standard for learning with neural networks, providing accurate full predictive distributions and well calibrated uncertainty.

Bayesian Inference, Image Classification +2
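
The construction, in outline, restricts Bayesian inference to a low-dimensional affine subspace of weight space (built in the paper from quantities such as principal components of the SGD trajectory) and infers a posterior over the subspace coordinates z rather than the full weights:

```latex
\theta(z) = \hat{\theta} + P z, \qquad z \in \mathbb{R}^k, \quad k \ll \dim(\theta)
```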

Kernel Interpolation for Scalable Online Gaussian Processes

2 code implementations 2 Mar 2021 Samuel Stanton, Wesley J. Maddox, Ian Delbridge, Andrew Gordon Wilson

Gaussian processes (GPs) provide a gold standard for performance in online settings, such as sample-efficient control and black box optimization, where we need to update a posterior distribution as we acquire data in a sequential fashion.

Bayesian Optimization, Gaussian Processes
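
For reference, these are the standard GP posterior equations that must be refreshed after each new observation; recomputing them exactly is cubic in the number of data points, which is what motivates the paper's kernel-interpolation approach to cheap online updates:

```latex
\mu(x_*) = k_*^\top (K + \sigma^2 I)^{-1} y, \qquad
\sigma^2(x_*) = k(x_*, x_*) - k_*^\top (K + \sigma^2 I)^{-1} k_*
```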

Rethinking Parameter Counting in Deep Models: Effective Dimensionality Revisited

1 code implementation 4 Mar 2020 Wesley J. Maddox, Gregory Benton, Andrew Gordon Wilson

Neural networks appear to have mysterious generalization properties when using parameter counting as a proxy for complexity.

Model Selection
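
The proposed complexity measure is computed from the spectrum of the Hessian of the loss: high-curvature directions count as effective parameters, while flat directions barely count. A minimal sketch, assuming the definition N_eff(H, z) = \sum_i \lambda_i / (\lambda_i + z) over Hessian eigenvalues \lambda_i:

```python
import numpy as np

def effective_dimensionality(eigenvalues, z):
    """N_eff = sum_i lam_i / (lam_i + z), for Hessian eigenvalues lam_i and scale z > 0."""
    eigenvalues = np.asarray(eigenvalues)
    return float(np.sum(eigenvalues / (eigenvalues + z)))

# Toy spectrum: five sharp directions among a thousand nearly flat ones.
spectrum = np.concatenate([np.full(5, 100.0), np.full(1000, 1e-4)])
print(effective_dimensionality(spectrum, z=1.0))  # ~5, despite 1005 parameters
```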

Function-Space Distributions over Kernels

1 code implementation NeurIPS 2019 Gregory W. Benton, Wesley J. Maddox, Jayson P. Salkey, Julio Albinati, Andrew Gordon Wilson

The resulting approach enables learning of rich representations, with support for any stationary kernel, uncertainty over the values of the kernel, and an interpretable specification of a prior directly over kernels, without requiring sophisticated initialization or manual intervention.

Gaussian Processes, Representation Learning
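
In outline, the approach places a GP prior over the kernel's spectral density; Bochner's theorem then turns any symmetric, nonnegative spectral density S into a valid stationary kernel, which is what gives "support for any stationary kernel":

```latex
k(\tau) = \int S(\omega) \cos(\omega \tau) \, d\omega
```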

Conditioning Sparse Variational Gaussian Processes for Online Decision-making

1 code implementation NeurIPS 2021 Wesley J. Maddox, Samuel Stanton, Andrew Gordon Wilson

With a principled representation of uncertainty and closed form posterior updates, Gaussian processes (GPs) are a natural choice for online decision making.

Active Learning, Decision Making +1
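
The baseline that closed-form conditioning improves on is refitting an exact GP from scratch as each observation arrives, at O(n^3) per step. A minimal numpy sketch of that naive baseline (kernel, noise level, and data are illustrative placeholders; the paper's contribution is making such updates cheap for sparse variational GPs):

```python
import numpy as np

def rbf(a, b, lengthscale=1.0):
    """Squared-exponential kernel between two 1-D input arrays."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / lengthscale**2)

def gp_posterior(x_train, y_train, x_query, noise=0.1):
    """Exact GP posterior mean and variance at the query points."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    k_star = rbf(x_train, x_query)
    mean = k_star.T @ np.linalg.solve(K, y_train)
    cov = rbf(x_query, x_query) - k_star.T @ np.linalg.solve(K, k_star)
    return mean, np.diag(cov)

# Streaming usage: re-condition on the growing dataset after every observation.
x_seen, y_seen = np.array([]), np.array([])
for x_new, y_new in [(0.0, 0.1), (1.0, 0.8), (2.0, 0.9)]:
    x_seen, y_seen = np.append(x_seen, x_new), np.append(y_seen, y_new)
    mu, var = gp_posterior(x_seen, y_seen, np.array([1.5]))
```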

On Uncertainty, Tempering, and Data Augmentation in Bayesian Classification

1 code implementation 30 Mar 2022 Sanyam Kapoor, Wesley J. Maddox, Pavel Izmailov, Andrew Gordon Wilson

In Bayesian regression, we often use a Gaussian observation model, where we control the level of aleatoric uncertainty with a noise variance parameter.

Classification, Data Augmentation
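
The observation model in question is the Gaussian likelihood below, where the noise variance \sigma^2 is an explicit aleatoric-uncertainty knob; a softmax classification likelihood has no directly analogous parameter, which is presumably what brings tempering and data augmentation into the discussion:

```latex
p(y \mid x, f) = \mathcal{N}\big(y;\, f(x),\, \sigma^2\big)
```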

Bayesian Optimization with High-Dimensional Outputs

2 code implementations NeurIPS 2021 Wesley J. Maddox, Maximilian Balandat, Andrew Gordon Wilson, Eytan Bakshy

However, the Gaussian Process (GP) models typically used as probabilistic surrogates for multi-task Bayesian Optimization scale poorly with the number of outcomes, greatly limiting applicability.

Bayesian Optimization, Vocal Bursts Intensity Prediction
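
One common source of the poor scaling: multi-task GPs frequently use a separable covariance over inputs x and outcomes t, so the joint kernel matrix over n points and m outcomes is nm x nm and naive inference costs O(n^3 m^3). Whether and how the paper exploits this structure is not stated in the snippet; the separable form is:

```latex
K\big((x, t), (x', t')\big) = K_X(x, x') \, K_T(t, t') \quad \Longrightarrow \quad K = K_X \otimes K_T
```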

Low-Precision Arithmetic for Fast Gaussian Processes

1 code implementation 14 Jul 2022 Wesley J. Maddox, Andres Potapczynski, Andrew Gordon Wilson

Low-precision arithmetic has had a transformative effect on the training of neural networks, reducing computation, memory and energy requirements.

Gaussian Processes

When are Iterative Gaussian Processes Reliably Accurate?

1 code implementation 31 Dec 2021 Wesley J. Maddox, Sanyam Kapoor, Andrew Gordon Wilson

While recent work on conjugate gradient methods and Lanczos decompositions has achieved scalable Gaussian process inference with highly accurate point predictions, in several implementations these iterative methods appear to struggle with numerical instabilities when learning kernel hyperparameters, and with poor test likelihoods.

Gaussian Processes
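
The iterative methods in question solve linear systems with the kernel matrix using only matrix-vector products. A minimal conjugate gradients sketch for (K + sigma^2 I) v = y, with no preconditioning and a fixed tolerance, which is exactly the regime where the instabilities the paper studies tend to appear:

```python
import numpy as np

def conjugate_gradients(matvec, b, tol=1e-8, max_iters=1000):
    """Solve A v = b for symmetric positive-definite A, given only v -> A v."""
    v = np.zeros_like(b)
    r = b - matvec(v)          # initial residual
    p = r.copy()               # initial search direction
    rs = r @ r
    for _ in range(max_iters):
        Ap = matvec(p)
        alpha = rs / (p @ Ap)  # step length along the search direction
        v += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return v

# Usage on a GP-style system (K + sigma^2 I) v = y without ever factorizing K.
rng = np.random.default_rng(0)
n, sigma2 = 200, 0.1
A = rng.standard_normal((n, n))
K = A @ A.T / n                                  # SPD stand-in for a kernel matrix
y = rng.standard_normal(n)
v = conjugate_gradients(lambda u: K @ u + sigma2 * u, y)
```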

Materials Expert-Artificial Intelligence for Materials Discovery

no code implementations 5 Dec 2023 Yanjun Liu, Milena Jovanovic, Krishnanand Mallayya, Wesley J. Maddox, Andrew Gordon Wilson, Sebastian Klemenz, Leslie M. Schoop, Eun-Ah Kim

The advent of material databases provides an unprecedented opportunity to uncover predictive descriptors for emergent material properties from a vast data space.
