Search Results for author: Andrew B. Duncan

Found 17 papers, 8 papers with code

Training Discrete Energy-Based Models with Energy Discrepancy

no code implementations • 14 Jul 2023 • Tobias Schröder, Zijing Ou, Yingzhen Li, Andrew B. Duncan

Training energy-based models (EBMs) on discrete spaces is challenging because sampling over such spaces can be difficult.

Energy Discrepancies: A Score-Independent Loss for Energy-Based Models

1 code implementation • NeurIPS 2023 • Tobias Schröder, Zijing Ou, Jen Ning Lim, Yingzhen Li, Sebastian J. Vollmer, Andrew B. Duncan

Energy-based models are a simple yet powerful class of probabilistic models, but their widespread adoption has been limited by the computational burden of training them.
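
For context (our notation, not the paper's): an energy-based model specifies a density only up to an intractable normaliser,

```latex
p_\theta(x) = \frac{e^{-U_\theta(x)}}{Z_\theta},
\qquad
Z_\theta = \int e^{-U_\theta(x)}\,\mathrm{d}x .
```

Maximum-likelihood training must contend with $Z_\theta$, typically via MCMC sampling; the energy discrepancy proposed here is a loss that depends on the energy $U_\theta$ alone, avoiding both scores and sampling.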

Using Perturbation to Improve Goodness-of-Fit Tests based on Kernelized Stein Discrepancy

1 code implementation • 28 Apr 2023 • Xing Liu, Andrew B. Duncan, Axel Gandy

Kernelized Stein discrepancy (KSD) is a score-based discrepancy widely used in goodness-of-fit tests.
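
For reference, in its common form (our notation) the squared KSD between a sample distribution $q$ and a target $p$ with score $s_p = \nabla \log p$ is

```latex
\mathrm{KSD}^2(q \,\|\, p)
  = \mathbb{E}_{x, x' \sim q}\big[\, u_p(x, x') \,\big],
```

where the Stein kernel built from a base kernel $k$ is

```latex
u_p(x, x') = s_p(x)^\top s_p(x')\, k(x, x')
  + s_p(x)^\top \nabla_{x'} k(x, x')
  + s_p(x')^\top \nabla_{x} k(x, x')
  + \operatorname{tr}\big( \nabla_x \nabla_{x'} k(x, x') \big).
```

Only the score of $p$ appears, so the target need only be known up to normalisation.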

A High-dimensional Convergence Theorem for U-statistics with Applications to Kernel-based Testing

no code implementations • 11 Feb 2023 • Kevin H. Huang, Xing Liu, Andrew B. Duncan, Axel Gandy

We prove a convergence theorem for U-statistics of degree two, where the data dimension $d$ is allowed to scale with sample size $n$.
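
For reference, a degree-two U-statistic with symmetric kernel $h$ is

```latex
U_n = \binom{n}{2}^{-1} \sum_{1 \le i < j \le n} h(X_i, X_j);
```

KSD- and MMD-type test statistics take exactly this form, which is why the regime where $d$ grows with $n$ matters for kernel-based testing.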


Batch Bayesian Optimization via Particle Gradient Flows

1 code implementation • 10 Sep 2022 • Enrico Crovini, Simon L. Cotter, Konstantinos Zygalakis, Andrew B. Duncan

In this work we reformulate batch Bayesian optimisation (BO) as an optimisation problem over the space of probability measures.
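
As a loose illustration of the particle view (not the paper's algorithm: the acquisition, kernel bandwidth and step size below are toy stand-ins), a batch can be evolved by an SVGD-style update that ascends an acquisition function while a kernel term repels the particles from one another:

```python
# Toy particle flow for selecting a diverse batch of query points.
# The acquisition -(x^2 - 1)^2 (two peaks at +/- 1) is a stand-in.
import numpy as np

def acq_grad(x):
    return -4.0 * x * (x ** 2 - 1.0)               # d/dx of -(x^2 - 1)^2

def svgd_batch(n_particles=8, steps=500, lr=0.01, h=0.5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.normal(scale=0.1, size=n_particles)    # initial batch
    for _ in range(steps):
        diff = x[:, None] - x[None, :]             # pairwise differences
        K = np.exp(-diff ** 2 / (2 * h ** 2))      # RBF kernel matrix
        gK = -diff / h ** 2 * K                    # grad_{x_j} K(x_j, x_i)
        # Driving term (ascent on the acquisition) plus repulsion.
        x = x + lr * (K @ acq_grad(x) + gK.sum(0)) / n_particles
    return x
```

The repulsion term is what keeps the batch from collapsing onto a single acquisition maximiser.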

Bayesian Inference

Ensemble Inference Methods for Models With Noisy and Expensive Likelihoods

no code implementations • 7 Apr 2021 • Oliver R. A. Dunbar, Andrew B. Duncan, Andrew M. Stuart, Marie-Therese Wolfram

The ensemble Kalman methods are shown to behave favourably in the presence of noise in the parameter-to-data map, whereas Langevin methods are adversely affected.
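
A minimal sketch of one ensemble Kalman update of the kind compared here, assuming a generic forward map `G`, data `y` and noise covariance `Gamma` (names are ours, and variants differ, e.g. in whether the data are perturbed per ensemble member):

```python
# One deterministic ensemble Kalman inversion step (illustrative).
import numpy as np

def eki_step(thetas, G, y, Gamma):
    """thetas: (J, p) ensemble; G maps (p,) -> (k,); y: (k,) data."""
    Gs = np.array([G(th) for th in thetas])        # (J, k) forward evaluations
    dth = thetas - thetas.mean(0)
    dG = Gs - Gs.mean(0)
    C_thG = dth.T @ dG / len(thetas)               # (p, k) cross-covariance
    C_GG = dG.T @ dG / len(thetas)                 # (k, k) output covariance
    K = np.linalg.solve(C_GG + Gamma, C_thG.T).T   # Kalman-type gain (p, k)
    # Move each ensemble member towards the data through the gain.
    return thetas + (y - Gs) @ K.T
```

One intuition for the favourable behaviour under noise is that the update uses only ensemble-averaged covariances, never pointwise gradients of the noisy map.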

Blade Envelopes Part II: Multiple Objectives and Inverse Design

1 code implementation • 31 Dec 2020 • Chun Yui Wong, Pranay Seshadri, Ashley Scillitoe, Bryn Noel Ubald, Andrew B. Duncan, Geoffrey Parks

In Part I of this two-part paper, a workflow for the formulation of blade envelopes is described and demonstrated.

Computational Engineering, Finance, and Science

Blade Envelopes Part I: Concept and Methodology

1 code implementation • 22 Nov 2020 • Chun Yui Wong, Pranay Seshadri, Ashley Scillitoe, Andrew B. Duncan, Geoffrey Parks

Blades manufactured through flank and point milling will likely exhibit geometric variability.

Dimensionality Reduction • Computational Engineering, Finance, and Science • Applications

Physics-constrained Bayesian inference of state functions in classical density-functional theory

no code implementations • 7 Oct 2020 • Peter Yatsyshin, Serafim Kalliadasis, Andrew B. Duncan

In our case, the output of the learning algorithm is a probability distribution over a family of free energy functionals, consistent with the observed particle data.

Bayesian Inference • Uncertainty Quantification

Probabilistic Gradients for Fast Calibration of Differential Equation Models

no code implementations • 3 Sep 2020 • Jon Cockayne, Andrew B. Duncan

Calibration of large-scale differential equation models to observational or experimental data is a widespread challenge throughout applied sciences and engineering.

A Kernel Two-Sample Test for Functional Data

1 code implementation • 25 Aug 2020 • George Wynne, Andrew B. Duncan

We propose a nonparametric two-sample test procedure based on Maximum Mean Discrepancy (MMD) for testing the hypothesis that two samples of functions have the same underlying distribution, using kernels defined on function spaces.
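
A hedged sketch of such a test for curves observed on a shared grid, using a squared-exponential kernel on an approximate $L^2$ distance and a permutation null (function names and defaults are illustrative, not the paper's implementation):

```python
# MMD two-sample test for discretised functions (illustrative sketch).
import numpy as np

def mmd2_unbiased(X, Y, lengthscale=1.0, dt=1.0):
    """X: (n, T), Y: (m, T); rows are curves on a shared time grid."""
    def gram(A, B):
        # Riemann-sum approximation of the squared L^2 distance.
        sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1) * dt
        return np.exp(-sq / (2.0 * lengthscale ** 2))
    Kxx, Kyy, Kxy = gram(X, X), gram(Y, Y), gram(X, Y)
    n, m = len(X), len(Y)
    return ((Kxx.sum() - np.trace(Kxx)) / (n * (n - 1))
            + (Kyy.sum() - np.trace(Kyy)) / (m * (m - 1))
            - 2.0 * Kxy.mean())

def permutation_test(X, Y, n_perm=200, seed=0, **kw):
    """Calibrate the test by reshuffling curves across the two samples."""
    rng = np.random.default_rng(seed)
    stat, Z, null = mmd2_unbiased(X, Y, **kw), np.vstack([X, Y]), []
    for _ in range(n_perm):
        idx = rng.permutation(len(Z))
        null.append(mmd2_unbiased(Z[idx[:len(X)]], Z[idx[len(X):]], **kw))
    return stat, (np.sum(np.array(null) >= stat) + 1) / (n_perm + 1)
```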


Minimum Stein Discrepancy Estimators

no code implementations • NeurIPS 2019 • Alessandro Barp, Francois-Xavier Briol, Andrew B. Duncan, Mark Girolami, Lester Mackey

We provide a unifying perspective of these techniques as minimum Stein discrepancy estimators, and use this lens to design new diffusion kernel Stein discrepancy (DKSD) and diffusion score matching (DSM) estimators with complementary strengths.
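
Schematically (our notation), a minimum Stein discrepancy estimator fits $\theta$ by minimising a discrepancy of the form

```latex
\hat{\theta}_n \in \arg\min_\theta \; \mathrm{SD}(q_n \,\|\, p_\theta),
\qquad
\mathrm{SD}(q \,\|\, p)
  = \sup_{f \in \mathcal{F}} \; \mathbb{E}_{x \sim q}\big[ (\mathcal{A}_p f)(x) \big],
```

where $q_n$ is the empirical distribution, $\mathcal{F}$ a chosen function class, and $\mathcal{A}_p$ a Stein operator such as the Langevin operator $(\mathcal{A}_p f)(x) = \nabla \log p(x)^\top f(x) + \nabla \cdot f(x)$; DKSD and DSM correspond to particular choices of operator and function class.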

Statistical Inference for Generative Models with Maximum Mean Discrepancy

no code implementations • 13 Jun 2019 • Francois-Xavier Briol, Alessandro Barp, Andrew B. Duncan, Mark Girolami

While likelihood-based inference and its variants provide a statistically efficient and widely applicable approach to parametric inference, their application to models involving intractable likelihoods poses challenges.
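
Concretely, for a model $P_\theta$ that can be sampled but whose likelihood is intractable, the minimum-MMD estimator takes the form (kernel $k$ fixed; expectations under $P_\theta$ approximated by simulation):

```latex
\hat{\theta}_n \in \arg\min_\theta \; \mathrm{MMD}^2(P_\theta, \hat{P}_n),
\qquad
\mathrm{MMD}^2(P, Q)
  = \mathbb{E}\,[k(x, x')] - 2\,\mathbb{E}\,[k(x, y)] + \mathbb{E}\,[k(y, y')],
```

with $x, x' \sim P$ and $y, y' \sim Q$ drawn independently.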

Piecewise Deterministic Markov Processes for Scalable Monte Carlo on Restricted Domains

4 code implementations • 16 Jan 2017 • Joris Bierkens, Alexandre Bouchard-Côté, Arnaud Doucet, Andrew B. Duncan, Paul Fearnhead, Thibaut Lienart, Gareth Roberts, Sebastian J. Vollmer

Piecewise Deterministic Monte Carlo algorithms enable simulation from a posterior distribution, whilst only needing to access a sub-sample of data at each iteration.
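
As a concrete member of this family, here is a toy one-dimensional Zig-Zag sampler for a standard Gaussian target; the closed-form event-time inversion is specific to this toy energy $U(x) = x^2/2$, and no data sub-sampling is shown:

```python
# 1D Zig-Zag sampler for a standard Gaussian (illustrative toy).
# Between events the state moves at constant velocity v in {-1, +1};
# the velocity flips at rate max(0, v * U'(x)) with U'(x) = x.
import numpy as np

def zigzag_gaussian(n_events=10_000, seed=0):
    rng = np.random.default_rng(seed)
    x, v, t = 0.0, 1.0, 0.0
    times, positions = [0.0], [0.0]
    for _ in range(n_events):
        e = rng.exponential()                  # Exp(1) clock
        a = v * x                              # rate(s) = max(0, a + s)
        # Solve int_0^s max(0, a + u) du = e in closed form.
        s = -a + np.sqrt(max(a, 0.0) ** 2 + 2.0 * e)
        x, v, t = x + v * s, -v, t + s         # drift, flip, advance clock
        times.append(t)
        positions.append(x)
    return np.array(times), np.array(positions)
```

The output is the event skeleton of the trajectory; expectations under the target are computed by integrating along the linear segments between events, not by averaging the skeleton points.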

Methodology • Computation

Measuring Sample Quality with Diffusions

no code implementations • 21 Nov 2016 • Jackson Gorham, Andrew B. Duncan, Sebastian J. Vollmer, Lester Mackey

Stein's method for measuring convergence to a continuous target distribution relies on an operator characterizing the target and Stein factor bounds on the solutions of an associated differential equation.
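
Schematically, the objects referred to here are the Stein operator $\mathcal{A}_p$ (e.g. a Langevin-type operator built from the diffusion) and the associated Stein equation: for a test function $h$, one solves

```latex
(\mathcal{A}_p u_h)(x) = h(x) - \mathbb{E}_{p}[h],
\qquad\text{so that}\qquad
\mathbb{E}_q[h] - \mathbb{E}_p[h] = \mathbb{E}_q\big[(\mathcal{A}_p u_h)(x)\big],
```

and Stein factor bounds on $u_h$ and its derivatives ensure that a small value of the right-hand side over a class of test functions implies closeness of the corresponding expectations.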
