1 code implementation • 5 Jul 2022 • Tianle Zhang, Wenjie Ruan, Jonathan E. Fieldsend
Our experiments demonstrate the effectiveness and flexibility of PRoA in evaluating probabilistic robustness against a broad range of functional perturbations, and show that PRoA scales well to various large-scale deep neural networks compared with existing state-of-the-art baselines.
no code implementations • 1 Mar 2022 • Gregory A. Daly, Jonathan E. Fieldsend, Gavin Tabor
Recent work on regularised and entropic autoencoders has begun to explore the potential, for generative modelling, of removing the variational approach and returning to the classic deterministic autoencoder (DAE) with additional novel regularisation methods.
1 code implementation • 15 Oct 2020 • George De Ath, Richard M. Everson, Jonathan E. Fieldsend
Batch Bayesian optimisation (BO) is a successful technique for the optimisation of expensive black-box functions.
1 code implementation • 17 Apr 2020 • George De Ath, Jonathan E. Fieldsend, Richard M. Everson
We show that the rate of convergence can depend sensitively on the choice of mean function.
1 code implementation • 5 Feb 2020 • George De Ath, Richard M. Everson, Jonathan E. Fieldsend, Alma A. M. Rahat
Bayesian optimisation is a popular, surrogate model-based approach for optimising expensive black-box functions.
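As a minimal illustration of the generic surrogate-based loop this abstract refers to (not the specific method of any paper listed here), the sketch below fits a Gaussian-process surrogate to a cheap stand-in objective and picks each new evaluation by maximising expected improvement over a grid. The toy objective, kernel length scale, grid, and evaluation budget are all illustrative assumptions.

```python
import math
import numpy as np

def rbf_kernel(a, b, length=0.2):
    """Squared-exponential kernel between two 1-D point sets."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(X, y, Xq, jitter=1e-6):
    """Posterior mean and std of a zero-mean GP at query points Xq."""
    K = rbf_kernel(X, X) + jitter * np.eye(len(X))
    Kq = rbf_kernel(X, Xq)
    Kinv = np.linalg.inv(K)
    mu = Kq.T @ Kinv @ y
    # diag of Kqq - Kq^T K^-1 Kq; the RBF kernel's diagonal is 1.
    var = 1.0 - np.sum(Kq * (Kinv @ Kq), axis=0)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best):
    """EI for minimisation: expected amount by which a point beats `best`."""
    z = (best - mu) / sigma
    Phi = np.array([0.5 * (1 + math.erf(v / math.sqrt(2))) for v in z])
    phi = np.exp(-0.5 * z ** 2) / math.sqrt(2 * math.pi)
    return (best - mu) * Phi + sigma * phi

f = lambda x: np.sin(3 * x) + x ** 2   # cheap stand-in for an expensive black box
Xq = np.linspace(-1, 2, 300)           # candidate grid (illustrative)
rng = np.random.default_rng(0)
X = rng.uniform(-1, 2, 3)              # small random initial design
y = f(X)
for _ in range(10):
    # Centre the targets so the zero-mean GP prior is reasonable.
    mu, sigma = gp_posterior(X, y - y.mean(), Xq)
    mu += y.mean()
    x_next = Xq[np.argmax(expected_improvement(mu, sigma, y.min()))]
    X, y = np.append(X, x_next), np.append(y, f(x_next))

best_x, best_y = X[np.argmin(y)], y.min()
print(f"best x = {best_x:.3f}, f = {best_y:.3f}")
```

In practice the acquisition function is optimised continuously rather than on a grid, and the GP hyperparameters are fitted rather than fixed; the grid keeps this sketch short.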
1 code implementation • 28 Nov 2019 • George De Ath, Richard M. Everson, Alma A. M. Rahat, Jonathan E. Fieldsend
The performance of acquisition functions for Bayesian optimisation in locating the global optimum of continuous functions is investigated in terms of the Pareto front between exploration and exploitation.
no code implementations • 25 Apr 2019 • Nicholas D. Sanders, Richard M. Everson, Jonathan E. Fieldsend, Alma A. M. Rahat
We propose a method for robust optimisation using Bayesian optimisation to find a region of design space in which the expensive function's performance is relatively insensitive to the inputs whilst retaining good quality.
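One common way to make the "insensitive yet good" objective concrete (a generic formulation, not this paper's algorithm) is to score each design by the mean of the objective over a small input neighbourhood: a sharp, deep optimum then loses to a broad, slightly shallower one. The toy objective and perturbation range below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def f(x):
    # Sharp, deep minimum near x = 0.3; broad, shallower minimum near x = 0.7.
    return -np.exp(-((x - 0.3) / 0.02) ** 2) - 0.6 * np.exp(-((x - 0.7) / 0.2) ** 2)

xs = np.linspace(0.0, 1.0, 501)

# Nominal optimisation: minimise f itself -> picks the sharp spike.
nominal = xs[np.argmin(f(xs))]

# Robust score: Monte Carlo mean of f over input perturbations of +/- 0.05.
deltas = rng.uniform(-0.05, 0.05, 200)
robust_scores = np.array([f(x + deltas).mean() for x in xs])
robust = xs[np.argmin(robust_scores)]

print(f"nominal optimum ~ {nominal:.2f}, robust optimum ~ {robust:.2f}")
```

The nominal optimum lands on the narrow spike near 0.3, which averages away under perturbation; the robust score instead favours the broad basin near 0.7, whose performance is slightly worse at the centre but far less sensitive to the inputs.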