no code implementations • 23 May 2022 • David Yallup, Will Handley, Mike Hobson, Anthony Lasenby, Pablo Lemos
The true posterior distribution of a Bayesian neural network is massively multimodal.
1 code implementation • 25 Apr 2020 • Kamran Javid, Will Handley, Mike Hobson, Anthony Lasenby
We conduct a thorough analysis of the relationship between the out-of-sample performance and the Bayesian evidence (marginal likelihood) of Bayesian neural networks (BNNs), and also examine the performance of ensembles of BNNs, in both cases using the Boston housing dataset.
no code implementations • 25 Jan 2019 • Xi Chen, Mike Hobson
Estimating the unknown values of parameters (or hidden variables, or control variables) that characterise a physical system often relies on comparing measured data with synthetic data produced by a numerical simulator of the system as the parameter values are varied.
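This comparison can be illustrated with a toy example (not from the paper itself): a hypothetical damped-oscillation "simulator" generates synthetic data, and a simple grid search over a Gaussian log-likelihood recovers the parameter that best matches the measurements.

```python
import numpy as np

# Toy illustration (not the paper's method): infer an unknown decay-rate
# parameter by comparing measured data with synthetic data from a
# simple "simulator" as the parameter value is varied.
def simulator(theta, x):
    # hypothetical forward model: a damped oscillation with decay rate theta
    return np.exp(-theta * x) * np.cos(2 * np.pi * x)

rng = np.random.default_rng(0)
x = np.linspace(0, 5, 50)
true_theta, sigma = 0.7, 0.05
measured = simulator(true_theta, x) + rng.normal(0.0, sigma, x.size)

# Grid search: maximise the Gaussian log-likelihood over candidate values
thetas = np.linspace(0.1, 1.5, 281)
loglikes = np.array([-0.5 * np.sum((measured - simulator(t, x)) ** 2) / sigma**2
                     for t in thetas])
best = thetas[np.argmax(loglikes)]
print(f"recovered theta = {best:.2f} (true value {true_theta})")
```

In practice each simulator call may be expensive, which is what motivates more sample-efficient inference schemes than a brute-force grid.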
3 code implementations • 16 Apr 2018 • Edward Higson, Will Handley, Mike Hobson, Anthony Lasenby
Nested sampling is an increasingly popular technique for Bayesian computation, in particular for multimodal, degenerate and high-dimensional problems.
Computation • Cosmology and Nongalactic Astrophysics • Instrumentation and Methods for Astrophysics • Data Analysis, Statistics and Probability
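The core nested sampling loop can be sketched in a few lines (a minimal toy illustration, not any particular package's implementation): live points drawn from the prior are repeatedly replaced above a rising likelihood threshold, and the evidence is accumulated from the shrinking prior volume. The example assumes a unit Gaussian likelihood with a uniform prior on [-5, 5], for which the evidence is analytically close to 0.1.

```python
import numpy as np

rng = np.random.default_rng(1)

def loglike(theta):
    # unit Gaussian likelihood centred on zero
    return -0.5 * theta**2 - 0.5 * np.log(2 * np.pi)

nlive, niter = 200, 1000
live = rng.uniform(-5, 5, nlive)           # live points drawn from the prior
live_logl = loglike(live)
logZ = -np.inf
logX_prev = 0.0                            # log prior volume, starts at 1

for i in range(niter):
    worst = np.argmin(live_logl)
    logX = -(i + 1) / nlive                # expected shrinkage per iteration
    # weight of the discarded point: X_{i-1} - X_i
    logw = logX_prev + np.log1p(-np.exp(logX - logX_prev))
    logZ = np.logaddexp(logZ, logw + live_logl[worst])
    # replace the worst point with a new prior draw above the threshold
    # (naive rejection sampling; real codes use cleverer proposals)
    while True:
        cand = rng.uniform(-5, 5)
        if loglike(cand) > live_logl[worst]:
            break
    live[worst], live_logl[worst] = cand, loglike(cand)
    logX_prev = logX

# termination: add the contribution of the remaining live points
logZ = np.logaddexp(logZ, logX_prev - np.log(nlive) +
                    np.log(np.sum(np.exp(live_logl))))
print(f"log Z = {logZ:.2f}  (analytic value = {np.log(0.1):.2f})")
```

The rejection step is the bottleneck in realistic problems, which is why production implementations replace it with slice sampling, MCMC, or ellipsoidal proposals.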
2 code implementations • 11 Apr 2017 • Edward Higson, Will Handley, Mike Hobson, Anthony Lasenby
We introduce dynamic nested sampling: a generalisation of the nested sampling algorithm in which the number of "live points" varies to allocate samples more efficiently.
Computation • Instrumentation and Methods for Astrophysics • Data Analysis, Statistics and Probability • Methodology
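The allocation idea behind varying the number of live points can be sketched analytically for a toy problem (an illustration of the concept, not any package's actual algorithm): rank each dead point of a standard run by its posterior importance, the likelihood times the prior-volume shell width, and target the log X range where that importance is concentrated.

```python
import numpy as np

# Toy sketch of dynamic live-point allocation: unit Gaussian likelihood
# with a uniform prior on [-5, 5], using the expected prior-volume
# shrinkage of a standard run with a fixed number of live points.
nlive = 100
i = np.arange(1, 3001)
logX = -i / nlive                            # expected prior-volume shrinkage
X = np.exp(logX)
theta = 5 * X                                # theta(X): |theta| < 5X holds prior mass X
logL = -0.5 * theta**2 - 0.5 * np.log(2 * np.pi)
dX = -np.diff(np.concatenate(([1.0], X)))    # shell widths X_{i-1} - X_i
w = np.exp(logL) * dX                        # posterior importance per dead point
w /= w.sum()

# dynamic allocation would place extra live points where importance is
# significant, e.g. above 10% of its peak (a hypothetical threshold)
keep = w > 0.1 * w.max()
lo, hi = logX[keep].min(), logX[keep].max()
print(f"extra samples targeted at log X in [{lo:.1f}, {hi:.1f}]")
```

For parameter estimation most of the importance sits in a narrow interior band of log X, so concentrating live points there improves efficiency over a constant allocation.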
2 code implementations • 28 Mar 2017 • Edward Higson, Will Handley, Mike Hobson, Anthony Lasenby
Sampling errors in nested sampling parameter estimation differ from those in Bayesian evidence calculation, but have received little attention in the literature.
Methodology • Instrumentation and Methods for Astrophysics • Applications