no code implementations • 11 Jul 2023 • Jack Jewson, Sahra Ghalebikesabi, Chris Holmes
To ameliorate this, we propose $\beta$D-Bayes, a posterior sampling scheme from a generalised posterior targeting the minimisation of the $\beta$-divergence between the model and the data generating process.
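A minimal sketch (not the paper's implementation) of what a generalised posterior targeting the $\beta$-divergence can look like for a Gaussian location model: the log-likelihood in Bayes' rule is replaced by the negative $\beta$-divergence loss, whose integral term has a closed form for the Gaussian, and the posterior is sampled with a plain Metropolis-Hastings walk. All function names, the flat prior, and the choice $\beta = 0.5$ are illustrative assumptions.

```python
import numpy as np

def beta_loss(x, mu, sigma, beta):
    # beta-divergence loss for a N(mu, sigma^2) model:
    # -f(x|theta)^beta / beta + (1/(beta+1)) * int f(y|theta)^{beta+1} dy,
    # with the integral available in closed form for the Gaussian.
    f = np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    integral = (2 * np.pi * sigma**2) ** (-beta / 2) / np.sqrt(1 + beta)
    return -f**beta / beta + integral / (1 + beta)

def log_gen_post(mu, data, sigma, beta):
    # generalised posterior with a flat prior on mu:
    # log pi(mu | data) = -sum_i loss(x_i, mu) + const
    return -np.sum(beta_loss(data, mu, sigma, beta))

def mh_sample(data, sigma=1.0, beta=0.5, n_iter=5000, step=0.3, seed=0):
    # random-walk Metropolis-Hastings on the generalised posterior
    rng = np.random.default_rng(seed)
    mu, lp = 0.0, log_gen_post(0.0, data, sigma, beta)
    samples = []
    for _ in range(n_iter):
        prop = mu + step * rng.standard_normal()
        lp_prop = log_gen_post(prop, data, sigma, beta)
        if np.log(rng.uniform()) < lp_prop - lp:
            mu, lp = prop, lp_prop
        samples.append(mu)
    return np.array(samples[n_iter // 2:])  # discard burn-in

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(2.0, 1.0, 50), [25.0]])  # one gross outlier
post = mh_sample(data)
```

Because the loss weights each observation by its model density raised to the power $\beta$, the gross outlier contributes almost nothing, so the posterior for `mu` concentrates near the inlier mean of 2 rather than being dragged toward the contaminated sample mean, which is the robustness property the abstract refers to.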
no code implementations • 24 Aug 2021 • Sahra Ghalebikesabi, Harrison Wilde, Jack Jewson, Arnaud Doucet, Sebastian Vollmer, Chris Holmes
Increasing interest in privacy-preserving machine learning has led to new and evolved approaches for generating private synthetic data from undisclosed real data.
no code implementations • 16 Nov 2020 • Harrison Wilde, Jack Jewson, Sebastian Vollmer, Chris Holmes
There is significant growth and interest in the use of synthetic data as an enabler for machine learning in environments where the release of real data is restricted due to privacy or availability constraints.
1 code implementation • 3 Apr 2019 • Jeremias Knoblauch, Jack Jewson, Theodoros Damoulas
We advocate an optimization-centric view of Bayesian inference and introduce a novel generalization of it.
1 code implementation • NeurIPS 2018 • Jeremias Knoblauch, Jack Jewson, Theodoros Damoulas
The resulting inference procedure is doubly robust for both the parameter and the changepoint (CP) posterior, with linear time and constant space complexity.