1 code implementation • 9 Jun 2023 • Masaki Adachi, Satoshi Hayakawa, Martin Jørgensen, Xingchen Wan, Vu Nguyen, Harald Oberhauser, Michael A. Osborne
Active learning parallelization is widely used, but typically relies on fixing the batch size throughout experimentation.
1 code implementation • 15 Mar 2023 • Saad Hamid, Xingchen Wan, Martin Jørgensen, Binxin Ru, Michael Osborne
Ensembling can improve the performance of Neural Networks, but existing approaches struggle when the architecture likelihood surface has dispersed, narrow peaks.
1 code implementation • 27 Jan 2023 • Masaki Adachi, Satoshi Hayakawa, Saad Hamid, Martin Jørgensen, Harald Oberhauser, Michael A. Osborne
Batch Bayesian optimisation and Bayesian quadrature have been shown to be sample-efficient methods of performing optimisation and quadrature where expensive-to-evaluate objective functions can be queried in parallel.
no code implementations • 1 Sep 2022 • Martin Jørgensen, Michael A. Osborne
We introduce a kernel that allows the number of summarising variables to grow exponentially with the number of input features, but requires only linear cost in both number of observations and input features.
2 code implementations • 9 Jun 2022 • Masaki Adachi, Satoshi Hayakawa, Martin Jørgensen, Harald Oberhauser, Michael A. Osborne
Empirically, we find that our approach significantly outperforms the sampling efficiency of both state-of-the-art BQ techniques and Nested Sampling in various real-world datasets, including lithium-ion battery analytics.
1 code implementation • 14 Jun 2021 • Pola Schwöbel, Martin Jørgensen, Sebastian W. Ober, Mark van der Wilk
Computing the marginal likelihood is hard for neural networks, but the success of tractable approaches that compute the marginal likelihood for only the last layer raises the question of whether this convenient approach might also be employed for learning invariances.
no code implementations • ICCV 2021 • Frederik Warburg, Martin Jørgensen, Javier Civera, Søren Hauberg
Uncertainty quantification in image retrieval is crucial for downstream decisions, yet it remains a challenging and largely unexplored problem.
no code implementations • 12 Aug 2020 • Martin Jørgensen, Søren Hauberg
This study investigates one such invariant: the causal relationship between X and Y is invariant to the marginal distributions of X and Y.
1 code implementation • ICML 2020 • Martin Jørgensen, Marc Peter Deisenroth, Hugh Salimbeni
We present a Bayesian non-parametric way of inferring stochastic differential equations for both regression tasks and continuous-time dynamical modelling.
no code implementations • 21 Jun 2020 • Martin Jørgensen, Søren Hauberg
We present a probabilistic model where the latent variable respects both the distances and the topology of the modeled data.
1 code implementation • 7 Apr 2020 • Pola Schwöbel, Frederik Warburg, Martin Jørgensen, Kristoffer H. Madsen, Søren Hauberg
Spatial Transformer Networks (STNs) estimate image transformations that can improve downstream tasks by 'zooming in' on relevant regions in an image.
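The 'zooming in' that STNs perform amounts to resampling the image on a scaled and shifted coordinate grid. A minimal sketch of that resampling step, with fixed (rather than learned) transformation parameters and nearest-neighbour sampling for brevity; the function name `zoom_transform` is illustrative, not from the paper's code:

```python
import numpy as np

def zoom_transform(image, scale, center):
    """Sample a zoomed view of a 2D image.

    Coordinates are normalised to [-1, 1]; scale < 1 zooms in,
    `center` shifts the crop. An STN would predict (scale, center)
    from the image itself; here they are fixed inputs.
    """
    H, W = image.shape
    out = np.empty_like(image)
    ys = np.linspace(-1.0, 1.0, H)
    xs = np.linspace(-1.0, 1.0, W)
    for i, y in enumerate(ys):
        for j, x in enumerate(xs):
            # Map each output coordinate back into the source image.
            sy = scale * y + center[0]
            sx = scale * x + center[1]
            ii = int(round((sy + 1.0) / 2.0 * (H - 1)))
            jj = int(round((sx + 1.0) / 2.0 * (W - 1)))
            # Clamp to the image boundary.
            ii = min(max(ii, 0), H - 1)
            jj = min(max(jj, 0), W - 1)
            out[i, j] = image[ii, jj]
    return out
```

With `scale=1.0` and `center=(0, 0)` this is the identity map; a real STN uses differentiable bilinear sampling so the predicted parameters can be trained end to end.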
2 code implementations • NeurIPS 2019 • Nicki S. Detlefsen, Martin Jørgensen, Søren Hauberg
We propose and investigate new complementary methodologies for estimating predictive variance networks in regression neural networks.
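Predictive variance networks output both a mean and a variance per input and are typically fit by minimising a Gaussian negative log-likelihood; a minimal sketch of that per-point loss (the paper's contribution is the complementary methodologies for making such training reliable, which this sketch does not include):

```python
import numpy as np

def gaussian_nll(y, mu, sigma2):
    """Negative log-likelihood of observation y under N(mu, sigma2).

    In a variance network, mu and sigma2 are both predicted by the
    model, so the loss trades off fit errors against claimed confidence.
    """
    return 0.5 * (np.log(2.0 * np.pi * sigma2) + (y - mu) ** 2 / sigma2)
```

Note the trade-off the loss encodes: an overconfident prediction (small `sigma2` with a large residual) is penalised heavily by the quadratic term, while inflating `sigma2` everywhere is penalised by the log term.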