Search Results for author: Joeri Hermans

Found 8 papers, 8 papers with code

Towards Reliable Simulation-Based Inference with Balanced Neural Ratio Estimation

1 code implementation · 29 Aug 2022 · Arnaud Delaunoy, Joeri Hermans, François Rozet, Antoine Wehenkel, Gilles Louppe

In this work, we introduce Balanced Neural Ratio Estimation (BNRE), a variation of the NRE algorithm designed to produce posterior approximations that tend to be more conservative, hence improving their reliability, while sharing the same Bayes optimal solution.
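The core of BNRE is the standard NRE classification objective plus a regularizer that enforces a balancing condition on the classifier's outputs over joint and marginal pairs. A minimal numpy sketch of such a loss is below; the function name, the penalty weight `lam`, and the dummy classifier outputs are illustrative, not the authors' implementation.

```python
import numpy as np

def bnre_loss(d_joint, d_marginal, lam=100.0):
    """Sketch of a balanced NRE loss: the usual NRE binary cross-entropy
    (joint pairs labeled 1, marginal pairs labeled 0) plus a penalty on
    the balancing condition E[d(theta,x)] + E[d(theta',x)] = 1."""
    eps = 1e-12
    # Standard NRE objective: classify joint vs. marginal (theta, x) pairs.
    bce = -np.mean(np.log(d_joint + eps)) - np.mean(np.log(1.0 - d_marginal + eps))
    # Balancing regularizer that biases the estimator toward conservative posteriors.
    balance = (np.mean(d_joint) + np.mean(d_marginal) - 1.0) ** 2
    return bce + lam * balance

# Toy usage with dummy classifier outputs in (0, 1):
rng = np.random.default_rng(0)
d_j = rng.uniform(0.6, 0.9, size=128)   # classifier outputs on joint pairs
d_m = rng.uniform(0.1, 0.4, size=128)   # classifier outputs on marginal pairs
loss = bnre_loss(d_j, d_m)
```

When the balancing condition holds exactly, the penalty vanishes and only the cross-entropy term remains.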

A Trust Crisis In Simulation-Based Inference? Your Posterior Approximations Can Be Unfaithful

4 code implementations · 13 Oct 2021 · Joeri Hermans, Arnaud Delaunoy, François Rozet, Antoine Wehenkel, Volodimir Begy, Gilles Louppe

We present extensive empirical evidence showing that current Bayesian simulation-based inference algorithms can produce computationally unfaithful posterior approximations.
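One common way to probe faithfulness of a posterior approximation (not necessarily the paper's exact protocol) is an empirical coverage check: draw ground-truth parameters from the prior, simulate observations, and count how often a nominal 95% credible interval contains the truth. The toy conjugate-Gaussian model below stands in for an approximate posterior.

```python
import numpy as np

# Illustrative coverage diagnostic on a toy Gaussian model: for a faithful
# posterior, nominal 95% credible intervals should contain the ground truth
# about 95% of the time; unfaithful (overconfident) posteriors fall short.
rng = np.random.default_rng(1)
sigma_prior, sigma_noise = 1.0, 0.5
n_trials, hits = 2000, 0
for _ in range(n_trials):
    theta = rng.normal(0.0, sigma_prior)      # ground truth drawn from the prior
    x = rng.normal(theta, sigma_noise)        # simulated observation
    # Exact conjugate posterior stands in for a learned approximation here.
    var_post = 1.0 / (1.0 / sigma_prior**2 + 1.0 / sigma_noise**2)
    mu_post = var_post * x / sigma_noise**2
    sd_post = np.sqrt(var_post)
    lo, hi = mu_post - 1.96 * sd_post, mu_post + 1.96 * sd_post
    hits += (lo <= theta <= hi)
coverage = hits / n_trials                    # near 0.95 for a faithful posterior
```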

Towards constraining warm dark matter with stellar streams through neural simulation-based inference

1 code implementation · 30 Nov 2020 · Joeri Hermans, Nilanjan Banik, Christoph Weniger, Gianfranco Bertone, Gilles Louppe

A statistical analysis of the observed perturbations in the density of stellar streams can in principle set stringent constraints on the mass function of dark matter subhaloes, which in turn can be used to constrain the mass of the dark matter particle.


Mining for Dark Matter Substructure: Inferring subhalo population properties from strong lenses with machine learning

3 code implementations · 4 Sep 2019 · Johann Brehmer, Siddharth Mishra-Sharma, Joeri Hermans, Gilles Louppe, Kyle Cranmer

The subtle and unique imprint of dark matter substructure on extended arcs in strong lensing systems contains a wealth of information about the properties and distribution of dark matter on small scales and, consequently, about the underlying particle physics.


Likelihood-free MCMC with Amortized Approximate Ratio Estimators

5 code implementations · ICML 2020 · Joeri Hermans, Volodimir Begy, Gilles Louppe

This work introduces a novel approach to address the intractability of the likelihood and the marginal model.
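The idea of amortized ratio estimation is that a learned likelihood-to-evidence ratio log r(x | θ) can replace the intractable likelihood inside a Metropolis-Hastings acceptance step. The sketch below uses an analytic Gaussian log-ratio as a stand-in for the trained network; the observation, prior, and proposal scale are all illustrative.

```python
import numpy as np

# Metropolis-Hastings where the intractable likelihood is replaced by an
# amortized log likelihood-to-evidence ratio log r(x | theta). An analytic
# Gaussian ratio stands in for a trained neural ratio estimator.
rng = np.random.default_rng(2)
x_obs, sigma = 1.0, 0.5          # hypothetical observation and noise scale

def log_ratio(x, theta):
    # Stand-in for the neural estimator: log p(x | theta) up to a
    # theta-independent constant, which is all the MH ratio requires.
    return -0.5 * (x - theta) ** 2 / sigma**2

def log_prior(theta):
    return -0.5 * theta**2       # standard normal prior, up to a constant

theta, chain = 0.0, []
for _ in range(20000):
    prop = theta + rng.normal(0.0, 0.5)       # symmetric random-walk proposal
    log_alpha = (log_ratio(x_obs, prop) + log_prior(prop)
                 - log_ratio(x_obs, theta) - log_prior(theta))
    if np.log(rng.uniform()) < log_alpha:
        theta = prop
    chain.append(theta)
post_mean = np.mean(chain[5000:])   # discard burn-in before summarizing
```

For this conjugate toy model the exact posterior mean is x_obs / (1 + sigma^2) = 0.8, so the chain average should land nearby.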

Gradient Energy Matching for Distributed Asynchronous Gradient Descent

2 code implementations · 22 May 2018 · Joeri Hermans, Gilles Louppe

Distributed asynchronous SGD has become widely used for deep learning in large-scale systems, but remains notorious for its instability when increasing the number of workers.

Accumulated Gradient Normalization

1 code implementation · 6 Oct 2017 · Joeri Hermans, Gerasimos Spanakis, Rico Möckel

This work addresses the instability in asynchronous data parallel optimization.

Adversarial Variational Optimization of Non-Differentiable Simulators

2 code implementations · 22 Jul 2017 · Gilles Louppe, Joeri Hermans, Kyle Cranmer

We adapt the training procedure of generative adversarial networks by replacing the differentiable generative network with a domain-specific simulator.
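Because the simulator is non-differentiable, gradients cannot flow through it; instead, a variational distribution q(θ | ψ) is placed over the simulator's parameters and its parameters ψ are updated with a score-function (REINFORCE) estimator. A minimal sketch of that gradient path, with a hypothetical black-box simulator and a simple moment-matching loss in place of the adversarial critic:

```python
import numpy as np

# Variational-optimization sketch: the simulator is a non-differentiable
# black box, so gradients w.r.t. the proposal mean mu flow through the
# score function of q(theta | mu) = N(mu, sd^2), never through the simulator.
rng = np.random.default_rng(3)

def simulator(theta):
    # Hypothetical black-box simulator; the rounding makes it
    # explicitly non-differentiable in theta.
    return np.round(theta + rng.normal(0.0, 0.1, size=theta.shape), 2)

x_obs = simulator(np.full(256, 2.0))   # synthetic "observed" data, true theta = 2
mu, sd, lr = 0.0, 0.5, 0.05            # proposal parameters and step size
for _ in range(500):
    thetas = rng.normal(mu, sd, size=64)
    # Simple moment-matching loss per sample (a stand-in for the critic).
    losses = np.array([(simulator(np.full(8, t)).mean() - x_obs.mean()) ** 2
                       for t in thetas])
    score = (thetas - mu) / sd**2      # d/dmu of log N(theta; mu, sd)
    grad = np.mean((losses - losses.mean()) * score)   # baseline-reduced REINFORCE
    mu -= lr * grad
# mu should drift toward the data-generating value (2.0 in this toy setup).
```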
