1 code implementation • 5 Feb 2024 • Paul Hagemann, Johannes Hertrich, Maren Casfor, Sebastian Heidenreich, Gabriele Steidl
Motivated by indirect measurements and applications from nanometrology with a mixed noise model, we develop a novel algorithm for jointly estimating the posterior and the noise parameters in Bayesian inverse problems.
no code implementations • 16 Jan 2024 • Johannes Hertrich
Kernel-based methods are heavily used in machine learning.
no code implementations • 27 Dec 2023 • Moritz Piening, Fabian Altekrüger, Johannes Hertrich, Paul Hagemann, Andrea Walther, Gabriele Steidl
The solution of inverse problems is of fundamental interest in medical and astronomical imaging, geophysics as well as engineering and life sciences.
1 code implementation • 4 Oct 2023 • Paul Hagemann, Johannes Hertrich, Fabian Altekrüger, Robert Beinert, Jannis Chemseddine, Gabriele Steidl
We propose conditional flows of the maximum mean discrepancy (MMD) with the negative distance kernel for posterior sampling and conditional generative modeling.
1 code implementation • 19 May 2023 • Johannes Hertrich, Christian Wald, Fabian Altekrüger, Paul Hagemann
We prove that the MMD of Riesz kernels, which is also known as energy distance, coincides with the MMD of their sliced version.
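This identity can be checked numerically. The sketch below (not the paper's code) compares the d-dimensional squared MMD with the negative distance (Riesz, r = 1) kernel against the average of 1D squared MMDs over random projections, rescaled by the known constant c_d = sqrt(pi)·Gamma((d+1)/2)/Gamma(d/2), which equals 2 for d = 3; sample sizes and the number of projections are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, P = 3, 80, 2000  # dimension, sample size, number of projections
X = rng.normal(size=(n, d))
Y = rng.normal(size=(n, d)) + 1.0

def energy_dist_sq(A, B):
    # MMD^2 with K(a,b) = -||a-b||, i.e. the squared energy distance
    dab = np.linalg.norm(A[:, None] - B[None], axis=-1).mean()
    daa = np.linalg.norm(A[:, None] - A[None], axis=-1).mean()
    dbb = np.linalg.norm(B[:, None] - B[None], axis=-1).mean()
    return 2 * dab - daa - dbb

direct = energy_dist_sq(X, Y)

# sliced estimate: average the 1D energy distance over random directions
dirs = rng.normal(size=(P, d))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
sliced = np.mean([energy_dist_sq((X @ v)[:, None], (Y @ v)[:, None]) for v in dirs])

c_d = 2.0  # sqrt(pi)*Gamma((d+1)/2)/Gamma(d/2) for d = 3
print(direct, c_d * sliced)  # agree up to Monte Carlo error in the projections
```

Because the slicing identity holds exactly per pair of points, the only error here is the Monte Carlo error over the P random directions.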
1 code implementation • 27 Mar 2023 • Giovanni S. Alberti, Johannes Hertrich, Matteo Santacesaria, Silvia Sciutto
Representing a manifold of very high-dimensional data with generative models has been shown to be computationally efficient in practice.
1 code implementation • 27 Jan 2023 • Fabian Altekrüger, Johannes Hertrich, Gabriele Steidl
Wasserstein gradient flows of maximum mean discrepancy (MMD) functionals with non-smooth Riesz kernels show a rich structure as singular measures can become absolutely continuous ones and conversely.
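A crude way to get intuition for such flows is a particle discretization: move particles by explicit Euler steps on the negative gradient of the MMD functional with the negative distance kernel. This is a minimal sketch with arbitrary step size and iteration count, not the paper's discretization scheme.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, d = 50, 50, 2
x = rng.normal(size=(n, d))               # particles (initial measure)
y = rng.normal(size=(m, d)) * 0.5 + 3.0   # samples of the target measure

def mmd_sq(x, y):
    # MMD^2 with the negative distance kernel K(a,b) = -||a-b||
    dxy = np.linalg.norm(x[:, None] - y[None], axis=-1)
    dxx = np.linalg.norm(x[:, None] - x[None], axis=-1)
    dyy = np.linalg.norm(y[:, None] - y[None], axis=-1)
    return 2 * dxy.mean() - dxx.mean() - dyy.mean()

def grad(x, y):
    # gradient of mmd_sq with respect to each particle (sums of unit vectors)
    diff_xy = x[:, None] - y[None]
    nxy = np.maximum(np.linalg.norm(diff_xy, axis=-1, keepdims=True), 1e-12)
    g = 2 / (len(x) * len(y)) * (diff_xy / nxy).sum(1)
    diff_xx = x[:, None] - x[None]
    nxx = np.linalg.norm(diff_xx, axis=-1, keepdims=True)
    g -= 2 / len(x) ** 2 * np.where(nxx > 1e-12,
                                    diff_xx / np.maximum(nxx, 1e-12), 0.0).sum(1)
    return g

start = mmd_sq(x, y)
for _ in range(400):            # explicit Euler steps
    x -= 0.5 * grad(x, y)
final = mmd_sq(x, y)
print(start, final)             # the discrepancy decreases along the flow
```

The interaction term (repulsion among particles) is what lets an initially singular particle cloud spread out, mirroring the singular-to-absolutely-continuous behavior described above.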
1 code implementation • 30 Nov 2022 • Johannes Hertrich
Normalizing flows are a powerful tool for generative modelling, density estimation and posterior reconstruction in Bayesian inverse problems.
1 code implementation • 24 May 2022 • Fabian Altekrüger, Alexander Denker, Paul Hagemann, Johannes Hertrich, Peter Maass, Gabriele Steidl

Learning neural networks from only limited available information is an important ongoing research topic with tremendous potential for applications.
1 code implementation • 20 Jan 2022 • Fabian Altekrüger, Johannes Hertrich
Exploiting image patches instead of whole images has proved to be a powerful approach for tackling various problems in image processing.
1 code implementation • 24 Nov 2021 • Paul Hagemann, Johannes Hertrich, Gabriele Steidl
Normalizing flows, diffusion normalizing flows and variational autoencoders are powerful generative models.
1 code implementation • 27 Sep 2021 • Johannes Hertrich, Antoine Houdard, Claudia Redenbach
The proposed regularizer penalizes the $W_2$-distance between the patch distribution of the reconstruction and the patch distribution of some reference image at different scales.
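The flavor of such a patch-distribution penalty can be sketched as follows. Since exact $W_2$ between high-dimensional patch clouds needs an OT solver, this toy uses a sliced-Wasserstein surrogate (sorted 1D projections) on equal-size patch sets; the images, patch size, and stride are illustrative stand-ins, not the paper's setup.

```python
import numpy as np

def patches(img, p=4, stride=2):
    # extract all p x p patches with the given stride, flattened to vectors
    H, W = img.shape
    return np.array([img[i:i + p, j:j + p].ravel()
                     for i in range(0, H - p + 1, stride)
                     for j in range(0, W - p + 1, stride)])

def sliced_w2(A, B, n_proj=64, seed=0):
    # sliced surrogate for W2^2 between two equal-size point clouds:
    # project to 1D, sort, and average squared quantile differences
    rng = np.random.default_rng(seed)
    acc = 0.0
    for _ in range(n_proj):
        v = rng.normal(size=A.shape[1])
        v /= np.linalg.norm(v)
        acc += np.mean((np.sort(A @ v) - np.sort(B @ v)) ** 2)
    return acc / n_proj

rng = np.random.default_rng(1)
ref = rng.random((32, 32))                    # stand-in reference image
rec = ref + 0.3 * rng.normal(size=ref.shape)  # stand-in noisy reconstruction
penalty = sliced_w2(patches(rec), patches(ref))
print(penalty)  # zero iff the patch distributions coincide (up to slicing)
```

In a reconstruction method this scalar would be added, suitably weighted, to a data-fidelity term and minimized over the reconstruction.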
1 code implementation • 23 Sep 2021 • Paul Hagemann, Johannes Hertrich, Gabriele Steidl
To overcome topological constraints and improve the expressiveness of normalizing flow architectures, Wu, Köhler and Noé introduced stochastic normalizing flows which combine deterministic, learnable flow transformations with stochastic sampling methods.
1 code implementation • 4 Nov 2020 • Johannes Hertrich, Sebastian Neumayer, Gabriele Steidl
In this paper, we introduce convolutional proximal neural networks (cPNNs), which are by construction averaged operators.
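The key structural property of such layers can be verified numerically. The sketch below builds one dense layer of the form x ↦ Tᵀσ(Tx + b) with T having orthonormal rows (obtained via QR as a stand-in for the paper's Stiefel-constrained convolution filters) and σ = ReLU, which is a proximity operator, and checks that the layer is nonexpansive; averagedness implies this, though this toy check does not establish averagedness itself.

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 16, 8
# T of shape (m, d) with orthonormal rows, T @ T.T = I_m
T = np.linalg.qr(rng.normal(size=(d, m)))[0].T
b = rng.normal(size=m)

def layer(x):
    # one proximal-network-style layer: x -> T^T sigma(T x + b), sigma = ReLU
    return T.T @ np.maximum(T @ x + b, 0.0)

# nonexpansiveness check: ||f(x) - f(y)|| <= ||x - y|| for random inputs
x, y = rng.normal(size=d), rng.normal(size=d)
lhs = np.linalg.norm(layer(x) - layer(y))
rhs = np.linalg.norm(x - y)
print(lhs, rhs)  # lhs never exceeds rhs
```

The inequality follows because ReLU is componentwise 1-Lipschitz and T has orthonormal rows, so both T and Tᵀ are norm-preserving on the relevant subspaces; stacking such layers keeps the whole network nonexpansive.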
1 code implementation • 16 Sep 2020 • Johannes Hertrich, Dang-Phuong-Lan Nguyen, Jean-François Aujol, Dominique Bernard, Yannick Berthoumieu, Abdellatif Saadaldin, Gabriele Steidl
To learn the (low dimensional) parameters of the mixture model we propose an EM algorithm whose M-step requires the solution of constrained optimization problems.
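For orientation, here is the basic EM loop for a toy isotropic Gaussian mixture in 2D. The data, initialization, and closed-form M-step are illustrative only: the paper's M-step instead solves constrained optimization problems for its low-dimensional mixture parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
# toy data: two well-separated isotropic Gaussian clusters in 2D
X = np.vstack([rng.normal(0.0, 1.0, (200, 2)),
               rng.normal(6.0, 1.0, (200, 2))])

K = 2
mu = np.array([X[0], X[-1]])     # one initial mean drawn from each cluster
var = np.ones(K)                 # isotropic variances
pi = np.full(K, 1.0 / K)         # mixture weights

for _ in range(50):
    # E-step: responsibilities under isotropic Gaussian components
    d2 = ((X[:, None] - mu[None]) ** 2).sum(-1)          # (N, K) squared dists
    logp = -0.5 * d2 / var - np.log(2 * np.pi * var) + np.log(pi)
    logp -= logp.max(1, keepdims=True)                   # numerical stability
    r = np.exp(logp)
    r /= r.sum(1, keepdims=True)
    # M-step: closed-form updates for the unconstrained toy model
    Nk = r.sum(0)
    pi = Nk / len(X)
    mu = (r.T @ X) / Nk[:, None]
    d2 = ((X[:, None] - mu[None]) ** 2).sum(-1)
    var = (r * d2).sum(0) / (2 * Nk)                     # divide by dim * Nk

print(mu)  # the estimated means recover the two cluster centers
```

Replacing the closed-form M-step with a constrained solver, while keeping the E-step, is the structural change the abstract refers to.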