Search Results for author: Johannes Hertrich

Found 15 papers, 13 papers with code

Mixed Noise and Posterior Estimation with Conditional DeepGEM

1 code implementation • 5 Feb 2024 • Paul Hagemann, Johannes Hertrich, Maren Casfor, Sebastian Heidenreich, Gabriele Steidl

Motivated by indirect measurements and applications from nanometrology with a mixed noise model, we develop a novel algorithm for jointly estimating the posterior and the noise parameters in Bayesian inverse problems.

Learning from small data sets: Patch-based regularizers in inverse problems for image reconstruction

no code implementations • 27 Dec 2023 • Moritz Piening, Fabian Altekrüger, Johannes Hertrich, Paul Hagemann, Andrea Walther, Gabriele Steidl

The solution of inverse problems is of fundamental interest in medical and astronomical imaging, geophysics as well as engineering and life sciences.

Geophysics • Image Reconstruction +2

Posterior Sampling Based on Gradient Flows of the MMD with Negative Distance Kernel

1 code implementation • 4 Oct 2023 • Paul Hagemann, Johannes Hertrich, Fabian Altekrüger, Robert Beinert, Jannis Chemseddine, Gabriele Steidl

We propose conditional flows of the maximum mean discrepancy (MMD) with the negative distance kernel for posterior sampling and conditional generative modeling.

Conditional Image Generation
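
As a quick, illustrative sketch (not the authors' code): the MMD with the negative distance kernel $K(a,b) = -\|a-b\|$ reduces to the classical energy distance between two samples, which can be estimated from pairwise distances alone.

```python
import numpy as np

def mmd_negative_distance(x, y):
    """Biased (V-statistic) estimate of the squared MMD with the negative
    distance kernel K(a, b) = -||a - b||, i.e. the energy distance."""
    def mean_pairwise_dist(a, b):
        diffs = a[:, None, :] - b[None, :, :]
        return np.linalg.norm(diffs, axis=-1).mean()
    # MMD^2 = E[K(X,X')] + E[K(Y,Y')] - 2 E[K(X,Y)] with K = -distance
    return 2 * mean_pairwise_dist(x, y) - mean_pairwise_dist(x, x) - mean_pairwise_dist(y, y)

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(500, 2))
y = rng.normal(2.0, 1.0, size=(500, 2))
print(mmd_negative_distance(x, x))  # exactly 0 up to rounding
print(mmd_negative_distance(x, y))  # strictly positive for shifted samples
```

The estimator is quadratic in the sample size; the papers below exploit slicing to avoid this cost.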

Generative Sliced MMD Flows with Riesz Kernels

1 code implementation • 19 May 2023 • Johannes Hertrich, Christian Wald, Fabian Altekrüger, Paul Hagemann

We prove that the MMD of Riesz kernels, which is also known as energy distance, coincides with the MMD of their sliced version.

Image Generation
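
The practical payoff of slicing is that the MMD computation reduces to one-dimensional problems, where the quadratic pairwise sum can be evaluated in $O(n \log n)$ by sorting. A small self-contained check (illustrative only, assuming the standard sorted-sum identity):

```python
import numpy as np

def mean_abs_diff_naive(x):
    # O(n^2): mean pairwise absolute difference of a 1D sample
    return np.abs(x[:, None] - x[None, :]).mean()

def mean_abs_diff_sorted(x):
    # O(n log n): for sorted x, sum_{i<j} (x_(j) - x_(i)) = sum_i (2i - n + 1) x_(i)
    n = len(x)
    xs = np.sort(x)
    coeffs = 2 * np.arange(n) - n + 1
    return 2 * np.dot(coeffs, xs) / n**2

x = np.random.default_rng(1).normal(size=1000)
print(np.isclose(mean_abs_diff_naive(x), mean_abs_diff_sorted(x)))  # True
```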

Manifold Learning by Mixture Models of VAEs for Inverse Problems

1 code implementation • 27 Mar 2023 • Giovanni S. Alberti, Johannes Hertrich, Matteo Santacesaria, Silvia Sciutto

Representing a manifold of very high-dimensional data with generative models has been shown to be computationally efficient in practice.

Deblurring

Neural Wasserstein Gradient Flows for Maximum Mean Discrepancies with Riesz Kernels

1 code implementation • 27 Jan 2023 • Fabian Altekrüger, Johannes Hertrich, Gabriele Steidl

Wasserstein gradient flows of maximum mean discrepancy (MMD) functionals with non-smooth Riesz kernels show a rich structure as singular measures can become absolutely continuous ones and conversely.

Proximal Residual Flows for Bayesian Inverse Problems

1 code implementation • 30 Nov 2022 • Johannes Hertrich

Normalizing flows are a powerful tool for generative modelling, density estimation and posterior reconstruction in Bayesian inverse problems.

Density Estimation
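
A generic toy example (not the paper's architecture) of the residual-flow principle underlying this line of work: a map $x \mapsto x + g(x)$ with $\mathrm{Lip}(g) < 1$ is invertible, and the inverse can be computed by Banach fixed-point iteration.

```python
import numpy as np

def g(x):
    # contractive residual branch: Lipschitz constant 0.5 < 1
    return 0.5 * np.tanh(x)

def forward(x):
    return x + g(x)

def inverse(y, n_iter=50):
    # fixed-point iteration x_{k+1} = y - g(x_k); converges since Lip(g) < 1,
    # with error shrinking by at least a factor 0.5 per step
    x = y.copy()
    for _ in range(n_iter):
        x = y - g(x)
    return x

x = np.linspace(-3.0, 3.0, 7)
x_rec = inverse(forward(x))
print(np.max(np.abs(x - x_rec)))  # ~0 (machine precision)
```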

PatchNR: Learning from Very Few Images by Patch Normalizing Flow Regularization

1 code implementation • 24 May 2022 • Fabian Altekrüger, Alexander Denker, Paul Hagemann, Johannes Hertrich, Peter Maass, Gabriele Steidl

Learning neural networks using only a small amount of available information is an important ongoing research topic with tremendous potential for applications.

Computed Tomography (CT)

WPPNets and WPPFlows: The Power of Wasserstein Patch Priors for Superresolution

1 code implementation • 20 Jan 2022 • Fabian Altekrüger, Johannes Hertrich

Exploiting image patches instead of whole images has proved to be a powerful approach to tackle various problems in image processing.

Uncertainty Quantification

Generalized Normalizing Flows via Markov Chains

1 code implementation • 24 Nov 2021 • Paul Hagemann, Johannes Hertrich, Gabriele Steidl

Normalizing flows, diffusion normalizing flows and variational autoencoders are powerful generative models.
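
All three model families rely on the change-of-variables formula for densities. A minimal sketch (my own illustration, not from the paper) with a single affine flow layer pushing a standard normal base distribution:

```python
import numpy as np

def log_density_affine_flow(y, a, b):
    # y = a*x + b with x ~ N(0,1); change of variables gives
    # log p_Y(y) = log p_X((y - b) / a) - log|a|
    x = (y - b) / a
    return -0.5 * x**2 - 0.5 * np.log(2 * np.pi) - np.log(np.abs(a))

# the flow pushes N(0,1) forward to N(b, a^2); check against the closed form
y, a, b = 1.3, 2.0, 0.5
closed_form = -0.5 * ((y - b) / a) ** 2 - 0.5 * np.log(2 * np.pi * a**2)
print(np.isclose(log_density_affine_flow(y, a, b), closed_form))  # True
```

Stacking many such invertible layers, and interleaving them with stochastic sampling steps, leads to the generalized flows studied in the paper.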

Wasserstein Patch Prior for Image Superresolution

1 code implementation • 27 Sep 2021 • Johannes Hertrich, Antoine Houdard, Claudia Redenbach

The proposed regularizer penalizes the $W_2$-distance of the patch distribution of the reconstruction to the patch distribution of some reference image at different scales.

Stochastic Normalizing Flows for Inverse Problems: a Markov Chains Viewpoint

1 code implementation • 23 Sep 2021 • Paul Hagemann, Johannes Hertrich, Gabriele Steidl

To overcome topological constraints and improve the expressiveness of normalizing flow architectures, Wu, Köhler and Noé introduced stochastic normalizing flows which combine deterministic, learnable flow transformations with stochastic sampling methods.

Convolutional Proximal Neural Networks and Plug-and-Play Algorithms

1 code implementation • 4 Nov 2020 • Johannes Hertrich, Sebastian Neumayer, Gabriele Steidl

In this paper, we introduce convolutional proximal neural networks (cPNNs), which are by construction averaged operators.

Denoising
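
Averagedness matters because averaged (and in particular firmly nonexpansive) operators yield convergent plug-and-play iterations. A standard example, not the cPNN itself: soft-thresholding, the proximal operator of the scaled $\ell_1$-norm, is firmly nonexpansive.

```python
import numpy as np

def soft_threshold(x, lam):
    # proximal operator of lam * ||.||_1; firmly nonexpansive, hence 1/2-averaged
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

# numerical check of nonexpansiveness: ||T(x) - T(y)|| <= ||x - y||
rng = np.random.default_rng(2)
x, y = rng.normal(size=100), rng.normal(size=100)
lhs = np.linalg.norm(soft_threshold(x, 0.3) - soft_threshold(y, 0.3))
rhs = np.linalg.norm(x - y)
print(lhs <= rhs)  # True
```

cPNNs enforce an analogous property for learned convolutional denoisers by construction.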

PCA Reduced Gaussian Mixture Models with Applications in Superresolution

1 code implementation • 16 Sep 2020 • Johannes Hertrich, Dang Phuong Lan Nguyen, Jean-François Aujol, Dominique Bernard, Yannick Berthoumieu, Abdellatif Saadaldin, Gabriele Steidl

To learn the (low dimensional) parameters of the mixture model we propose an EM algorithm whose M-step requires the solution of constrained optimization problems.

Dimensionality Reduction
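
For orientation, here is a minimal EM loop for a two-component 1D Gaussian mixture with the standard closed-form M-step; this is purely illustrative, since the paper's M-step additionally solves constrained problems arising from the PCA reduction.

```python
import numpy as np

def em_gmm_1d(x, n_iter=100):
    # minimal EM for a two-component 1D Gaussian mixture (unconstrained case)
    mu = np.array([x.min(), x.max()])
    var = np.array([x.var(), x.var()])
    w = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: posterior responsibilities of each component for each point
        dens = w / np.sqrt(2 * np.pi * var) * np.exp(-((x[:, None] - mu) ** 2) / (2 * var))
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: closed-form weight, mean and variance updates
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

rng = np.random.default_rng(3)
x = np.concatenate([rng.normal(-2, 0.5, 300), rng.normal(3, 0.7, 700)])
w, mu, var = em_gmm_1d(x)  # recovers weights ~(0.3, 0.7) and means ~(-2, 3)
```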
