Search Results for author: Gabriele Steidl

Found 22 papers, 12 papers with code

Conditional Wasserstein Distances with Applications in Bayesian OT Flow Matching

no code implementations27 Mar 2024 Jannis Chemseddine, Paul Hagemann, Christian Wald, Gabriele Steidl

In inverse problems, many conditional generative models approximate the posterior measure by minimizing a distance between the joint measure and its learned approximation.

Conditional Image Generation

Transfer Operators from Batches of Unpaired Points via Entropic Transport Kernels

no code implementations13 Feb 2024 Florian Beier, Hancheng Bi, Clément Sarrazin, Bernhard Schmitzer, Gabriele Steidl

In this paper, we are concerned with estimating the joint probability of random variables $X$ and $Y$, given $N$ independent observation blocks $(\boldsymbol{x}^i,\boldsymbol{y}^i)$, $i=1,\ldots, N$, each of $M$ samples $(\boldsymbol{x}^i,\boldsymbol{y}^i) = \bigl((x^i_j, y^i_{\sigma^i(j)}) \bigr)_{j=1}^M$, where $\sigma^i$ denotes an unknown permutation of i.i.d.

Wasserstein Gradient Flows for Moreau Envelopes of f-Divergences in Reproducing Kernel Hilbert Spaces

1 code implementation7 Feb 2024 Sebastian Neumayer, Viktor Stein, Gabriele Steidl, Nicolaj Rux

In this paper, we use the so-called kernel mean embedding to show that the corresponding regularization can be rewritten as the Moreau envelope of some function in the reproducing kernel Hilbert space associated with $K$.
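As a reminder of the object involved, the Moreau envelope of a function can be approximated numerically; for $f = |\cdot|$ it reproduces the Huber function. This is a generic grid-search illustration, unrelated to the paper's RKHS construction, and the grid bounds are an assumption:

```python
import numpy as np

def moreau_envelope(f, x, lam=1.0):
    """Moreau envelope f_lam(x) = min_y f(y) + (x - y)^2 / (2*lam),
    approximated by a dense grid search around x (illustration only)."""
    ys = np.linspace(x - 5.0, x + 5.0, 20001)  # assumes the minimizer lies here
    return float(np.min(f(ys) + (x - ys) ** 2 / (2.0 * lam)))
```

For $f = |\cdot|$ the envelope equals $|x| - \lambda/2$ when $|x| \ge \lambda$ and $x^2/(2\lambda)$ otherwise.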

Mixed Noise and Posterior Estimation with Conditional DeepGEM

1 code implementation5 Feb 2024 Paul Hagemann, Johannes Hertrich, Maren Casfor, Sebastian Heidenreich, Gabriele Steidl

Motivated by indirect measurements and applications from nanometrology with a mixed noise model, we develop a novel algorithm for jointly estimating the posterior and the noise parameters in Bayesian inverse problems.

Learning from small data sets: Patch-based regularizers in inverse problems for image reconstruction

no code implementations27 Dec 2023 Moritz Piening, Fabian Altekrüger, Johannes Hertrich, Paul Hagemann, Andrea Walther, Gabriele Steidl

The solution of inverse problems is of fundamental interest in medical and astronomical imaging, geophysics as well as engineering and life sciences.

Geophysics Image Reconstruction +2

Posterior Sampling Based on Gradient Flows of the MMD with Negative Distance Kernel

1 code implementation4 Oct 2023 Paul Hagemann, Johannes Hertrich, Fabian Altekrüger, Robert Beinert, Jannis Chemseddine, Gabriele Steidl

We propose conditional flows of the maximum mean discrepancy (MMD) with the negative distance kernel for posterior sampling and conditional generative modeling.

Conditional Image Generation
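The discrepancy itself is easy to state: for the negative distance kernel $K(a,b) = -|a-b|$, the squared MMD between two empirical measures takes a closed form. Below is a minimal 1-D sketch with uniform weights, not the conditional-flow construction of the paper:

```python
import numpy as np

def mmd2_neg_distance(x, y):
    """Squared MMD between 1-D samples with kernel K(a, b) = -|a - b|
    (uniform weights); coincides with the biased energy distance."""
    k = lambda a, b: -np.abs(a[:, None] - b[None, :])
    return k(x, x).mean() + k(y, y).mean() - 2.0 * k(x, y).mean()
```

The value is zero for identical samples and strictly positive once the two samples differ.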

Conditional Generative Models are Provably Robust: Pointwise Guarantees for Bayesian Inverse Problems

no code implementations28 Mar 2023 Fabian Altekrüger, Paul Hagemann, Gabriele Steidl

Conditional generative models have become a very powerful tool for sampling from the posteriors of Bayesian inverse problems.

Multilevel Diffusion: Infinite Dimensional Score-Based Diffusion Models for Image Generation

1 code implementation8 Mar 2023 Paul Hagemann, Sophie Mildenberger, Lars Ruthotto, Gabriele Steidl, Nicole Tianjiao Yang

We thereby intend to obtain diffusion models that generalize across different resolution levels and improve the efficiency of the training process.

Image Generation

Neural Wasserstein Gradient Flows for Maximum Mean Discrepancies with Riesz Kernels

1 code implementation27 Jan 2023 Fabian Altekrüger, Johannes Hertrich, Gabriele Steidl

Wasserstein gradient flows of maximum mean discrepancy (MMD) functionals with non-smooth Riesz kernels show a rich structure as singular measures can become absolutely continuous ones and conversely.
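A crude way to see such a flow numerically is explicit Euler on interacting particles: each particle follows the negative gradient of the MMD functional with the Riesz kernel $K(a,b) = -|a-b|$. This is a 1-D toy discretization with uniform weights; the step size is an illustrative assumption, and it is not the paper's neural parametrization:

```python
import numpy as np

def mmd_flow_step(x, y, tau=0.1):
    """One explicit Euler step for particles x driven toward target
    particles y by the gradient of F(x) = 0.5*MMD^2 with K(a,b) = -|a-b|."""
    n, m = len(x), len(y)
    interaction = np.sign(x[:, None] - x[None, :]).sum(axis=1) / n**2
    attraction = np.sign(x[:, None] - y[None, :]).sum(axis=1) / (n * m)
    grad = -interaction + attraction        # dF/dx_i, using d|t|/dt = sign(t)
    return x - tau * grad
```

Iterating the step transports the particle cloud toward the target sample; the repulsive `interaction` term keeps particles spread out.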

PatchNR: Learning from Very Few Images by Patch Normalizing Flow Regularization

1 code implementation24 May 2022 Fabian Altekrüger, Alexander Denker, Paul Hagemann, Johannes Hertrich, Peter Maass, Gabriele Steidl

Learning neural networks from only little available information is an important ongoing research topic with tremendous potential for applications.

Computed Tomography (CT)

Generalized Normalizing Flows via Markov Chains

1 code implementation24 Nov 2021 Paul Hagemann, Johannes Hertrich, Gabriele Steidl

Normalizing flows, diffusion normalizing flows and variational autoencoders are powerful generative models.
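The common ingredient of these models is the change-of-variables formula for densities; the simplest possible "flow" makes it concrete. A one-layer affine toy, purely illustrative:

```python
import numpy as np

def affine_flow_logpdf(x, a, b):
    """Log-density of x = a*z + b with latent z ~ N(0, 1), via change of
    variables: log p(x) = log N(z; 0, 1) - log|a|, with z = (x - b) / a."""
    z = (x - b) / a                          # inverse of the flow map
    return -0.5 * z**2 - 0.5 * np.log(2 * np.pi) - np.log(abs(a))
```

Real normalizing flows stack many such invertible layers with learned parameters and accumulate the log-Jacobian terms.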

Stochastic Normalizing Flows for Inverse Problems: a Markov Chains Viewpoint

1 code implementation23 Sep 2021 Paul Hagemann, Johannes Hertrich, Gabriele Steidl

To overcome topological constraints and improve the expressiveness of normalizing flow architectures, Wu, Köhler and Noé introduced stochastic normalizing flows which combine deterministic, learnable flow transformations with stochastic sampling methods.

Invertible Neural Networks versus MCMC for Posterior Reconstruction in Grazing Incidence X-Ray Fluorescence

no code implementations5 Feb 2021 Anna Andrle, Nando Farchmin, Paul Hagemann, Sebastian Heidenreich, Victor Soltwisch, Gabriele Steidl

Grazing incidence X-ray fluorescence is a non-destructive technique for analyzing the geometry and compositional parameters of nanostructures appearing, e.g., in computer chips.

Super-Resolution for Doubly-Dispersive Channel Estimation

no code implementations27 Jan 2021 Robert Beinert, Peter Jung, Gabriele Steidl, Tom Szollmann

In this work we consider the problem of identification and reconstruction of doubly-dispersive channel operators which are given by finite linear combinations of time-frequency shifts.

Super-Resolution Information Theory Numerical Analysis 47A62, 65R30, 65T99, 94A20

Convolutional Proximal Neural Networks and Plug-and-Play Algorithms

1 code implementation4 Nov 2020 Johannes Hertrich, Sebastian Neumayer, Gabriele Steidl

In this paper, we introduce convolutional proximal neural networks (cPNNs), which are by construction averaged operators.

Denoising

PCA Reduced Gaussian Mixture Models with Applications in Superresolution

1 code implementation16 Sep 2020 Johannes Hertrich, Dang Phuong Lan Nguyen, Jean-François Aujol, Dominique Bernard, Yannick Berthoumieu, Abdellatif Saadaldin, Gabriele Steidl

To learn the (low dimensional) parameters of the mixture model we propose an EM algorithm whose M-step requires the solution of constrained optimization problems.

Dimensionality Reduction
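The surrounding EM machinery is standard; a textbook 1-D version with closed-form M-step is sketched below. The paper's constrained, PCA-reduced M-step is not implemented here, and the quantile initialization is an assumption of this sketch:

```python
import numpy as np

def em_gmm_1d(x, k=2, iters=100):
    """Plain EM for a 1-D Gaussian mixture with closed-form M-step
    (the paper's constrained, PCA-reduced M-step is omitted)."""
    mu = np.quantile(x, (np.arange(k) + 0.5) / k)  # deterministic init
    var = np.full(k, x.var())
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities r[i, j] = P(component j | x_i)
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = w * dens
        r /= r.sum(axis=1, keepdims=True)
        # M-step: closed-form weight/mean/variance updates
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var
```

On well-separated data the means converge to the cluster centers within a few dozen iterations.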

Linkage between piecewise constant Mumford-Shah model and ROF model and its virtue in image segmentation

no code implementations26 Jul 2018 Xiaohao Cai, Raymond Chan, Carola-Bibiane Schönlieb, Gabriele Steidl, Tieyong Zeng

The piecewise constant Mumford-Shah (PCMS) model and the Rudin-Osher-Fatemi (ROF) model are two important variational models in image segmentation and image restoration, respectively.

Image Restoration Image Segmentation +3

A Second Order Non-Smooth Variational Model for Restoring Manifold-Valued Images

1 code implementation8 Jun 2015 Miroslav Bačák, Ronny Bergmann, Gabriele Steidl, Andreas Weinmann

We introduce a new non-smooth variational model for the restoration of manifold-valued data which includes second order differences in the regularization term.

Numerical Analysis 65K10, 49Q99, 49M37

First order algorithms in variational image processing

no code implementations13 Dec 2014 Martin Burger, Alex Sawatzky, Gabriele Steidl

Variational methods in imaging have developed into a quite universal and flexible tool, allowing for highly successful approaches to tasks like denoising, deblurring, inpainting, segmentation, super-resolution, disparity, and optical flow estimation.

Deblurring Denoising +2
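A prototypical member of this algorithm family is the proximal-gradient method; for $\ell_1$-regularized least squares it reduces to iterative soft-thresholding (ISTA). This is a generic textbook sketch, not tied to any specific model in the survey:

```python
import numpy as np

def ista(A, b, lam=0.1, iters=500):
    """Proximal gradient (ISTA) for min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2           # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        z = x - A.T @ (A @ x - b) / L       # gradient step on the smooth part
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # prox of lam*||.||_1
    return x
```

With `A` the identity, one step already lands on the soft-thresholded fixed point, which makes the method easy to sanity-check.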
