no code implementations • 27 Mar 2024 • Jannis Chemseddine, Paul Hagemann, Christian Wald, Gabriele Steidl
In inverse problems, many conditional generative models approximate the posterior measure by minimizing a distance between the joint measure and its learned approximation.
no code implementations • 13 Feb 2024 • Florian Beier, Hancheng Bi, Clément Sarrazin, Bernhard Schmitzer, Gabriele Steidl
In this paper, we are concerned with estimating the joint probability of random variables $X$ and $Y$, given $N$ independent observation blocks $(\boldsymbol{x}^i,\boldsymbol{y}^i)$, $i=1,\ldots, N$, each of $M$ samples $(\boldsymbol{x}^i,\boldsymbol{y}^i) = \bigl((x^i_j, y^i_{\sigma^i(j)}) \bigr)_{j=1}^M$, where $\sigma^i$ denotes an unknown permutation of i.i.d.
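To illustrate the within-block re-pairing problem described above, here is a minimal sketch that recovers a hidden permutation in one block by brute-force search over all pairings; the data, the block size $M=5$, and the squared-distance cost are hypothetical choices, not the estimator proposed in the paper.

```python
import itertools
import numpy as np

def match_block(x, y):
    """Recover the hidden pairing inside one observation block by brute-force
    search over permutations (illustrative only; feasible for tiny M)."""
    best, best_cost = None, np.inf
    for perm in itertools.permutations(range(len(y))):
        cost = np.sum((x - y[list(perm)]) ** 2)
        if cost < best_cost:
            best, best_cost = list(perm), cost
    return best

x = np.arange(5, dtype=float)                    # hypothetical block of M = 5 samples
sigma = np.array([2, 0, 4, 1, 3])                # unknown shuffle of the y-samples
y = x[sigma] + 0.01 * np.array([1, -1, 1, -1, 1.0])
est = match_block(x, y)
assert np.allclose(x, y[est], atol=0.05)         # re-paired samples line up again
```

For realistic block sizes a brute-force search is infeasible; matching would instead be phrased as an assignment or optimal transport problem.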
1 code implementation • 7 Feb 2024 • Sebastian Neumayer, Viktor Stein, Gabriele Steidl, Nicolaj Rux
In this paper, we use the so-called kernel mean embedding to show that the corresponding regularization can be rewritten as the Moreau envelope of some function in the reproducing kernel Hilbert space associated with $K$.
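As background for the kernel mean embedding mentioned above, a minimal numpy sketch: the squared RKHS distance between the mean embeddings of two samples is the (squared) maximum mean discrepancy, computable from kernel evaluations alone. The Gaussian kernel and all parameters here are illustrative assumptions.

```python
import numpy as np

def gauss_kernel(x, y, sigma=1.0):
    """Gaussian kernel K(x, y) = exp(-|x - y|^2 / (2 sigma^2))."""
    return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2 * sigma**2))

def mmd2(x, y, sigma=1.0):
    """Squared RKHS distance between the empirical kernel mean embeddings
    of the samples x and y (squared maximum mean discrepancy)."""
    kxx = gauss_kernel(x, x, sigma).mean()
    kyy = gauss_kernel(y, y, sigma).mean()
    kxy = gauss_kernel(x, y, sigma).mean()
    return kxx + kyy - 2 * kxy

rng = np.random.default_rng(1)
a = rng.normal(0.0, 1.0, 200)
b = rng.normal(0.0, 1.0, 200)   # same law as a
c = rng.normal(3.0, 1.0, 200)   # shifted law
assert mmd2(a, b) < mmd2(a, c)  # closer laws -> closer mean embeddings
```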
1 code implementation • 5 Feb 2024 • Paul Hagemann, Johannes Hertrich, Maren Casfor, Sebastian Heidenreich, Gabriele Steidl
Motivated by indirect measurements and applications from nanometrology with a mixed noise model, we develop a novel algorithm for jointly estimating the posterior and the noise parameters in Bayesian inverse problems.
no code implementations • 25 Jan 2024 • Martin Hanik, Gabriele Steidl, Christoph von Tycowicz
We propose two graph neural network layers for graphs with features in a Riemannian manifold.
no code implementations • 27 Dec 2023 • Moritz Piening, Fabian Altekrüger, Johannes Hertrich, Paul Hagemann, Andrea Walther, Gabriele Steidl
The solution of inverse problems is of fundamental interest in medical and astronomical imaging, geophysics, as well as engineering and the life sciences.
1 code implementation • 4 Oct 2023 • Paul Hagemann, Johannes Hertrich, Fabian Altekrüger, Robert Beinert, Jannis Chemseddine, Gabriele Steidl
We propose conditional flows of the maximum mean discrepancy (MMD) with the negative distance kernel for posterior sampling and conditional generative modeling.
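The negative distance kernel named above admits a very simple empirical MMD estimate, which coincides with the energy distance between the empirical measures. A sketch under assumed toy data (this is the discrepancy, not the conditional flow model itself):

```python
import numpy as np

def neg_distance_mmd2(x, y):
    """Squared MMD with the negative distance kernel K(a, b) = -|a - b|,
    equal to the energy distance between the two empirical measures."""
    dxx = np.abs(x[:, None] - x[None, :]).mean()
    dyy = np.abs(y[:, None] - y[None, :]).mean()
    dxy = np.abs(x[:, None] - y[None, :]).mean()
    return 2 * dxy - dxx - dyy

rng = np.random.default_rng(7)
a = rng.normal(0, 1, 300)
b = rng.normal(0, 1, 300)   # same distribution as a
c = rng.normal(2, 1, 300)   # shifted distribution
assert neg_distance_mmd2(a, c) > neg_distance_mmd2(a, b)
```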
no code implementations • 28 Mar 2023 • Fabian Altekrüger, Paul Hagemann, Gabriele Steidl
Conditional generative models have become a very powerful tool for sampling from posteriors of Bayesian inverse problems.
1 code implementation • 8 Mar 2023 • Paul Hagemann, Sophie Mildenberger, Lars Ruthotto, Gabriele Steidl, Nicole Tianjiao Yang
We thereby intend to obtain diffusion models that generalize across different resolution levels and improve the efficiency of the training process.
1 code implementation • 27 Jan 2023 • Fabian Altekrüger, Johannes Hertrich, Gabriele Steidl
Wasserstein gradient flows of maximum mean discrepancy (MMD) functionals with non-smooth Riesz kernels show a rich structure as singular measures can become absolutely continuous ones and conversely.
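A particle discretization makes such MMD flows concrete: each particle moves along the negative gradient of the MMD functional with the Riesz kernel $K(a,b) = -|a-b|$ in 1D. The explicit Euler scheme, step size, and toy data below are illustrative assumptions, not the discretization analyzed in the paper.

```python
import numpy as np

def mmd_flow_step(x, y, tau=0.05):
    """One explicit Euler step of the particle gradient flow of the MMD
    functional with kernel K(a, b) = -|a - b| in 1D: particles x are
    attracted to the target samples y and repel each other."""
    n, m = len(x), len(y)
    repulsion = np.sign(x[:, None] - x[None, :]).sum(axis=1) / n
    attraction = np.sign(x[:, None] - y[None, :]).sum(axis=1) / m
    return x - tau * (attraction - repulsion)

rng = np.random.default_rng(2)
x = rng.normal(5.0, 0.1, 50)     # particles start far from the target
y = rng.normal(0.0, 1.0, 50)     # samples of the target measure
for _ in range(200):
    x = mmd_flow_step(x, y)
# the particle cloud has drifted into the bulk of the target
```

The `sign` terms reflect the non-smoothness of the Riesz kernel; a concentrated initial cloud spreads out, matching the singular-to-absolutely-continuous behavior mentioned above.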
1 code implementation • 24 May 2022 • Fabian Altekrüger, Alexander Denker, Paul Hagemann, Johannes Hertrich, Peter Maass, Gabriele Steidl
Learning neural networks from only a small amount of available data is an important ongoing research topic with tremendous potential for applications.
1 code implementation • 15 Apr 2022 • Philipp Flotho, Cosmas Heiss, Gabriele Steidl, Daniel J. Strauss
In this paper, we propose a novel approach for local Lagrangian motion magnification of facial micro-motions.
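The core idea of Lagrangian motion magnification can be sketched in a few lines: track points over time and amplify their displacement from a reference position. The global amplification below is a simplified stand-in for the local, facial-micro-motion approach of the paper; all coordinates are hypothetical.

```python
import numpy as np

def magnify_lagrangian(ref_pts, pts, alpha=4.0):
    """Amplify the displacement of tracked points from their reference
    positions by a factor alpha (simplified, global magnification)."""
    return ref_pts + alpha * (pts - ref_pts)

ref = np.array([[10.0, 10.0], [20.0, 5.0]])          # reference landmark positions
frame = ref + np.array([[0.2, -0.1], [0.05, 0.0]])   # subtle micro-motion
mag = magnify_lagrangian(ref, frame)
assert np.allclose(mag - ref, 4.0 * (frame - ref))   # displacements scaled by 4
```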
1 code implementation • 24 Nov 2021 • Paul Hagemann, Johannes Hertrich, Gabriele Steidl
Normalizing flows, diffusion normalizing flows and variational autoencoders are powerful generative models.
1 code implementation • 23 Sep 2021 • Paul Hagemann, Johannes Hertrich, Gabriele Steidl
To overcome topological constraints and improve the expressiveness of normalizing flow architectures, Wu, Köhler and Noé introduced stochastic normalizing flows which combine deterministic, learnable flow transformations with stochastic sampling methods.
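The interleaving of deterministic and stochastic layers can be sketched as follows: an invertible affine map (with tractable log-determinant) followed by unadjusted Langevin steps toward the target density. The target, step sizes, and layer parameters are illustrative assumptions, not a trained model.

```python
import numpy as np

rng = np.random.default_rng(3)

def affine_layer(x, scale, shift):
    """Deterministic invertible layer with closed-form log-determinant."""
    return scale * x + shift, np.log(np.abs(scale)) * np.ones_like(x)

def langevin_layer(x, grad_log_p, tau=0.05, steps=50):
    """Stochastic layer: unadjusted Langevin steps toward the target density;
    in stochastic normalizing flows such MCMC layers are interleaved with
    learned deterministic transforms (simplified sketch)."""
    for _ in range(steps):
        x = x + tau * grad_log_p(x) + np.sqrt(2 * tau) * rng.standard_normal(x.shape)
    return x

# toy target: standard normal, so grad log p(x) = -x
z = rng.standard_normal(2000) * 3.0 + 4.0        # poor initial samples
x, _ = affine_layer(z, scale=0.5, shift=-2.0)    # crude deterministic move
x = langevin_layer(x, lambda v: -v)              # stochastic refinement
```

The stochastic layers can correct what the deterministic layers cannot reach, e.g. when the latent and target topologies differ.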
no code implementations • 5 Feb 2021 • Anna Andrle, Nando Farchmin, Paul Hagemann, Sebastian Heidenreich, Victor Soltwisch, Gabriele Steidl
Grazing incidence X-ray fluorescence is a non-destructive technique for analyzing the geometry and compositional parameters of nanostructures appearing e.g. in computer chips.
no code implementations • 27 Jan 2021 • Robert Beinert, Peter Jung, Gabriele Steidl, Tom Szollmann
In this work we consider the problem of identification and reconstruction of doubly-dispersive channel operators which are given by finite linear combinations of time-frequency shifts.
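A discrete time-frequency shift is easy to write down, and a doubly-dispersive channel of the kind considered above acts as a finite linear combination of such shifts. The shift parameters and path coefficients below are hypothetical.

```python
import numpy as np

def tf_shift(x, k, l):
    """Discrete (cyclic) time-frequency shift: delay by k samples, then
    modulate with frequency index l."""
    n = len(x)
    return np.exp(2j * np.pi * l * np.arange(n) / n) * np.roll(x, k)

def channel(x):
    """Toy doubly-dispersive channel: two paths with hypothetical
    delays, Doppler indices, and coefficients."""
    return 0.8 * tf_shift(x, 1, 0) + 0.3 * tf_shift(x, 3, 2)

delta = np.zeros(8)
delta[0] = 1.0
out = channel(delta)     # each path shows up as a modulated, delayed spike
```

Identification then means recovering the delays, Doppler shifts, and coefficients of all paths from the channel's action on suitable probing signals.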
Super-Resolution Information Theory Numerical Analysis 47A62, 65R30, 65T99, 94A20
1 code implementation • 4 Nov 2020 • Johannes Hertrich, Sebastian Neumayer, Gabriele Steidl
In this paper, we introduce convolutional proximal neural networks (cPNNs), which are by construction averaged operators.
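A dense (non-convolutional) toy version of such a layer illustrates why the construction yields averaged, hence nonexpansive, operators: with orthonormal weights and a 1-Lipschitz activation, the map $x \mapsto W^\top \sigma(Wx + b)$ cannot expand distances. The dimensions and random weights are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

def pnn_layer(W, b):
    """Layer x -> W^T sigma(W x + b) with orthogonal W and 1-Lipschitz
    activation; such layers are averaged operators (dense sketch of the
    convolutional construction)."""
    sigma = lambda t: np.maximum(t, 0.0)   # ReLU is 1-Lipschitz
    return lambda x: W.T @ sigma(W @ x + b)

W, _ = np.linalg.qr(rng.standard_normal((8, 8)))   # orthogonal weight matrix
b = rng.standard_normal(8)
T = pnn_layer(W, b)

# averaged operators are in particular nonexpansive: |Tx - Ty| <= |x - y|
for _ in range(100):
    x, y = rng.standard_normal(8), rng.standard_normal(8)
    assert np.linalg.norm(T(x) - T(y)) <= np.linalg.norm(x - y) + 1e-9
```

This Lipschitz control is what makes such networks attractive as building blocks in provably convergent plug-and-play schemes.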
1 code implementation • 16 Sep 2020 • Johannes Hertrich, Dang-Phuong-Lan Nguyen, Jean-François Aujol, Dominique Bernard, Yannick Berthoumieu, Abdellatif Saadaldin, Gabriele Steidl
To learn the (low dimensional) parameters of the mixture model we propose an EM algorithm whose M-step requires the solution of constrained optimization problems.
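For contrast with the constrained M-step mentioned above, here is plain EM for a one-dimensional Gaussian mixture, where the M-step has closed-form updates; the toy data and initialization are illustrative assumptions.

```python
import numpy as np

def em_gmm_1d(x, k=2, iters=100):
    """Plain EM for a 1D Gaussian mixture with closed-form M-step; in the
    paper's PCA-reduced model the M-step instead requires the solution of
    constrained optimization problems."""
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))   # spread-out initialization
    var = np.full(k, x.var())
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each sample
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = pi * dens
        r /= r.sum(axis=1, keepdims=True)
        # M-step: closed-form maximum likelihood updates
        nk = r.sum(axis=0)
        pi, mu = nk / len(x), (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

rng = np.random.default_rng(5)
x = np.concatenate([rng.normal(-4, 1, 500), rng.normal(4, 1, 500)])
pi, mu, var = em_gmm_1d(x)
assert abs(np.sort(mu)[0] + 4) < 0.3 and abs(np.sort(mu)[1] - 4) < 0.3
```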
no code implementations • 26 Jul 2018 • Xiaohao Cai, Raymond Chan, Carola-Bibiane Schönlieb, Gabriele Steidl, Tieyong Zeng
The piecewise constant Mumford-Shah (PCMS) model and the Rudin-Osher-Fatemi (ROF) model are two important variational models in image segmentation and image restoration, respectively.
no code implementations • 28 Jul 2016 • Friederike Laus, Mila Nikolova, Johannes Persch, Gabriele Steidl
This paper is the first attempt to generalize this technique to manifold-valued images.
1 code implementation • 8 Jun 2015 • Miroslav Bačák, Ronny Bergmann, Gabriele Steidl, Andreas Weinmann
We introduce a new non-smooth variational model for the restoration of manifold-valued data which includes second order differences in the regularization term.
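A Euclidean stand-in conveys the role of second-order differences in the regularization term: below, gradient descent minimizes a smooth quadratic surrogate $\tfrac12\|u-f\|^2 + \tfrac{\lambda}{2}\|D_2 u\|^2$, where $D_2$ takes second differences. This is neither non-smooth nor manifold-valued, and all parameters are illustrative assumptions.

```python
import numpy as np

def denoise_second_order(f, lam=1.0, tau=0.05, iters=500):
    """Gradient descent on 0.5|u - f|^2 + (lam/2)|D2 u|^2, a smooth
    Euclidean surrogate of the non-smooth second-order model."""
    n = len(f)
    D2 = (np.diag(np.full(n, -2.0)) + np.diag(np.ones(n - 1), 1)
          + np.diag(np.ones(n - 1), -1))[1:-1]   # interior second differences
    u = f.copy()
    for _ in range(iters):
        u -= tau * ((u - f) + lam * D2.T @ (D2 @ u))
    return u

rng = np.random.default_rng(6)
t = np.linspace(0, 1, 100)
clean = t                                 # affine signal: zero second differences
noisy = clean + 0.1 * rng.standard_normal(100)
u = denoise_second_order(noisy)
assert np.mean((u - clean) ** 2) < np.mean((noisy - clean) ** 2)
```

Penalizing second instead of first differences avoids the staircasing of TV-type models on linearly varying data; carrying this over to manifold-valued data is the point of the paper.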
Numerical Analysis 65K10, 49Q99, 49M37
no code implementations • 13 Dec 2014 • Martin Burger, Alex Sawatzky, Gabriele Steidl
Variational methods in imaging are nowadays developing towards a quite universal and flexible tool, allowing for highly successful approaches on tasks like denoising, deblurring, inpainting, segmentation, super-resolution, disparity, and optical flow estimation.