1 code implementation • 26 Jun 2023 • Thomas Pinetz, Erich Kobler, Robert Haase, Katerina Deike-Hofmann, Alexander Radbruch, Alexander Effland
Our numerical experiments show that conditional GANs are suitable for generating images at different GBCA dose levels and can be used to augment datasets for virtual contrast models.
no code implementations • 21 Feb 2023 • Erich Kobler, Thomas Pock
In this paper, we propose a unified framework of denoising score-based models in the context of graduated non-convex energy minimization.
no code implementations • 16 Feb 2023 • Martin Zach, Thomas Pock, Erich Kobler, Antonin Chambolle
In this work we tackle the problem of estimating the density $f_X$ of a random variable $X$ by successive smoothing, such that the smoothed random variable $Y$ fulfills $(\partial_t - \Delta_1)f_Y(\,\cdot\,, t) = 0$, $f_Y(\,\cdot\,, 0) = f_X$.
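This smoothing has a well-known closed form (a sketch in the abstract's notation, written out for the one-dimensional Laplacian $\Delta_1$): the heat equation is solved by convolution with a Gaussian kernel,

```latex
f_Y(\,\cdot\,, t) = G_t * f_X, \qquad
G_t(x) = \frac{1}{\sqrt{4\pi t}} \exp\!\left(-\frac{x^2}{4t}\right),
```

so the smoothed variable can be realized as $Y = X + N_t$ with $N_t \sim \mathcal{N}(0, 2t)$ independent of $X$; that is, the density of $X$ corrupted by Gaussian noise of growing variance.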
1 code implementation • Computational Optimization and Applications 2022 • Pasquale Cascarano, Giorgia Franchini, Erich Kobler, Federica Porta, Andrea Sebastiani
Numerical results demonstrate the robustness of the proposed models with respect to image content, noise levels, and hyperparameters, on both denoising and deblurring of simulated as well as real natural and medical images.
no code implementations • 23 Mar 2022 • Martin Zach, Erich Kobler, Thomas Pock
We apply the regularizer to limited-angle and few-view CT reconstruction problems, where it outperforms traditional reconstruction algorithms by a large margin.
no code implementations • 12 Feb 2021 • Dominik Narnhofer, Alexander Effland, Erich Kobler, Kerstin Hammernik, Florian Knoll, Thomas Pock
To this end, we solve the linear inverse problem of undersampled MRI reconstruction in a variational setting.
no code implementations • 12 Nov 2020 • Thomas Pinetz, Erich Kobler, Thomas Pock, Alexander Effland
We propose a novel learning-based framework for image reconstruction particularly designed for training without ground truth data, which has three major building blocks: energy-based learning, a patch-based Wasserstein loss functional, and shared prior learning.
no code implementations • 30 Jun 2020 • Elena A. Kaye, Emily A. Aherne, Cihan Duzgol, Ida Häggström, Erich Kobler, Yousef Mazaheri, Maggie M. Fung, Zhigang Zhang, Ricardo Otazo, Herbert A. Vargas, Oguz Akin
Compared to the reference images, the denoised images received higher image quality scores (p < 0.0001).
1 code implementation • 15 Jun 2020 • Erich Kobler, Alexander Effland, Karl Kunisch, Thomas Pock
In this work, we combine the variational formulation of inverse problems with deep learning by introducing the data-driven general-purpose total deep variation regularizer.
1 code implementation • CVPR 2020 • Erich Kobler, Alexander Effland, Karl Kunisch, Thomas Pock
Diverse inverse problems in imaging can be cast as variational problems composed of a task-specific data fidelity term and a regularization term.
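The decomposition into a task-specific data fidelity term and a regularization term can be made concrete with a deliberately simple stand-in. The sketch below minimizes $\tfrac{1}{2}\|Ax - y\|^2 + \lambda\,\mathrm{TV}_\varepsilon(x)$ for a 1-D signal by plain gradient descent, using a smoothed total variation in place of the learned total deep variation regularizer from the paper; all function names and parameter values are illustrative, not the authors' implementation.

```python
import numpy as np

def tv_smooth(x, eps=0.1):
    # Smoothed total variation of a 1-D signal: sum of sqrt(d^2 + eps^2)
    # over finite differences d (a hand-crafted stand-in for a learned prior).
    d = np.diff(x)
    return np.sum(np.sqrt(d ** 2 + eps ** 2))

def tv_grad(x, eps=0.1):
    # Gradient of the smoothed TV term with respect to x.
    d = np.diff(x)
    w = d / np.sqrt(d ** 2 + eps ** 2)
    g = np.zeros_like(x)
    g[:-1] -= w
    g[1:] += w
    return g

def reconstruct(A, y, lam=0.2, step=0.05, iters=500):
    # Gradient descent on the variational objective
    #   0.5 * ||A x - y||^2  +  lam * TV_eps(x),
    # initialized from the adjoint back-projection A^T y.
    x = A.T @ y
    for _ in range(iters):
        grad = A.T @ (A @ x - y) + lam * tv_grad(x)
        x -= step * grad
    return x
```

For example, with `A` a row-subsampling operator that observes every second entry of a piecewise-constant signal, the TV term fills in the unobserved entries from their neighbors, while the fidelity term keeps the observed entries close to the measurements.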
no code implementations • 19 Jul 2019 • Alexander Effland, Erich Kobler, Karl Kunisch, Thomas Pock
We investigate a well-known phenomenon of variational approaches in image processing, where typically the best image quality is achieved when the gradient flow process is stopped before converging to a stationary point.
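The phenomenon can be reproduced with the simplest possible variational energy, the discrete Dirichlet energy, whose gradient flow is the heat equation (this toy replaces the learned energies studied in the paper; the setup is purely illustrative): smoothing a noisy signal first removes noise, then destroys signal, so the reconstruction error is minimized at an intermediate stopping time rather than at the stationary point.

```python
import numpy as np

def gradient_flow_denoise(y, step=0.2, iters=400):
    # Explicit gradient flow on the Dirichlet energy 0.5 * sum(diff(x)^2),
    # i.e. the discrete heat equation with Neumann boundaries.
    # Returns the full trajectory so intermediate iterates can be inspected.
    x = y.copy()
    path = [x.copy()]
    for _ in range(iters):
        d = np.diff(x)
        g = np.zeros_like(x)
        g[:-1] -= d   # gradient of the Dirichlet energy ...
        g[1:] += d    # ... assembled from the finite differences
        x = x - step * g
        path.append(x.copy())
    return path
```

Plotting the mean squared error to the clean signal along the trajectory shows the characteristic dip: the best iterate lies strictly in the interior, before convergence to the (constant) stationary point.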
2 code implementations • 3 Apr 2017 • Kerstin Hammernik, Teresa Klatzer, Erich Kobler, Michael P. Recht, Daniel K. Sodickson, Thomas Pock, Florian Knoll
Due to its high computational performance, i.e., a reconstruction time of 193 ms on a single graphics card, and the fact that no parameter tuning is required once the network is trained, this new approach to image reconstruction can easily be integrated into the clinical workflow.