1 code implementation • 18 Jul 2024 • Frederik Hoppe, Claudio Mayrink Verdun, Hannah Laus, Felix Krahmer, Holger Rauhut
We develop a new data-driven approach for uncertainty quantification (UQ) in regression that applies both to classical regression approaches, such as the LASSO, and to neural networks.
no code implementations • 18 Jul 2024 • Frederik Hoppe, Felix Krahmer, Claudio Mayrink Verdun, Marion Menzel, Holger Rauhut
One of the most promising solutions for uncertainty quantification in high-dimensional statistics is the debiased LASSO that relies on unconstrained $\ell_1$-minimization.
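The debiasing step behind this line of work can be sketched as follows: solve the LASSO, then add back a scaled residual correction u = b + M Xᵀ(y − Xb)/n, where M approximates the inverse covariance of the design. The sketch below is not the authors' implementation; it uses a plain ISTA loop for the LASSO and assumes an isotropic Gaussian design so that M = I.

```python
import numpy as np

def lasso_ista(X, y, lam, n_iter=500):
    """Solve min_b ||y - Xb||^2/(2n) + lam*||b||_1 via ISTA (minimal sketch)."""
    n, p = X.shape
    step = 1.0 / np.linalg.eigvalsh(X.T @ X / n).max()  # 1/Lipschitz constant
    b = np.zeros(p)
    for _ in range(n_iter):
        z = b - step * (X.T @ (X @ b - y) / n)          # gradient step
        b = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold
    return b

def debias(X, y, b_lasso, M=None):
    """Debiased LASSO: u = b + M X^T (y - Xb) / n; M = I assumes isotropic design."""
    n, p = X.shape
    if M is None:
        M = np.eye(p)
    return b_lasso + M @ X.T @ (y - X @ b_lasso) / n

rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]                             # sparse ground truth
y = X @ beta + 0.1 * rng.standard_normal(n)
b = lasso_ista(X, y, lam=0.1)
u = debias(X, y, b)                                     # bias-corrected estimate
```

The correction removes the shrinkage bias of the LASSO on the support, which is what makes the entrywise distribution of u tractable for confidence intervals.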
no code implementations • 18 Jul 2024 • Frederik Hoppe, Claudio Mayrink Verdun, Felix Krahmer, Marion I. Menzel, Holger Rauhut
In this paper, we present a method to improve the debiased estimator by sampling without replacement.
no code implementations • 18 Jun 2024 • Felix Krahmer, Anna Veselovska
A second class of methods that we discuss in detail is the class of error diffusion schemes, arguably among the most popular halftoning techniques due to their ability to work directly on a pixel grid and their ease of application.
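As a concrete instance of such a scheme, classic Floyd–Steinberg error diffusion thresholds each pixel in raster order and pushes the quantization error onto its unprocessed neighbors with fixed weights 7/16, 3/16, 5/16, 1/16. A minimal sketch (weights and traversal order are the standard textbook choices, not taken from the paper):

```python
import numpy as np

def floyd_steinberg(img):
    """Halftone a grayscale image in [0, 1] to a binary image by error diffusion."""
    out = img.astype(float).copy()
    h, w = out.shape
    for y in range(h):
        for x in range(w):
            old = out[y, x]
            new = 1.0 if old >= 0.5 else 0.0   # threshold current pixel
            out[y, x] = new
            err = old - new                     # diffuse error to neighbors ahead
            if x + 1 < w:
                out[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    out[y + 1, x - 1] += err * 3 / 16
                out[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    out[y + 1, x + 1] += err * 1 / 16
    return out

gray = np.full((32, 32), 0.5)      # constant mid-gray test image
halftone = floyd_steinberg(gray)   # binary output whose local mean tracks the input
```

Because the error is diffused rather than discarded, the local average of the binary output tracks the gray level of the input, which is exactly the property the paper's analysis quantifies.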
no code implementations • 14 Sep 2023 • Frederik Hoppe, Claudio Mayrink Verdun, Felix Krahmer, Hannah Laus, Holger Rauhut
Model-based deep learning solutions to inverse problems have attracted increasing attention in recent years as they bridge state-of-the-art numerical performance with interpretability.
no code implementations • 5 Aug 2023 • Stefan Bamberger, Reinhard Heckel, Felix Krahmer
Furthermore, we also consider the approximation of general positive homogeneous functions with neural networks.
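Positive homogeneity is natural here because a ReLU network without bias terms is itself positively homogeneous of degree one: f(λx) = λf(x) for λ > 0, since ReLU commutes with positive scaling. A small numerical check with hypothetical random weights (illustrative only, not a construction from the paper):

```python
import numpy as np

def relu_net(x, W1, W2):
    """Two-layer ReLU network without biases; positively homogeneous of degree 1."""
    return W2 @ np.maximum(W1 @ x, 0.0)

rng = np.random.default_rng(1)
W1 = rng.standard_normal((16, 8))
W2 = rng.standard_normal((4, 16))
x = rng.standard_normal(8)
lam = 3.7  # any positive scale
# f(lam * x) == lam * f(x): scaling passes through ReLU for lam > 0
out_scaled = relu_net(lam * x, W1, W2)
out_ref = lam * relu_net(x, W1, W2)
```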
no code implementations • 17 Mar 2023 • Julia Kostin, Felix Krahmer, Dominik Stöger
Reformulation of blind deconvolution as a low-rank recovery problem has led to multiple theoretical recovery guarantees in the past decade due to the success of the nuclear norm minimization heuristic.
no code implementations • 30 Dec 2022 • Maternus Herold, Anna Veselovska, Jonas Jehle, Felix Krahmer
Efficient surrogate modelling is a key requirement for uncertainty quantification in data-driven scenarios.
no code implementations • 10 May 2021 • Matthias Beckmann, Ayush Bhandari, Felix Krahmer
Taking a computational imaging approach to the HDR tomography problem, we here suggest a new model based on the Modulo Radon Transform (MRT), which we rigorously introduce and analyze.
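The folding operation underlying the MRT can be sketched as a centered modulo that maps values into [−λ, λ). Under the assumption that consecutive samples of a smooth signal differ by less than λ (an illustrative sufficient condition, not the paper's exact recovery guarantee), the folded samples can be unwrapped by re-folding first differences:

```python
import numpy as np

def fold(t, lam):
    """Centered modulo: map values into [-lam, lam)."""
    return np.mod(t + lam, 2.0 * lam) - lam

def unwrap(y, lam):
    """Recover a signal from folded samples, assuming |x[k+1] - x[k]| < lam
    and that the first sample is unfolded (|x[0]| < lam)."""
    d = fold(np.diff(y), lam)  # folded differences equal the true differences
    return np.concatenate(([y[0]], y[0] + np.cumsum(d)))

t = np.linspace(0.0, 2.0 * np.pi, 2000)
x = 3.0 * np.sin(t)            # amplitude well above the folding threshold
y = fold(x, lam=1.0)           # modulo measurements, bounded by lam
x_rec = unwrap(y, lam=1.0)     # exact recovery despite the limited range
```

This is the sense in which modulo measurements trade dynamic range for sampling density: the folded samples are bounded by λ, yet the full-range signal is recoverable.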
no code implementations • 28 Feb 2019 • Felix Krahmer, Dominik Stöger
We find that for both of these applications the dimension factors in the noise bounds are not an artifact of the proof; rather, the problems are intrinsically badly conditioned.
no code implementations • 17 Jul 2018 • Mark A. Iwen, Felix Krahmer, Sara Krause-Solberg, Johannes Maly
This paper studies the problem of recovering a signal from one-bit compressed sensing measurements under a manifold model; that is, assuming that the signal lies on or near a manifold of low intrinsic dimension.
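In its simplest form, a one-bit measurement keeps only the sign of a random linear projection, y = sign(Ax), so all scale information is lost. For Gaussian A, the normalized back-projection Aᵀy/m already aligns with the direction of x in expectation. A toy sketch of this (without the manifold structure the paper studies):

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 50, 5000
x = rng.standard_normal(n)
x /= np.linalg.norm(x)           # one-bit measurements lose scale; fix ||x|| = 1
A = rng.standard_normal((m, n))
y = np.sign(A @ x)               # one-bit compressed sensing measurements
x_hat = A.T @ y / m              # E[x_hat] = sqrt(2/pi) * x for Gaussian A
x_hat /= np.linalg.norm(x_hat)   # estimate the direction only
cos_sim = float(x_hat @ x)       # alignment with the true direction
```

Structural assumptions (sparsity, or here a low-dimensional manifold) are what allow the number of measurements m to be driven far below this naive Gaussian-cloud regime.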
no code implementations • 21 Jun 2018 • Michael Moeller, Otmar Loffeld, Juergen Gall, Felix Krahmer
The idea of compressed sensing is to exploit representations in suitable (overcomplete) dictionaries that allow one to recover signals far beyond the Nyquist rate, provided that they admit a sparse representation in the respective dictionary.
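A standard sketch of this synthesis-sparsity setup: the signal is sparse in a dictionary D, the measurements are y = Ax = ADc with c sparse, and a greedy solver such as orthogonal matching pursuit recovers c from far fewer measurements than the ambient dimension. The DCT dictionary and the dimensions below are illustrative choices, not taken from the paper.

```python
import numpy as np

def dct_matrix(N):
    """Orthonormal DCT-II basis (rows are basis vectors)."""
    k = np.arange(N)[:, None]
    j = np.arange(N)[None, :]
    D = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * j + 1) * k / (2 * N))
    D[0] /= np.sqrt(2.0)
    return D

def omp(B, y, s):
    """Orthogonal matching pursuit: greedily select s columns of B to explain y."""
    resid = y.copy()
    support = []
    c = np.zeros(B.shape[1])
    coef = np.zeros(0)
    for _ in range(s):
        idx = int(np.argmax(np.abs(B.T @ resid)))   # most correlated atom
        support.append(idx)
        coef, *_ = np.linalg.lstsq(B[:, support], y, rcond=None)
        resid = y - B[:, support] @ coef            # re-fit on chosen atoms
    c[support] = coef
    return c

rng = np.random.default_rng(3)
N, m, s = 64, 40, 3
D = dct_matrix(N).T                 # columns are DCT atoms (orthonormal dictionary)
c_true = np.zeros(N)
c_true[[3, 17, 40]] = [1.5, -2.0, 1.0]
x = D @ c_true                      # signal sparse in the dictionary
A = rng.standard_normal((m, N)) / np.sqrt(m)
y = A @ x                           # m << N compressive measurements
c_hat = omp(A @ D, y, s)            # recover sparse code from A @ D
x_hat = D @ c_hat
```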
no code implementations • 8 Oct 2012 • Felix Krahmer, Rachel Ward
For Fourier measurements and Haar wavelet sparsity, the local coherence can be controlled and bounded explicitly, so for matrices composed of frequencies sampled from a suitable inverse square power-law density, we can prove the restricted isometry property with near-optimal embedding dimensions.
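Concretely, each frequency k is drawn with probability proportional to an inverse power of |k|, and each sampled row of the Fourier matrix is rescaled by 1/√(m·p_k), so the preconditioned matrix satisfies E[A*A] = I. A 1D sketch with an illustrative density ∝ 1/max(|k|, 1) (the paper's precise density and constants differ):

```python
import numpy as np

N, m = 256, 64
rng = np.random.default_rng(4)
freqs = np.arange(N) - N // 2                       # centered frequencies
p = 1.0 / np.maximum(np.abs(freqs), 1)              # power-law density, heavier at low |k|
p /= p.sum()
rows = rng.choice(N, size=m, replace=True, p=p)     # draw frequencies from the density

# Unitary DFT matrix; A keeps the sampled rows, preconditioned so E[A* A] = I.
F = np.exp(-2j * np.pi * np.outer(freqs, np.arange(N)) / N) / np.sqrt(N)
A = F[rows] / np.sqrt(m * p[rows])[:, None]
```

The rescaling by 1/√(m·p_k) is the preconditioning step: it undoes the non-uniform sampling on average, which is what makes an RIP argument possible despite the density concentrating on low frequencies.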