no code implementations • 11 Jun 2015 • Tim Conrad, Martin Genzel, Nada Cvetkovic, Niklas Wulkow, Alexander Leichtle, Jan Vybiral, Gitta Kutyniok, Christof Schütte
Results: We present a new algorithm, Sparse Proteomics Analysis (SPA), based on the theory of compressed sensing, that allows us to identify a minimal discriminating set of features from mass spectrometry datasets.
no code implementations • 31 Aug 2016 • Martin Genzel, Gitta Kutyniok
In this paper, we study the challenge of feature selection based on a relatively small collection of sample pairs $\{(x_i, y_i)\}_{1 \leq i \leq m}$.
no code implementations • 20 Aug 2018 • Martin Genzel, Gitta Kutyniok
We study the estimation capacity of the generalized Lasso, i.e., least squares minimization combined with a (convex) structural constraint.
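The generalized Lasso described above can be sketched numerically. The following is a minimal illustration, not the paper's method: it solves least squares over an $\ell^1$-ball (one common convex structural constraint) by projected gradient descent, on hypothetical synthetic data.

```python
import numpy as np

def project_l1_ball(v, radius):
    """Euclidean projection of v onto the l1-ball of the given radius."""
    if np.abs(v).sum() <= radius:
        return v
    u = np.sort(np.abs(v))[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - radius))[0][-1]
    theta = (css[rho] - radius) / (rho + 1.0)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def generalized_lasso(A, y, radius, n_iter=500):
    """Least squares over the l1-ball, solved via projected gradient descent."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # inverse Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)
        x = project_l1_ball(x - step * grad, radius)
    return x

# Hypothetical compressed-sensing setup: 50 Gaussian measurements of a 5-sparse signal.
rng = np.random.default_rng(0)
m, n = 50, 100
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[:5] = 1.0
y = A @ x_true
x_hat = generalized_lasso(A, y, radius=np.abs(x_true).sum())
print(np.linalg.norm(A @ x_hat - y))  # residual norm after optimization
```

Any convex constraint set can replace the $\ell^1$-ball here by swapping in the corresponding projection operator.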
no code implementations • 11 Apr 2020 • Martin Genzel, Christian Kipp
This work performs a non-asymptotic analysis of the generalized Lasso under the assumption of sub-exponential data.
no code implementations • 19 Sep 2020 • Martin Genzel, Alexander Stollenwerk
An important characteristic of associated guarantees is uniformity, i.e., recovery succeeds for an entire class of structured signals with a fixed measurement ensemble.
Information Theory • Statistics Theory
1 code implementation • 9 Nov 2020 • Martin Genzel, Jan Macdonald, Maximilian März
In the past five years, deep learning methods have become state-of-the-art in solving various inverse problems.
1 code implementation • 1 Jun 2021 • Martin Genzel, Jan Macdonald, Maximilian März
This report is dedicated to a short motivation and description of our contribution to the AAPM DL-Sparse-View CT Challenge (team name: "robust-and-stable").
no code implementations • NeurIPS 2023 • Sjoerd Dirksen, Martin Genzel, Laurent Jacques, Alexander Stollenwerk
Neural networks with random weights appear in a variety of machine learning applications, most prominently as the initialization of many deep learning algorithms and as a computationally cheap alternative to fully learned neural networks.
no code implementations • NeurIPS Workshop Deep_Invers 2021 • Martin Genzel, Ingo Gühring, Jan Macdonald, Maximilian März
This work presents an empirical study on the design and training of iterative neural networks for image reconstruction from tomographic measurements with unknown geometry.
no code implementations • NeurIPS Workshop Deep_Invers 2021 • Jonathan Sauder, Martin Genzel, Peter Jung
Countless signal processing applications include the reconstruction of an unknown signal from very few indirect linear measurements.
1 code implementation • 7 Feb 2022 • Jonathan Sauder, Martin Genzel, Peter Jung
Countless signal processing applications include the reconstruction of signals from few indirect linear measurements.
1 code implementation • 14 Jun 2022 • Martin Genzel, Ingo Gühring, Jan Macdonald, Maximilian März
This work is concerned with the following fundamental question in scientific machine learning: Can deep-learning-based methods solve noise-free inverse problems to near-perfect accuracy?
1 code implementation • 18 Nov 2022 • Theophil Trippe, Martin Genzel, Jan Macdonald, Maximilian März
This work presents a novel deep-learning-based pipeline for the inverse problem of image deblurring, leveraging augmentation and pre-training with synthetic data.
1 code implementation • NeurIPS 2023 • Julien Siems, Konstantin Ditschuneit, Winfried Ripken, Alma Lindborg, Maximilian Schambach, Johannes S. Otterbach, Martin Genzel
Generalized Additive Models (GAMs) have recently experienced a resurgence in popularity due to their interpretability, which arises from expressing the target value as a sum of non-linear transformations of the features.
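The additive structure described above, $y \approx \sum_j f_j(x_j)$, can be sketched with a toy fitting procedure. The following is an illustrative assumption, not the paper's algorithm: it fits a GAM by classical backfitting, with each shape function $f_j$ a low-degree polynomial in feature $j$, on hypothetical synthetic data.

```python
import numpy as np

def fit_gam_backfitting(X, y, degree=3, n_rounds=20):
    """Toy GAM via backfitting: each shape function f_j is a polynomial in feature j."""
    n, d = X.shape
    intercept = y.mean()
    coefs = [np.zeros(degree + 1) for _ in range(d)]
    for _ in range(n_rounds):
        for j in range(d):
            # Partial residual: subtract the intercept and all other features' contributions.
            partial = y - intercept - sum(
                np.polyval(coefs[k], X[:, k]) for k in range(d) if k != j
            )
            coefs[j] = np.polyfit(X[:, j], partial, degree)
    return intercept, coefs

def gam_predict(intercept, coefs, X):
    """Sum the fitted per-feature shape functions -- this is what makes GAMs interpretable."""
    return intercept + sum(np.polyval(c, X[:, j]) for j, c in enumerate(coefs))

# Hypothetical additive ground truth in two features.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(400, 2))
y = np.sin(np.pi * X[:, 0]) + X[:, 1] ** 2
b0, cs = fit_gam_backfitting(X, y)
print(np.mean((gam_predict(b0, cs, X) - y) ** 2))  # training MSE
```

Because the prediction decomposes feature by feature, each fitted $f_j$ can be plotted on its own, which is the source of the interpretability the abstract refers to.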
1 code implementation • 30 Sep 2023 • Sjoerd Dirksen, Patrick Finke, Martin Genzel
In practice, deep neural networks are often able to easily interpolate their training data.
no code implementations • 19 Nov 2023 • Felix Pieper, Konstantin Ditschuneit, Martin Genzel, Alexandra Lindt, Johannes Otterbach
Self-supervised learning for time-series data holds potential similar to that recently unleashed in Natural Language Processing and Computer Vision.