1 code implementation • NeurIPS 2023 • Christian Kümmerle, Johannes Maly
We prove locally quadratic convergence of the iterates to a simultaneously structured data matrix in a regime of minimal sample complexity (up to constants and a logarithmic factor), which is known to be impossible for a combination of convex surrogates.
no code implementations • 7 Sep 2022 • Johannes Maly, Rayan Saab
In this short note, we propose a new method for quantizing the weights of a fully trained neural network.
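The abstract excerpt above does not spell out the quantization scheme itself. As a point of reference only, here is a minimal sketch of a baseline round-to-nearest uniform weight quantizer in NumPy; the bit width and layer shape are illustrative assumptions, and this is not the method proposed in the note.

```python
import numpy as np

def uniform_quantize(w, n_bits=4):
    """Round each weight to the nearest point of a uniform grid spanning
    the range of w (a baseline quantizer, not the note's proposed method)."""
    levels = 2 ** n_bits
    lo, hi = w.min(), w.max()
    step = (hi - lo) / (levels - 1)
    return lo + np.round((w - lo) / step) * step

rng = np.random.default_rng(0)
W = rng.standard_normal((256, 128))        # stand-in for one trained layer's weights
W_q = uniform_quantize(W, n_bits=4)
print("max per-weight error:", np.abs(W - W_q).max())   # at most step / 2
```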
no code implementations • 21 Dec 2021 • Hung-Hsu Chou, Johannes Maly, Holger Rauhut
In deep learning it is common to overparameterize neural networks, that is, to use more parameters than training samples.
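A concrete (illustrative) instance of this overparameterized regime: a single-hidden-layer dense network on MNIST-sized inputs already has roughly 13 times more parameters than the 60,000 training images.

```python
# Parameter count of a dense network 784 -> 1000 -> 10 (weights + biases)
n_in, n_hidden, n_out = 784, 1000, 10
n_params = (n_in * n_hidden + n_hidden) + (n_hidden * n_out + n_out)
n_train = 60_000                               # MNIST training set size
print(n_params, n_train, n_params / n_train)   # 795010, 60000, ~13x
```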
no code implementations • 9 Mar 2021 • Johannes Maly
We consider the problem of recovering an unknown low-rank matrix $X$ with a (possibly) non-orthogonal, effectively sparse rank-1 decomposition from measurements $y$ gathered in a linear measurement process $A$. (A toy instance of this measurement model is sketched after this entry.)
Information Theory
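A minimal sketch of this measurement setup, under illustrative assumptions: exactly sparse (rather than effectively sparse) rank-1 factors, a Gaussian realization of the measurement process $A$, and arbitrarily chosen dimensions.

```python
import numpy as np

rng = np.random.default_rng(1)
n1, n2, R, m = 40, 30, 2, 400                 # illustrative sizes

# X = sum of R rank-1 terms u_r v_r^T with sparse (non-orthogonal) factors
X = np.zeros((n1, n2))
for _ in range(R):
    u = np.zeros(n1); u[rng.choice(n1, size=5, replace=False)] = rng.standard_normal(5)
    v = np.zeros(n2); v[rng.choice(n2, size=5, replace=False)] = rng.standard_normal(5)
    X += np.outer(u, v)

# Linear measurement process: y_i = <A_i, X>, realized as a Gaussian map on vec(X)
A = rng.standard_normal((m, n1 * n2)) / np.sqrt(m)
y = A @ X.ravel()
print(y.shape)    # (400,) measurements of a 40 x 30 matrix
```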
no code implementations • 27 Nov 2020 • Hung-Hsu Chou, Carsten Gieshoff, Johannes Maly, Holger Rauhut
This suggests that deep learning prefers trajectories whose complexity (measured in terms of effective rank) is monotonically increasing, which we believe is a fundamental concept for the theoretical understanding of deep learning.
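One common way to make "effective rank" concrete is the entropy-based notion of Roy and Vetterli: the exponential of the Shannon entropy of the normalized singular values. The paper may use a related but different quantity, so the sketch below is illustrative only.

```python
import numpy as np

def effective_rank(M, eps=1e-12):
    """exp of the Shannon entropy of the normalized singular values
    (Roy & Vetterli); the paper's exact notion of effective rank may differ."""
    s = np.linalg.svd(M, compute_uv=False)
    p = s / (s.sum() + eps)
    return float(np.exp(-(p * np.log(p + eps)).sum()))

rng = np.random.default_rng(0)
rank1 = np.outer(rng.standard_normal(50), rng.standard_normal(50))
print(effective_rank(rank1))                          # close to 1
print(effective_rank(rng.standard_normal((50, 50))))  # much larger, grows with size
```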
no code implementations • 18 Nov 2019 • Hans Christian Jung, Johannes Maly, Lars Palzer, Alexander Stollenwerk
This work is concerned with the problem of recovering high-dimensional signals $\mathbf{x} \in \mathbb{R}^n$ which belong to a convex set of low complexity from a small number of quantized measurements. (A generic quantized-measurement model is sketched after this entry.)
Information Theory Probability 62B10 G.3
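The abstract excerpt does not specify the quantizer. As an illustration, the sketch below produces dithered uniform scalar quantizations of Gaussian linear measurements of a sparse signal, a standard model in quantized compressed sensing; the quantizer, dither, and dimensions are assumptions, not necessarily those analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
n, m, s, delta = 200, 100, 5, 0.5             # dimension, measurements, sparsity, step

x = np.zeros(n)
x[rng.choice(n, size=s, replace=False)] = rng.standard_normal(s)   # low-complexity signal

A = rng.standard_normal((m, n)) / np.sqrt(m)
dither = rng.uniform(0, delta, size=m)          # uniform dither
q = delta * np.floor((A @ x + dither) / delta)  # quantized measurements on a delta-grid
print(q[:5])
```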
no code implementations • 7 Aug 2019 • Zeljko Kereta, Johannes Maly, Valeriya Naumova
In this work we consider numerical efficiency and convergence rates for solvers of non-convex multi-penalty formulations when reconstructing sparse signals from noisy linear measurements. (A representative multi-penalty functional is written out after this entry.)
Information Theory
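For orientation, one representative non-convex multi-penalty functional splits the unknown into a sparse component $u$ and a noise-like component $v$ and penalizes them with different norms; the exact functional and solvers studied in the paper may differ, so the objective below is only a sketch.

```python
import numpy as np

def multi_penalty_objective(u, v, A, y, alpha, beta, p=0.5):
    """J(u, v) = ||A(u + v) - y||_2^2 + alpha * ||u||_p^p + beta * ||v||_2^2,
    a representative non-convex (p < 1) multi-penalty functional; the exact
    functional analyzed in the paper may differ."""
    residual = A @ (u + v) - y
    return (residual @ residual
            + alpha * np.sum(np.abs(u) ** p)
            + beta * (v @ v))

rng = np.random.default_rng(3)
A = rng.standard_normal((50, 120)) / np.sqrt(50)
y = rng.standard_normal(50)
print(multi_penalty_objective(np.zeros(120), np.zeros(120), A, y, alpha=0.1, beta=1.0))
```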
no code implementations • 17 Jul 2018 • Mark A. Iwen, Felix Krahmer, Sara Krause-Solberg, Johannes Maly
This paper studies the problem of recovering a signal from one-bit compressed sensing measurements under a manifold model; that is, assuming that the signal lies on or near a manifold of low intrinsic dimension. (A toy instance of this measurement model is sketched after this entry.)
Information Theory
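The one-bit measurement model itself is easy to state. The sketch below generates sign measurements of a point on a low-dimensional manifold, here a circle embedded in a random two-dimensional subspace of $\mathbb{R}^n$, chosen purely as a toy example of low intrinsic dimension.

```python
import numpy as np

rng = np.random.default_rng(4)
n, m = 100, 300

# A signal on a 1-dimensional manifold in R^n: a circle inside a random plane
basis, _ = np.linalg.qr(rng.standard_normal((n, 2)))
theta = rng.uniform(0, 2 * np.pi)
x = basis @ np.array([np.cos(theta), np.sin(theta)])

A = rng.standard_normal((m, n))
y = np.sign(A @ x)                      # one-bit compressed sensing measurements
print(y[:10])
```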
no code implementations • 9 May 2018 • Johannes Maly, Lars Palzer
A simple hard-thresholding operation is shown to be able to recover $L$ signals $\mathbf{x}_1, \dots, \mathbf{x}_L \in \mathbb{R}^n$ that share a common support of size $s$ from $m = \mathcal{O}(s)$ one-bit measurements per signal if $L \ge \log(en/s)$. (One natural instance of such an estimator is sketched after this entry.)
Information Theory
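One natural instance of such a hard-thresholding estimator, written here as an illustration and not necessarily the exact estimator analyzed in the paper: aggregate the backprojected one-bit measurements across the $L$ signals and keep the $s$ coordinates with the largest scores.

```python
import numpy as np

rng = np.random.default_rng(5)
n, s, L, m = 200, 4, 12, 40

support = rng.choice(n, size=s, replace=False)              # common support
X = np.zeros((n, L))
X[support] = rng.standard_normal((s, L))                    # L jointly sparse signals

A = [rng.standard_normal((m, n)) for _ in range(L)]
Y = [np.sign(A[l] @ X[:, l]) for l in range(L)]             # one-bit measurements

# Hard thresholding: aggregate backprojections, keep the s largest coordinates
score = sum(np.abs(A[l].T @ Y[l]) for l in range(L))
est_support = np.sort(np.argsort(score)[-s:])
print(np.sort(support), est_support)   # the two index sets typically coincide
```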
no code implementations • 18 Jan 2018 • Massimo Fornasier, Johannes Maly, Valeriya Naumova
By adapting the concept of the restricted isometry property from compressed sensing to our novel model class, we prove error bounds between global minimizers and ground truth, up to noise level, from a number of subgaussian measurements scaling as $R(s_1+s_2)$, up to log-factors in the dimension, and relative-to-diameter distortion. (A small numerical illustration of this scaling follows the entry.)
Numerical Analysis
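To put the $R(s_1+s_2)$ scaling in perspective (with illustrative numbers and unspecified constants), the structured measurement count is orders of magnitude smaller than the ambient number of matrix entries.

```python
import math

n1, n2 = 1000, 1000          # ambient matrix dimensions (illustrative)
R, s1, s2 = 3, 20, 20        # rank and sparsity levels of the factors

ambient = n1 * n2                                   # 1,000,000 unknowns
structured = R * (s1 + s2) * math.log(n1 * n2)      # R(s1+s2) up to log-factors
print(ambient, round(structured))                   # ~1e6 vs a few thousand
```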