Search Results for author: Johannes Maly

Found 10 papers, 1 paper with code

Recovering Simultaneously Structured Data via Non-Convex Iteratively Reweighted Least Squares

1 code implementation • NeurIPS 2023 • Christian Kümmerle, Johannes Maly

We prove locally quadratic convergence of the iterates to a simultaneously structured data matrix in a regime of minimal sample complexity (up to constants and a logarithmic factor), which is known to be impossible for a combination of convex surrogates.

A simple approach for quantizing neural networks

no code implementations • 7 Sep 2022 • Johannes Maly, Rayan Saab

In this short note, we propose a new method for quantizing the weights of a fully trained neural network.

Quantization
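For context, the simplest baseline in post-training weight quantization is memoryless round-to-nearest onto a uniform grid. The sketch below shows that baseline only; it is not necessarily the method proposed in the note, and the function name and `levels` parameter are ours.

```python
import numpy as np

def quantize_round_to_nearest(W, levels=16):
    """Round each weight to the nearest point of a uniform grid
    spanning [W.min(), W.max()]; a naive baseline, not the
    authors' proposed scheme."""
    delta = (W.max() - W.min()) / (levels - 1)  # grid step size
    q = np.round((W - W.min()) / delta)         # nearest grid index
    return W.min() + q * delta                  # map back to weights
```

By construction, every entry moves by at most half a grid step, so the elementwise quantization error is bounded by `delta / 2`.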

More is Less: Inducing Sparsity via Overparameterization

no code implementations • 21 Dec 2021 • Hung-Hsu Chou, Johannes Maly, Holger Rauhut

In deep learning it is common to overparameterize neural networks, that is, to use more parameters than training samples.

Compressive Sensing
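The sparsity-inducing effect of overparameterization can be reproduced in a few lines: reparameterize the unknown vector as x = u⊙u − v⊙v and run plain gradient descent on the least-squares loss from a small initialization; the iterates drift toward a sparse interpolant. This is a hedged sketch — the quadratic parameterization and all constants are our illustrative choices, not necessarily the paper's exact setup.

```python
import numpy as np

def gd_overparam(A, y, lr=0.01, n_iter=20000, alpha=1e-3):
    """Gradient descent on the overparameterized least-squares loss
    L(u, v) = 0.5 * ||A(u*u - v*v) - y||^2 from small init alpha.
    The implicit bias of this sketch drives x = u*u - v*v toward a
    sparse solution of A x = y."""
    n = A.shape[1]
    u = alpha * np.ones(n)
    v = alpha * np.ones(n)
    for _ in range(n_iter):
        x = u * u - v * v
        g = A.T @ (A @ x - y)                    # gradient w.r.t. x
        u, v = u - 2 * lr * g * u, v + 2 * lr * g * v
    return u * u - v * v
```

The smaller the initialization `alpha`, the closer the limit tends to be to a minimum-l1 interpolant, which is what makes the recovered vector sparse.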

Robust Sensing of Low-Rank Matrices with Non-Orthogonal Sparse Decomposition

no code implementations • 9 Mar 2021 • Johannes Maly

We consider the problem of recovering an unknown low-rank matrix X with (possibly) non-orthogonal, effectively sparse rank-1 decomposition from measurements y gathered in a linear measurement process A.

Information Theory

Gradient Descent for Deep Matrix Factorization: Dynamics and Implicit Bias towards Low Rank

no code implementations • 27 Nov 2020 • Hung-Hsu Chou, Carsten Gieshoff, Johannes Maly, Holger Rauhut

This suggests that deep learning prefers trajectories whose complexity (measured in terms of effective rank) is monotonically increasing, which we believe is a fundamental concept for the theoretical understanding of deep learning.

Denoising
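One common way to make "effective rank" concrete is the entropy-based definition of Roy and Vetterli: the exponential of the entropy of the normalized singular-value distribution. Whether this is the exact measure used in the paper is an assumption on our part, but it illustrates the quantity being tracked along the trajectory.

```python
import numpy as np

def effective_rank(W, eps=1e-12):
    """Effective rank of W as exp(H(p)), where p are the singular
    values normalized to sum to one (Roy-Vetterli definition;
    possibly not the paper's exact measure)."""
    s = np.linalg.svd(W, compute_uv=False)
    p = s / (s.sum() + eps)                     # singular-value profile
    H = -np.sum(p * np.log(p + eps))            # Shannon entropy
    return float(np.exp(H))
```

For a full-rank matrix with equal singular values this returns the ordinary rank, while for an approximately low-rank matrix it interpolates smoothly below it.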

Quantized Compressed Sensing by Rectified Linear Units

no code implementations • 18 Nov 2019 • Hans Christian Jung, Johannes Maly, Lars Palzer, Alexander Stollenwerk

This work is concerned with the problem of recovering high-dimensional signals $\mathbf{x} \in \mathbb{R}^n$ which belong to a convex set of low-complexity from a small number of quantized measurements.

Information Theory Probability 62B10 G.3

Computational approaches to non-convex, sparsity-inducing multi-penalty regularization

no code implementations • 7 Aug 2019 • Zeljko Kereta, Johannes Maly, Valeriya Naumova

In this work we consider numerical efficiency and convergence rates for solvers of non-convex multi-penalty formulations when reconstructing sparse signals from noisy linear measurements.

Information Theory

On Recovery Guarantees for One-Bit Compressed Sensing on Manifolds

no code implementations • 17 Jul 2018 • Mark A. Iwen, Felix Krahmer, Sara Krause-Solberg, Johannes Maly

This paper studies the problem of recovering a signal from one-bit compressed sensing measurements under a manifold model; that is, assuming that the signal lies on or near a manifold of low intrinsic dimension.

Information Theory

Analysis of Hard-Thresholding for Distributed Compressed Sensing with One-Bit Measurements

no code implementations • 9 May 2018 • Johannes Maly, Lars Palzer

A simple hard-thresholding operation is shown to be able to recover $L$ signals $\mathbf{x}_1,...,\mathbf{x}_L \in \mathbb{R}^n$ that share a common support of size $s$ from $m = \mathcal{O}(s)$ one-bit measurements per signal if $L \ge \log(en/s)$.

Information Theory
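A hedged sketch of the idea: back-project each one-bit measurement vector, pool the per-index energy across the $L$ signals, and keep the $s$ largest scores. The shared measurement matrix and squared-correlation aggregation below are our simplifications for illustration, not necessarily the paper's exact estimator.

```python
import numpy as np

def joint_support_estimate(A, Y, s):
    """Estimate a common support of size s from one-bit measurements
    Y[:, l] = sign(A @ X[:, l]) by hard-thresholding the aggregated
    back-projection energies (illustrative sketch only)."""
    corr = A.T @ Y                        # back-project each signal
    score = np.sum(corr ** 2, axis=1)     # pool energy across signals
    return np.sort(np.argsort(score)[-s:])  # indices of top-s scores
```

Pooling across signals is what drives the per-signal measurement count down: each back-projection is noisy, but the noise averages out over the shared support.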

Robust Recovery of Low-Rank Matrices with Non-Orthogonal Sparse Decomposition from Incomplete Measurements

no code implementations • 18 Jan 2018 • Massimo Fornasier, Johannes Maly, Valeriya Naumova

By adapting the concept of restricted isometry property from compressed sensing to our novel model class, we prove error bounds between global minimizers and ground truth, up to noise level, from a number of subgaussian measurements scaling as $R(s_1+s_2)$, up to log-factors in the dimension, and relative-to-diameter distortion.

Numerical Analysis
