Search Results for author: Christian Kümmerle

Found 8 papers, 5 papers with code

Recovering Simultaneously Structured Data via Non-Convex Iteratively Reweighted Least Squares

1 code implementation NeurIPS 2023 Christian Kümmerle, Johannes Maly

We prove locally quadratic convergence of the iterates to a simultaneously structured data matrix in a regime of minimal sample complexity (up to constants and a logarithmic factor), which is known to be impossible for a combination of convex surrogates.

On the Convergence of IRLS and Its Variants in Outlier-Robust Estimation

1 code implementation CVPR 2023 Liangzu Peng, Christian Kümmerle, René Vidal

Outlier-robust estimation involves estimating some parameters (e.g., 3D rotations) from data samples in the presence of outliers, and is typically formulated as a non-convex and non-smooth problem.
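As a loose illustration of the reweighting idea that this paper analyzes (and not the rotation-estimation algorithms it studies), the following is a minimal textbook sketch of IRLS for outlier-robust linear regression with an ℓ1 loss; the function name, tolerances, and toy data are illustrative assumptions.

```python
# Minimal textbook sketch of IRLS for outlier-robust *linear* regression with an
# l1 loss; this is NOT the rotation-estimation setting analyzed in the paper,
# only an illustration of the reweighting idea. All names here are hypothetical.
import numpy as np

def irls_l1_regression(A, y, n_iters=50, delta=1e-8):
    """Minimize sum_i |a_i^T x - y_i| by iteratively reweighted least squares."""
    x = np.linalg.lstsq(A, y, rcond=None)[0]       # ordinary LS initialization
    for _ in range(n_iters):
        r = A @ x - y                              # current residuals
        w = 1.0 / np.maximum(np.abs(r), delta)     # small residual -> large weight
        Aw = A * w[:, None]                        # row-wise weighting (W A)
        x = np.linalg.solve(A.T @ Aw, Aw.T @ y)    # weighted normal equations
    return x

# Toy usage: 20% of the responses are grossly corrupted by outliers.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 5))
x_true = rng.standard_normal(5)
y = A @ x_true
y[:40] += 50 * rng.standard_normal(40)             # outliers
x_hat = irls_l1_regression(A, y)
print(np.linalg.norm(x_hat - x_true))               # small despite the outliers
```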

Learning Transition Operators From Sparse Space-Time Samples

no code implementations 1 Dec 2022 Christian Kümmerle, Mauro Maggioni, Sui Tang

This Spatio-Temporal Transition Operator Recovery problem is motivated by the recent interest in learning time-varying graph signals that are driven by graph operators depending on the underlying graph topology.

Low-Rank Matrix Completion
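To make the data model behind this entry concrete, the hedged sketch below simulates snapshots evolving as x_{t+1} = A x_t and fits the transition operator A by ordinary least squares from fully observed snapshot pairs; the paper's actual problem, recovering A from sparse space-time samples of the trajectories, is substantially harder and its algorithm is not reproduced here.

```python
# Hypothetical baseline only: fit a transition operator A from FULLY observed
# snapshot pairs y = A x by least squares. The paper treats the much harder
# setting where only sparse space-time samples of the trajectories are seen.
import numpy as np

rng = np.random.default_rng(1)
d, n_pairs = 20, 200
A_true = rng.standard_normal((d, d)) / np.sqrt(d)    # ground-truth operator

X = rng.standard_normal((d, n_pairs))                # states x_t
Y = A_true @ X                                       # next states x_{t+1} = A x_t

# Least-squares estimate: A_hat = argmin_A ||A X - Y||_F.
A_hat = Y @ np.linalg.pinv(X)
print(np.linalg.norm(A_hat - A_true) / np.linalg.norm(A_true))  # ~ 0 on exact data
```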

A Scalable Second Order Method for Ill-Conditioned Matrix Completion from Few Samples

1 code implementation 3 Jun 2021 Christian Kümmerle, Claudio Mayrink Verdun

We propose an iterative algorithm for low-rank matrix completion that can be interpreted as an iteratively reweighted least squares (IRLS) algorithm, a saddle-escaping smoothing Newton method or a variable metric proximal gradient method applied to a non-convex rank surrogate.

Low-Rank Matrix Completion
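For orientation, here is a basic reweighted-least-squares sketch for low-rank matrix completion with a smoothed non-convex (log-det type) rank surrogate, in the spirit of classical IRLS schemes for rank minimization; it is not the authors' scalable second-order method, and the weighting, smoothing schedule, and toy parameters are illustrative assumptions.

```python
# Basic IRLS sketch for matrix completion with a smoothed non-convex rank
# surrogate. NOT the authors' saddle-escaping second-order algorithm; all
# parameter choices below are illustrative assumptions.
import numpy as np

def irls_matrix_completion(M_obs, mask, rank_guess, n_iters=60, gamma0=1.0):
    d1, d2 = M_obs.shape
    X = np.where(mask, M_obs, 0.0)                  # start from the observed entries
    gamma = gamma0
    for _ in range(n_iters):
        # Weight matrix from the current iterate (smoothed inverse Gram matrix).
        W = np.linalg.inv(X.T @ X + gamma * np.eye(d2))
        # Weighted least-squares step: rows decouple, observed entries stay fixed.
        for i in range(d1):
            obs = mask[i]                           # observed columns in row i
            free = ~obs
            if not free.any():
                continue
            # Minimize v^T W v over the free entries of row i.
            rhs = W[np.ix_(free, obs)] @ M_obs[i, obs]
            X[i, free] = -np.linalg.solve(W[np.ix_(free, free)], rhs)
            X[i, obs] = M_obs[i, obs]
        # Crude smoothing schedule: shrink gamma towards the (rank_guess+1)-th
        # squared singular value, a stand-in for the schedules used in practice.
        s = np.linalg.svd(X, compute_uv=False)
        gamma = min(gamma, s[rank_guess] ** 2 + 1e-12)
    return X

# Toy usage: 30 x 30 matrix of rank 2, roughly half of the entries observed.
rng = np.random.default_rng(2)
d, r = 30, 2
M = rng.standard_normal((d, r)) @ rng.standard_normal((r, d))
mask = rng.random((d, d)) < 0.5
X_hat = irls_matrix_completion(np.where(mask, M, 0.0), mask, rank_guess=r)
print(np.linalg.norm(X_hat - M) / np.linalg.norm(M))  # small if completion succeeds
```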

Escaping Saddle Points in Ill-Conditioned Matrix Completion with a Scalable Second Order Method

1 code implementation 7 Sep 2020 Christian Kümmerle, Claudio M. Verdun

We propose an iterative algorithm for low-rank matrix completion that can be interpreted as both an iteratively reweighted least squares (IRLS) algorithm and a saddle-escaping smoothing Newton method applied to a non-convex rank surrogate objective.

Low-Rank Matrix Completion

Harmonic Mean Iteratively Reweighted Least Squares for Low-Rank Matrix Recovery

1 code implementation 15 Mar 2017 Christian Kümmerle, Juliane Sigl

We propose a new iteratively reweighted least squares (IRLS) algorithm for the recovery of a matrix $X \in \mathbb{C}^{d_1\times d_2}$ of rank $r \ll \min(d_1, d_2)$ from incomplete linear observations, solving a sequence of low-complexity linear problems.

Numerical Analysis · Information Theory · Optimization and Control
