Escaping Saddle Points in Ill-Conditioned Matrix Completion with a Scalable Second Order Method

7 Sep 2020 · Christian Kümmerle, Claudio M. Verdun

We propose an iterative algorithm for low-rank matrix completion that can be interpreted both as an iteratively reweighted least squares (IRLS) algorithm and as a saddle-escaping smoothing Newton method applied to a non-convex rank surrogate objective. It combines the favorable data efficiency of previous IRLS approaches with scalability improved by several orders of magnitude. Our method attains a local quadratic convergence rate already for a number of samples close to the information-theoretic limit. We show in numerical experiments that, unlike many state-of-the-art approaches, our approach is able to complete very ill-conditioned matrices with condition numbers of up to $10^{10}$ from few samples.
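To make the IRLS viewpoint concrete, below is a minimal sketch of a classical iteratively reweighted least squares iteration for matrix completion, using one-sided reweighting with a smoothing parameter that is shrunk toward the $(r+1)$-st singular value. This is a generic illustration, not the paper's saddle-escaping smoothing Newton method; the function name, initialization, and stopping choices are assumptions made for the example.

```python
import numpy as np

def irls_matrix_completion(M_obs, mask, rank, n_iters=100, eps=None):
    """Generic one-sided IRLS sketch for matrix completion (illustrative only).

    M_obs : (m, n) array with observed entries (zeros elsewhere)
    mask  : (m, n) boolean array, True where an entry is observed
    rank  : target rank r used to shrink the smoothing parameter
    """
    m, n = M_obs.shape
    X = M_obs.copy()                                  # start from the zero-filled data
    if eps is None:
        eps = np.linalg.norm(M_obs, 2)                # smoothing on the scale of the data

    for _ in range(n_iters):
        # Weight matrix W = (X X^T + eps^2 I)^{-1/2}, computed via a full SVD of X.
        U, s, _ = np.linalg.svd(X, full_matrices=True)
        s2 = np.zeros(m)
        s2[: len(s)] = s**2
        W = U @ np.diag(1.0 / np.sqrt(s2 + eps**2)) @ U.T

        # Weighted least squares step: minimize sum_j x_j^T W x_j subject to the
        # observed entries of each column being fixed; this decouples over columns
        # and has the closed form x_free = -W_ff^{-1} W_fo x_obs.
        X_new = np.zeros_like(X)
        for j in range(n):
            obs = mask[:, j]
            free = ~obs
            x = np.zeros(m)
            x[obs] = M_obs[obs, j]
            if free.any() and obs.any():
                x[free] = -np.linalg.solve(
                    W[np.ix_(free, free)], W[np.ix_(free, obs)] @ x[obs]
                )
            X_new[:, j] = x
        X = X_new

        # Shrink the smoothing parameter toward the (r+1)-st singular value.
        s_new = np.linalg.svd(X, compute_uv=False)
        if len(s_new) > rank:
            eps = min(eps, s_new[rank])
    return X

if __name__ == "__main__":
    # Small synthetic example: rank-3 matrix, roughly half the entries observed.
    rng = np.random.default_rng(0)
    m, n, r = 50, 50, 3
    M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
    mask = rng.random((m, n)) < 0.5
    X_hat = irls_matrix_completion(M * mask, mask, rank=r)
    print("relative error:", np.linalg.norm(X_hat - M) / np.linalg.norm(M))
```

The paper's method differs from this sketch in how the weighted least squares step is formulated and solved, which is what yields the reported scalability and the robustness to very large condition numbers.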
