Low-Rank Matrix Completion
17 papers with code • 0 benchmarks • 0 datasets
Low-Rank Matrix Completion is an important problem with applications in areas such as recommender systems, sketching, and quantum tomography. The goal of matrix completion is to recover a low-rank matrix given only a small number of its entries.
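A common heuristic for this recovery problem is iterative SVD soft-thresholding (a SoftImpute-style approach to nuclear-norm minimization). The sketch below is illustrative only; the data, mask, and parameters (`tau`, iteration count) are assumptions, not taken from any paper on this page.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth rank-2 matrix and a mask revealing roughly half of its entries.
m, n, r = 20, 15, 2
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
mask = rng.random((m, n)) < 0.5

X = np.zeros((m, n))   # current estimate of the full matrix
tau = 1.0              # soft-threshold applied to the singular values
for _ in range(200):
    # Keep the observed entries fixed; fill missing ones from the estimate.
    filled = np.where(mask, M, X)
    U, s, Vt = np.linalg.svd(filled, full_matrices=False)
    # Singular-value shrinkage encourages a low-rank estimate.
    X = (U * np.maximum(s - tau, 0.0)) @ Vt

err = np.linalg.norm((X - M)[~mask]) / np.linalg.norm(M[~mask])
print(f"relative error on unobserved entries: {err:.3f}")
```

With enough observed entries relative to the rank, the unobserved entries are typically recovered to small relative error.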
Source: Universal Matrix Completion
These leaderboards are used to track progress in Low-Rank Matrix Completion.
Low-rank matrix completion plays a fundamental role in collaborative filtering applications: the key idea is that the data lie in a lower-dimensional subspace of the ambient space.
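In collaborative filtering this idea is often exploited by factoring the ratings matrix into low-dimensional user and item factors, fit by alternating least squares (ALS) on the observed entries. The following is a minimal sketch under assumed sizes and a small ridge term; none of the names or values come from the papers listed here.

```python
import numpy as np

rng = np.random.default_rng(1)
n_users, n_items, k = 12, 10, 2
# Synthetic ratings with rank-k latent structure; ~60% of entries observed.
R_true = rng.random((n_users, k)) @ rng.random((k, n_items))
observed = rng.random((n_users, n_items)) < 0.6

U = rng.standard_normal((n_users, k))
V = rng.standard_normal((n_items, k))
lam = 0.1  # ridge term keeps each per-row least-squares solve well conditioned

for _ in range(30):
    for i in range(n_users):   # fix item factors, solve for each user's factors
        idx = observed[i]
        A = V[idx].T @ V[idx] + lam * np.eye(k)
        U[i] = np.linalg.solve(A, V[idx].T @ R_true[i, idx])
    for j in range(n_items):   # fix user factors, solve for each item's factors
        idx = observed[:, j]
        A = U[idx].T @ U[idx] + lam * np.eye(k)
        V[j] = np.linalg.solve(A, U[idx].T @ R_true[idx, j])

pred = U @ V.T
rmse = np.sqrt(np.mean((pred - R_true)[~observed] ** 2))
print(f"held-out RMSE: {rmse:.3f}")
```

Each inner solve is a small ridge regression over only that user's (or item's) observed entries, which is what makes ALS practical on sparse rating data.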
Minimizing the rank of a matrix subject to affine constraints is a fundamental problem with many important applications in machine learning and statistics.
Numerical results show that our proposed algorithm is more efficient than competing algorithms while achieving similar or better prediction performance.
The proposed low-gradient regularization is combined with low-rank regularization in a low-rank, low-gradient approach for depth-image inpainting.
In this paper, we propose R-SVRG, a novel Riemannian extension of the Euclidean stochastic variance-reduced gradient algorithm for optimization over a compact manifold search space.
In recent years, stochastic variance-reduction algorithms have attracted considerable attention for minimizing the average of a large but finite number of loss functions.
In this work, we show that a simple modification of our robust ST solution also provably solves ST-miss and robust ST-miss.
In this work, we show that the skewed distribution of ratings in the user-item rating matrix of real-world datasets affects the accuracy of matrix-completion-based approaches.