Low-Rank Matrix Completion
22 papers with code • 0 benchmarks • 0 datasets
Low-Rank Matrix Completion is an important problem with several applications in areas such as recommendation systems, sketching, and quantum tomography. The goal in matrix completion is to recover a low-rank matrix given a small number of its entries.
Source: Universal Matrix Completion
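To make the setup concrete, here is a minimal sketch (assuming only numpy) that samples a random low-rank matrix, hides most of its entries, and fits it by alternating least squares on the observed entries. The sizes, the rank r, the 30% sampling rate, and the ALS heuristic are all illustrative assumptions rather than details taken from any of the papers listed below.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 100, 80, 3                       # illustrative sizes and rank
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # ground-truth low-rank matrix
mask = rng.random((m, n)) < 0.3            # observe roughly 30% of the entries

# Alternating least squares on the observed entries (one classic heuristic).
U = rng.standard_normal((m, r))
V = rng.standard_normal((n, r))
for _ in range(50):
    for i in range(m):                     # refit row i of U by a small least-squares solve
        obs = mask[i]
        U[i] = np.linalg.lstsq(V[obs], M[i, obs], rcond=None)[0]
    for j in range(n):                     # refit row j of V symmetrically
        obs = mask[:, j]
        V[j] = np.linalg.lstsq(U[obs], M[obs, j], rcond=None)[0]

X = U @ V.T
print("relative error on the unobserved entries:",
      np.linalg.norm((X - M)[~mask]) / np.linalg.norm(M[~mask]))
```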
Most implemented papers
Collaborative Filtering with Graph Information: Consistency and Scalable Methods
Low rank matrix completion plays a fundamental role in collaborative filtering applications, the key idea being that the variables lie in a smaller subspace than the ambient space.
Optimal Low-Rank Matrix Completion: Semidefinite Relaxations and Eigenvector Disjunctions
Low-rank matrix completion consists of computing a matrix of minimal complexity that recovers a given set of observations as accurately as possible, and has numerous applications such as product recommendation.
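The classical convex route to "minimal complexity subject to the observations" replaces rank with the nuclear norm. The sketch below expresses that relaxation with cvxpy (an assumed dependency); it is deliberately the textbook relaxation, not the semidefinite and disjunctive machinery developed in this paper.

```python
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(1)
m, n, r = 30, 30, 2
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
observed = [(i, j) for i in range(m) for j in range(n) if rng.random() < 0.5]

X = cp.Variable((m, n))
problem = cp.Problem(cp.Minimize(cp.normNuc(X)),   # nuclear norm as a convex surrogate for rank
                     [X[i, j] == M[i, j] for (i, j) in observed])
problem.solve()

# The leading singular values of the solution should dominate the rest.
print("top singular values:", np.round(np.linalg.svd(X.value, compute_uv=False)[:5], 3))
```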
Guaranteed Rank Minimization via Singular Value Projection
Minimizing the rank of a matrix subject to affine constraints is a fundamental problem with many important applications in machine learning and statistics.
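Specialized to matrix completion, the singular value projection idea can be sketched as projected gradient with hard rank thresholding: step along the observed-entry residual, then truncate the SVD back to rank r. The step size, iteration count, and problem sizes below are illustrative choices, not the paper's.

```python
import numpy as np

def svp_complete(M_obs, mask, r, step=1.0, iters=200):
    """Projected gradient in the spirit of SVP: gradient step on the
    observed-entry residual, then projection onto the rank-r matrices."""
    X = np.zeros_like(M_obs)
    for _ in range(iters):
        grad = mask * (X - M_obs)                 # gradient of 0.5 * ||P_Omega(X - M)||_F^2
        U, s, Vt = np.linalg.svd(X - step * grad, full_matrices=False)
        X = (U[:, :r] * s[:r]) @ Vt[:r]           # keep only the top-r singular values
    return X

rng = np.random.default_rng(2)
m, n, r = 60, 50, 2
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
mask = (rng.random((m, n)) < 0.4).astype(float)
X_hat = svp_complete(M * mask, mask, r)
print("relative error:", np.linalg.norm(X_hat - M) / np.linalg.norm(M))
```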
A Gradient Descent Algorithm on the Grassman Manifold for Matrix Completion
We consider the problem of reconstructing a low-rank matrix from a small subset of its entries.
Orthogonal Rank-One Matrix Pursuit for Low Rank Matrix Completion
Numerical results show that our proposed algorithm is more efficient than competing algorithms while achieving similar or better prediction performance.
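The greedy scheme named in the title can be read, in hedged form, as follows: at each step take the top singular vector pair of the observed-entry residual as a new rank-one basis matrix, then refit the weights of all selected bases by least squares on the observations. The sketch below follows that reading and skips the paper's efficiency-oriented implementation details.

```python
import numpy as np

def rank_one_pursuit(M_obs, mask, steps=10):
    """Hedged sketch of greedy rank-one pursuit; mask is a boolean array of
    observed entries, and all basis weights are refit after every new term."""
    bases, X = [], np.zeros_like(M_obs)
    for _ in range(steps):
        R = mask * (M_obs - X)                          # residual on the observed entries
        U, s, Vt = np.linalg.svd(R, full_matrices=False)
        bases.append(np.outer(U[:, 0], Vt[0]))          # new rank-one basis matrix
        A = np.stack([B[mask] for B in bases], axis=1)  # bases restricted to the observations
        w = np.linalg.lstsq(A, M_obs[mask], rcond=None)[0]
        X = sum(wk * Bk for wk, Bk in zip(w, bases))
    return X
```

Refitting all weights at each step, rather than only the newest one, is the "orthogonal" ingredient assumed in this sketch.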
Depth Image Inpainting: Improving Low Rank Matrix Completion with Low Gradient Regularization
The proposed low-gradient regularization is integrated with the low-rank regularization into a joint low-rank, low-gradient approach for depth image inpainting.
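One hedged way to read "integrated" is as a single objective with a data-fit term on the observed depth pixels, a nuclear-norm term for low rank, and a penalty on finite differences standing in for the low-gradient term. The sketch below only evaluates such an objective; the l1 finite-difference penalty and the weights are illustrative stand-ins, not the paper's exact regularizer.

```python
import numpy as np

def low_rank_low_gradient_objective(X, D_obs, mask, lam=0.1, mu=0.05):
    """Evaluate a combined objective: data fit on observed depth pixels,
    nuclear norm (low rank), and an l1 penalty on finite differences
    (a stand-in for the low-gradient term). Weights are illustrative."""
    data_fit = 0.5 * np.sum((mask * (X - D_obs)) ** 2)
    nuclear = np.sum(np.linalg.svd(X, compute_uv=False))
    low_gradient = np.sum(np.abs(np.diff(X, axis=0))) + np.sum(np.abs(np.diff(X, axis=1)))
    return data_fit + lam * nuclear + mu * low_gradient
```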
Riemannian stochastic variance reduced gradient on Grassmann manifold
In this paper, we propose a novel Riemannian extension of the Euclidean stochastic variance reduced gradient algorithm (R-SVRG) to a compact manifold search space.
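For reference, the Euclidean base algorithm that R-SVRG lifts to a manifold is sketched below: a full-gradient snapshot once per epoch plus variance-reduced stochastic steps. The Riemannian version additionally uses retraction and vector transport in place of the plain vector arithmetic shown here; that manifold machinery is deliberately omitted.

```python
import numpy as np

def svrg(grad_i, x0, n, step=0.01, epochs=20, inner=None):
    """Plain Euclidean SVRG: grad_i(x, i) returns the gradient of the i-th
    of n loss terms; one full-gradient snapshot is taken per epoch."""
    rng = np.random.default_rng(0)
    x = x0.copy()
    inner = inner if inner is not None else n
    for _ in range(epochs):
        x_snap = x.copy()
        full_grad = sum(grad_i(x_snap, i) for i in range(n)) / n   # snapshot gradient
        for _ in range(inner):
            i = rng.integers(n)
            v = grad_i(x, i) - grad_i(x_snap, i) + full_grad       # variance-reduced direction
            x = x - step * v
    return x
```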
Riemannian stochastic variance reduced gradient algorithm with retraction and vector transport
In recent years, stochastic variance reduction algorithms have attracted considerable attention for minimizing the average of a large but finite number of loss functions.
Algebraic Variety Models for High-Rank Matrix Completion
We consider a generalization of low-rank matrix completion to the case where the data belongs to an algebraic variety, i.e., each data point is a solution to a system of polynomial equations.
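One way to make the variety idea concrete is the lifting trick: map each data column to all monomials up to some degree, so that data satisfying polynomial equations produce a rank-deficient lifted matrix even when the original matrix has high rank. The example below uses points on the unit circle and is an assumed illustration, not code from the paper.

```python
import numpy as np
from itertools import combinations_with_replacement

def monomial_lift(X, degree=2):
    """Map each column of X (shape d x n) to all monomials of its
    coordinates up to the given degree, including the constant 1."""
    d, n = X.shape
    feats = [np.ones(n)]
    for deg in range(1, degree + 1):
        for idx in combinations_with_replacement(range(d), deg):
            feats.append(np.prod(X[list(idx), :], axis=0))
    return np.stack(feats)

# Points on the unit circle satisfy x^2 + y^2 - 1 = 0, so the degree-2 lift
# of these 2 x 50 data loses exactly one rank (5 instead of 6).
theta = np.linspace(0, 2 * np.pi, 50, endpoint=False)
X = np.stack([np.cos(theta), np.sin(theta)])
print("lifted rank:", np.linalg.matrix_rank(monomial_lift(X, degree=2)))
```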