Low-Rank Matrix Completion
25 papers with code • 0 benchmarks • 0 datasets
Low-Rank Matrix Completion is an important problem with applications in areas such as recommendation systems, sketching, and quantum tomography. The goal in matrix completion is to recover a low-rank matrix given only a small number of its entries.
Source: Universal Matrix Completion
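As a concrete illustration (a minimal sketch, not taken from the source paper), the NumPy snippet below recovers a low-rank matrix from a subset of its entries by alternating a rank-r SVD projection with re-imposing the observed values; the rank, iteration count, and tolerance are assumed hyperparameters.

```python
import numpy as np

def complete_low_rank(M_obs, mask, rank=2, n_iters=200, tol=1e-6):
    """Fill the unobserved entries of M_obs (mask == False) with a rank-`rank`
    estimate by alternating a rank-r SVD projection with re-imposing the
    observed entries. `rank`, `n_iters`, and `tol` are illustrative defaults."""
    X = np.where(mask, M_obs, 0.0)
    for _ in range(n_iters):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        X_new = (U[:, :rank] * s[:rank]) @ Vt[:rank]   # best rank-r approximation
        X_new = np.where(mask, M_obs, X_new)           # keep observed entries fixed
        if np.linalg.norm(X_new - X) <= tol * np.linalg.norm(X):
            return X_new
        X = X_new
    return X

# Usage: recover a random rank-2 matrix from roughly half of its entries.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 40))
mask = rng.random(A.shape) < 0.5
A_hat = complete_low_rank(np.where(mask, A, 0.0), mask, rank=2)
print("relative error:", np.linalg.norm(A_hat - A) / np.linalg.norm(A))
```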
Most implemented papers
Guaranteed Tensor Recovery Fused Low-rankness and Smoothness
Recent research has made significant progress by adopting two insightful tensor priors, i.e., global low-rankness (L) and local smoothness (S) across different tensor modes, which are typically encoded as the sum of two separate regularization terms in the recovery model.
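A generic form of such a fused objective, written in notation assumed here rather than taken from the paper, is

$$
\min_{\mathcal{X}}\; \|\mathcal{X}\|_{*} \;+\; \lambda \sum_{k} \|\nabla_{k}\,\mathcal{X}\|_{1}
\quad \text{s.t.}\quad \mathcal{P}_{\Omega}(\mathcal{X}) = \mathcal{P}_{\Omega}(\mathcal{M}),
$$

where $\|\mathcal{X}\|_{*}$ is a tensor nuclear norm encoding global low-rankness (L), $\nabla_{k}$ is a first-order difference operator along mode $k$ encoding local smoothness (S), $\lambda$ balances the two terms, and $\mathcal{P}_{\Omega}$ restricts to the observed entries of the data tensor $\mathcal{M}$.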
Teaching Arithmetic to Small Transformers
Even in the complete absence of pretraining, this approach significantly and simultaneously improves accuracy, sample complexity, and convergence speed.
Efficient Compression of Overparameterized Deep Models through Low-Dimensional Learning Dynamics
We empirically evaluate the effectiveness of our compression technique on matrix recovery problems.
Linear Recursive Feature Machines provably recover low-rank matrices
A possible explanation is that common training algorithms for neural networks implicitly perform dimensionality reduction - a process called feature learning.
Matrix Completion with Convex Optimization and Column Subset Selection
We present two algorithms that implement our Columns Selected Matrix Completion (CSMC) method, each dedicated to a different size problem.
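As a rough, hedged sketch of the column-subset idea behind such methods (not the paper's two CSMC algorithms), one can first complete a selected column submatrix and then fit every column of the matrix to its low-rank column space using only that column's observed entries; the helper below and its parameters are hypothetical.

```python
import numpy as np

def csmc_style_completion(M_obs, mask, col_idx, rank=2, n_iters=200):
    """Illustrative column-subset completion (not the paper's algorithms):
    complete the selected columns with iterative SVD projection, then express
    all columns in the resulting rank-`rank` basis via least squares."""
    # Step 1: complete the selected column submatrix.
    C = np.where(mask[:, col_idx], M_obs[:, col_idx], 0.0)
    for _ in range(n_iters):
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        C_full = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        C = np.where(mask[:, col_idx], M_obs[:, col_idx], C_full)
    U_r = np.linalg.svd(C, full_matrices=False)[0][:, :rank]  # basis of completed columns

    # Step 2: fit each column to that basis using its observed rows only.
    X = np.zeros_like(M_obs)
    for j in range(M_obs.shape[1]):
        rows = mask[:, j]
        coef, *_ = np.linalg.lstsq(U_r[rows], M_obs[rows, j], rcond=None)
        X[:, j] = U_r @ coef
    return X

# Usage sketch (hypothetical): complete A from partial observations using its
# first five columns as the selected subset.
# A_hat = csmc_style_completion(np.where(mask, A, 0.0), mask,
#                               col_idx=np.arange(5), rank=2)
```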