Low-Rank Matrix Completion

25 papers with code • 0 benchmarks • 0 datasets

Low-Rank Matrix Completion is an important problem with applications in areas such as recommendation systems, sketching, and quantum tomography. The goal in matrix completion is to recover a low-rank matrix from a small number of its observed entries.

Source: Universal Matrix Completion
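As a concrete illustration of the problem setup, the sketch below recovers a low-rank matrix from a partial observation mask via iterative singular-value soft-thresholding (a SoftImpute-style heuristic). This is a generic baseline under assumed parameters (`tau`, `iters`), not the method of any paper listed on this page.

```python
# Hedged sketch: low-rank matrix completion by iterative SVD
# soft-thresholding. Observed entries are held fixed; missing
# entries are filled by a shrunken low-rank reconstruction.
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth rank-2 matrix and a random ~50% observation mask.
m, n, r = 20, 15, 2
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
mask = rng.random((m, n)) < 0.5

def soft_impute(M_obs, mask, tau=0.5, iters=200):
    """Fill missing entries, then repeatedly shrink singular values."""
    X = np.where(mask, M_obs, 0.0)
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        X = (U * np.maximum(s - tau, 0.0)) @ Vt  # singular-value shrinkage
        X[mask] = M_obs[mask]                    # re-impose observed entries
    return X

X_hat = soft_impute(M, mask)
err = np.linalg.norm((X_hat - M)[~mask]) / np.linalg.norm(M[~mask])
print(f"relative error on unobserved entries: {err:.3f}")
```

The shrinkage step is the proximal operator of the nuclear norm, which is the standard convex surrogate for rank in this literature.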

Most implemented papers

Guaranteed Tensor Recovery Fused Low-rankness and Smoothness

wanghailin97/Guaranteed-Tensor-Recovery-Fused-Low-rankness-and-Smoothness 4 Feb 2023

Recent research has made significant progress by adopting two insightful tensor priors, i.e., global low-rankness (L) and local smoothness (S) across different tensor modes, which are usually encoded as the sum of two separate regularization terms in the recovery model.
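To make the "sum of two separate regularization terms" concrete, here is a hedged 2-D stand-in for such an objective: a nuclear-norm term for global low-rankness plus a total-variation-style term for local smoothness. The function name and the weight `lam` are illustrative assumptions, not the paper's exact tensor formulation.

```python
# Hedged sketch of a combined low-rankness (L) + smoothness (S)
# regularizer, shown for a 2-D matrix rather than a general tensor.
import numpy as np

def lr_plus_smooth_objective(X, lam=0.1):
    """Nuclear norm (low-rankness) plus anisotropic total variation
    (smoothness), combined as a weighted sum."""
    nuclear = np.linalg.svd(X, compute_uv=False).sum()
    tv = np.abs(np.diff(X, axis=0)).sum() + np.abs(np.diff(X, axis=1)).sum()
    return nuclear + lam * tv
```

A recovery model would minimize this objective subject to agreement with the observed entries; the cited work instead fuses the two priors rather than summing independent terms.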

Teaching Arithmetic to Small Transformers

lee-ny/teaching_arithmetic 7 Jul 2023

Even in the complete absence of pretraining, this approach significantly and simultaneously improves accuracy, sample complexity, and convergence speed.

Efficient Compression of Overparameterized Deep Models through Low-Dimensional Learning Dynamics

soominkwon/comp-deep-nets 8 Nov 2023

We empirically evaluate the effectiveness of our compression technique on matrix recovery problems.

Linear Recursive Feature Machines provably recover low-rank matrices

aradha/lin-rfm 9 Jan 2024

A possible explanation is that common training algorithms for neural networks implicitly perform dimensionality reduction - a process called feature learning.

Matrix Completion with Convex Optimization and Column Subset Selection

ZAL-NASK/CSMC 4 Mar 2024

We present two algorithms that implement our Columns Selected Matrix Completion (CSMC) method, each dedicated to a different size problem.