In recent years, stochastic variance reduction algorithms have attracted considerable attention for minimizing the average of a large but finite number of loss functions.
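One representative method in this family is SVRG; the sketch below (an illustration, not necessarily the specific algorithm the source refers to) shows the core idea of anchoring stochastic gradients to a periodically recomputed full gradient, applied to a least-squares objective.

```python
import numpy as np

def svrg(grad_i, x0, n, lr=0.05, epochs=30, seed=0):
    """SVRG for f(x) = (1/n) * sum_i f_i(x).

    grad_i(i, x) returns the gradient of the i-th loss at x.
    """
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    for _ in range(epochs):
        snapshot = x.copy()
        # Full gradient at the snapshot, recomputed once per epoch.
        full_grad = np.mean([grad_i(i, snapshot) for i in range(n)], axis=0)
        for _ in range(n):
            i = rng.integers(n)
            # Variance-reduced stochastic gradient: the correction term
            # vanishes in expectation but shrinks the variance near x*.
            g = grad_i(i, x) - grad_i(i, snapshot) + full_grad
            x -= lr * g
    return x

# Hypothetical example: least squares, f_i(x) = 0.5 * (a_i @ x - b_i)^2
rng = np.random.default_rng(1)
A = rng.normal(size=(50, 5))
x_true = rng.normal(size=5)
b = A @ x_true
grad = lambda i, x: (A[i] @ x - b[i]) * A[i]
x_hat = svrg(grad, np.zeros(5), n=50)
```

Because each inner step touches only one sample plus the cached full gradient, the per-iteration cost matches plain SGD while the gradient variance decays as the iterate approaches the optimum.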
The proposed low-gradient regularization is combined with low-rank regularization, yielding a low-rank, low-gradient approach to depth image inpainting.
Low-rank matrix completion plays a fundamental role in collaborative filtering applications; the key idea is that the data lie in a subspace of much smaller dimension than the ambient space.
Compared to the max norm and the factored formulation of the nuclear norm, factor group-sparse regularizers are more efficient, accurate, and robust to the initial guess of rank.
In this work, we show that a simple modification of our robust subspace tracking (ST) solution also provably solves ST with missing entries (ST-miss) and robust ST-miss.
We consider the problem of reconstructing a low-rank matrix from a small subset of its entries.