Efficient Online Minimization for Low-Rank Subspace Clustering

Low-rank representation (LRR) is a prominent method for segmenting data generated from a union of subspaces. It is known, however, that solving the LRR program is challenging in both time complexity and memory footprint, because the nuclear-norm-regularized matrix is $n$-by-$n$ (where $n$ is the number of samples). In this paper, we develop a fast online implementation of LRR that reduces the memory cost from $O(n^2)$ to $O(pd)$, with $p$ being the ambient dimension and $d$ being an estimated rank ($d < p \ll n$). The key to this end is a non-convex reformulation of the LRR program, which pursues the basis dictionary that generates the (uncorrupted) observations. We establish a theoretical guarantee that the sequence of solutions produced by our algorithm converges asymptotically to a stationary point of both the empirical and the expected loss functions. Extensive experiments on synthetic and real-world datasets further substantiate that our algorithm is fast, robust, and memory-efficient.
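To make the $O(pd)$ memory claim concrete: factored reformulations of this kind typically rest on the variational identity $\|Z\|_* = \min_{Z = UV} \tfrac{1}{2}(\|U\|_F^2 + \|V\|_F^2)$, which lets a streaming algorithm carry only the small factors rather than the full $n \times n$ representation matrix. Below is a minimal sketch of one such online scheme under assumed simplifications: the coefficient step is a plain ridge fit and the dictionary step is a Mairal-style block-coordinate pass over columns. The function name, the regularizer `lam`, and the loss are illustrative assumptions, not the paper's exact algorithm (in particular, its handling of corruptions is omitted).

```python
import numpy as np

def online_lrr_sketch(stream, p, d, lam=0.1, eps=1e-8, seed=0):
    """Minimal sketch of online basis-dictionary pursuit (assumed form).

    Streams samples x_t in R^p while keeping only O(pd) state: the basis
    dictionary D (p x d) plus two small accumulators (d x d and p x d),
    instead of the n x n matrix of batch LRR.
    """
    rng = np.random.default_rng(seed)
    D = rng.standard_normal((p, d)) / np.sqrt(p)  # initial basis guess
    A = np.zeros((d, d))  # running sum of z z^T
    B = np.zeros((p, d))  # running sum of x z^T
    for x in stream:
        # Coefficient step: z = argmin_z ||x - D z||^2 + lam ||z||^2.
        z = np.linalg.solve(D.T @ D + lam * np.eye(d), D.T @ x)
        A += np.outer(z, z)
        B += np.outer(x, z)
        # Dictionary step: one block-coordinate pass over the columns,
        # each update touching only the small accumulators A and B.
        for j in range(d):
            D[:, j] += (B[:, j] - D @ A[:, j]) / (A[j, j] + eps)
    return D

# Usage: 1,000 noisy samples from a 5-dimensional subspace of R^50.
rng = np.random.default_rng(1)
U = np.linalg.qr(rng.standard_normal((50, 5)))[0]  # true basis
stream = (U @ rng.standard_normal(5)
          + 0.01 * rng.standard_normal(50) for _ in range(1000))
D_hat = online_lrr_sketch(stream, p=50, d=5)
```

Note how each sample is seen once and then discarded: all per-sample information the dictionary update needs is folded into the $d \times d$ and $p \times d$ accumulators, which is what keeps the state at $O(pd)$.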
