Perturbation Analysis of Randomized SVD and its Applications to Statistics

19 Mar 2022 · Yichi Zhang, Minh Tang

Randomized singular value decomposition (RSVD) is a class of computationally efficient algorithms for computing the truncated SVD of large data matrices. Given an $m \times n$ matrix $\widehat{\mathbf{M}}$, the prototypical RSVD algorithm outputs an approximation of the $k$ leading left singular vectors of $\widehat{\mathbf{M}}$ by computing the SVD of $\widehat{\mathbf{M}} (\widehat{\mathbf{M}}^{\top} \widehat{\mathbf{M}})^{g} \mathbf{G}$; here $g \geq 1$ is an integer and $\mathbf{G} \in \mathbb{R}^{n \times \widetilde{k}}$ is a random Gaussian sketching matrix with $\widetilde{k} \geq k$. In this paper we derive upper bounds for the $\ell_2$ and $\ell_{2,\infty}$ distances between the exact left singular vectors $\widehat{\mathbf{U}}$ of $\widehat{\mathbf{M}}$ and their approximation $\widehat{\mathbf{U}}_g$ (obtained via RSVD), as well as entrywise error bounds when $\widehat{\mathbf{M}}$ is projected onto $\widehat{\mathbf{U}}_g \widehat{\mathbf{U}}_g^{\top}$. These bounds depend on the singular value gap and the number of power iterations $g$; a smaller gap requires larger values of $g$ to guarantee convergence of the $\ell_2$ and $\ell_{2,\infty}$ distances. We apply our theoretical results to settings where $\widehat{\mathbf{M}}$ is an additive perturbation of some unobserved signal matrix $\mathbf{M}$. In particular, we obtain nearly-optimal convergence rates and asymptotic normality results for RSVD on three inference problems, namely, subspace estimation and community detection in random graphs, noisy matrix completion, and PCA with missing data.
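The prototypical RSVD procedure described above admits a short implementation. Below is a minimal NumPy sketch (the function and variable names are ours, not from the paper): in exact arithmetic it computes the leading left singular vectors of $\widehat{\mathbf{M}} (\widehat{\mathbf{M}}^{\top} \widehat{\mathbf{M}})^{g} \mathbf{G}$.

```python
import numpy as np

def rsvd_leading_vectors(M_hat, k, g=1, k_tilde=None, rng=None):
    """Approximate the k leading left singular vectors of M_hat
    via the SVD of M_hat (M_hat^T M_hat)^g G (prototypical RSVD)."""
    m, n = M_hat.shape
    if k_tilde is None:
        k_tilde = k  # sketch size; the paper requires k_tilde >= k
    if rng is None:
        rng = np.random.default_rng()
    G = rng.standard_normal((n, k_tilde))  # Gaussian sketching matrix
    Y = M_hat @ G
    for _ in range(g):  # g >= 1 power iterations: Y = M_hat (M_hat^T M_hat)^g G
        Y = M_hat @ (M_hat.T @ Y)
    # The k leading left singular vectors of the sketch Y serve as U_g,
    # the approximation of the k leading left singular vectors of M_hat.
    U, _, _ = np.linalg.svd(Y, full_matrices=False)
    return U[:, :k]
```

In floating-point arithmetic, practical implementations typically re-orthonormalize $\mathbf{Y}$ (e.g., via a QR factorization) between power iterations to avoid loss of precision; the plain sketch above mirrors the idealized algorithm analyzed in the abstract.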
