Tackling small eigen-gaps: Fine-grained eigenvector estimation and inference under heteroscedastic noise

14 Jan 2020  ·  Chen Cheng, Yuting Wei, Yuxin Chen ·

This paper aims to address two fundamental challenges arising in eigenvector estimation and inference for a low-rank matrix from noisy observations: (1) how to estimate an unknown eigenvector when the eigen-gap (i.e., the spacing between the associated eigenvalue and the rest of the spectrum) is particularly small; (2) how to perform estimation and inference on linear functionals of an eigenvector -- a sort of "fine-grained" statistical reasoning that goes far beyond the usual $\ell_2$ analysis. We investigate how to address these challenges in a setting where the unknown $n\times n$ matrix is symmetric and the additive noise matrix contains independent (and non-symmetric) entries. Based on eigen-decomposition of the asymmetric data matrix, we propose estimation and uncertainty quantification procedures for an unknown eigenvector, which further allow us to reason about linear functionals of an unknown eigenvector. The proposed procedures and the accompanying theory enjoy several important features: they are (1) distribution-free (i.e., no prior knowledge about the noise distributions is needed); (2) adaptive to heteroscedastic noise; (3) minimax optimal under Gaussian noise. Along the way, we establish optimal procedures for constructing confidence intervals for the unknown eigenvalues. All of this is guaranteed even in the presence of a small eigen-gap (up to $O(\sqrt{n/\mathrm{poly}\log (n)})$ times smaller than the requirement in prior theory), which goes significantly beyond what generic matrix perturbation theory has to offer.
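To make the central idea concrete, here is a minimal sketch (not the authors' exact procedure) of the asymmetric eigen-decomposition device the abstract describes: a rank-1 symmetric signal matrix is corrupted by independent, non-symmetric noise, and the leading eigenpair of the raw asymmetric data matrix is used to estimate the unknown eigenvalue and eigenvector. The dimension `n`, signal strength `lam`, and noise level `sigma` below are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Rank-1 symmetric ground truth: M* = lam * u u^T, with u a unit eigenvector.
lam = 10.0
u = rng.standard_normal(n)
u /= np.linalg.norm(u)
M_star = lam * np.outer(u, u)

# Independent, NON-symmetric noise entries H[i, j] (here homoscedastic
# Gaussian for simplicity; the paper allows heteroscedastic variances).
sigma = 1.0 / np.sqrt(n)
H = sigma * rng.standard_normal((n, n))

# Eigen-decomposition of the asymmetric data matrix M = M* + H
# (no symmetrization step, by design).
M = M_star + H
eigvals, eigvecs = np.linalg.eig(M)
k = np.argmax(np.abs(eigvals))

lam_hat = eigvals[k].real          # leading-eigenvalue estimate
u_hat = eigvecs[:, k].real
u_hat /= np.linalg.norm(u_hat)

# Alignment with the truth, up to a global sign.
corr = abs(u_hat @ u)
print(f"eigenvalue error: {abs(lam_hat - lam):.4f}, |<u_hat, u>|: {corr:.4f}")
```

In this setup the leading eigenvalue of the asymmetric matrix concentrates tightly around `lam`, and the leading eigenvector aligns closely with `u`; the paper's contribution is the fine-grained theory (entrywise and linear-functional guarantees, confidence intervals, small eigen-gaps) that goes well beyond this simple demonstration.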
