Robust spectral compressive sensing via vanilla gradient descent

21 Jan 2021  ·  Xunmeng Wu, Zai Yang, Zongben Xu

This paper investigates the recovery of a spectrally sparse signal from its partially revealed noisy entries within the framework of spectral compressive sensing. Nonconvex optimization approaches have recently been proposed based on low-rank Hankel matrix completion and projected gradient descent (PGD). PGD, however, involves unknown tuning parameters, and its theoretical analysis is available only in the absence of noise. In this paper, we propose a hyperparameter-free, vanilla gradient descent (VGD) algorithm and prove that VGD enables robust recovery of an $N$-dimensional, $K$-spectrally-sparse signal from on the order of $K^2 \log^2 N$ noisy samples under coherence and other mild conditions. This sample complexity is larger by a factor of $\log N$ than that of PGD in the noiseless case. Numerical simulations are provided that corroborate our analysis and show the advantageous performance of VGD.
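The abstract does not spell out the objective, initialization, or step size used by VGD, so the following is only a minimal sketch of the general pipeline it describes: lift the partially observed signal into a Hankel matrix, factor that matrix into a rank-$K$ product, and run plain gradient descent against the observed entries. All concrete choices below (the least-squares loss on observed anti-diagonals, the truncated-SVD initialization, and the fixed heuristic step size `eta`) are illustrative assumptions, not the paper's hyperparameter-free algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Problem sizes (illustrative, not taken from the paper)
N, K = 127, 3                        # signal length, spectral sparsity
n1 = (N + 1) // 2                    # Hankel matrix is n1 x n2 with n1 + n2 = N + 1
n2 = N + 1 - n1

# Ground truth: x[n] = sum_k c_k * exp(2*pi*1j*f_k*n), a K-spectrally-sparse signal
freqs = rng.uniform(0.0, 1.0, K)
coeffs = rng.standard_normal(K) + 1j * rng.standard_normal(K)
n_grid = np.arange(N)
x_true = (coeffs * np.exp(2j * np.pi * np.outer(n_grid, freqs))).sum(axis=1)

# Partial noisy observations on a random index set Omega
m = 60                               # number of revealed entries (illustrative)
omega = rng.choice(N, size=m, replace=False)
observed = np.zeros(N, dtype=bool)
observed[omega] = True
y = np.zeros(N, dtype=complex)
y[omega] = x_true[omega] + 0.05 * (rng.standard_normal(m) + 1j * rng.standard_normal(m))

# Hankel entry (i, j) holds sample index i + j; mask the observed anti-diagonals
idx = np.arange(n1)[:, None] + np.arange(n2)[None, :]
mask = observed[idx]
Y = y[idx]                           # zero-filled Hankel matrix of the observations

# Factored model H ~ L @ R^H, initialized from a truncated SVD (an assumption here)
U, s, Vh = np.linalg.svd((N / m) * Y, full_matrices=False)
L = U[:, :K] * np.sqrt(s[:K])
R = Vh[:K].conj().T * np.sqrt(s[:K])

eta = 0.25 / s[0]                    # heuristic fixed step size (assumption)
for _ in range(3000):
    E = mask * (L @ R.conj().T - Y)  # residual on observed Hankel entries only
    grad_L = E @ R                   # Wirtinger-style gradients of 0.5 * ||E||_F^2
    grad_R = E.conj().T @ L
    L -= eta * grad_L
    R -= eta * grad_R

# Read the signal back off by averaging the completed Hankel matrix along anti-diagonals
H = L @ R.conj().T
x_hat = np.zeros(N, dtype=complex)
counts = np.zeros(N)
np.add.at(x_hat, idx.ravel(), H.ravel())
np.add.at(counts, idx.ravel(), 1.0)
x_hat /= counts

print("relative recovery error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

Each iteration of this sketch is dominated by the two $n_1 \times n_2$ matrix products; in the Hankel-based spectral compressive sensing literature such products are commonly accelerated with FFT-based Hankel multiplications.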


Categories


Information Theory
