no code implementations • 9 Feb 2024 • Jianhao Ma, Salar Fattahi
In the over-parameterized regime where $r'\geq r$, we show that, with $\widetilde\Omega(dr^9)$ observations, GD with an initial point $\|\mathrm{U}_0\| \leq \epsilon$ converges near-linearly to an $\epsilon$-neighborhood of $\mathrm{X}^\star$.
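The excerpt above does not include the paper's parameter choices, so the following is only a minimal numerical sketch of the described setting: gradient descent on an over-parameterized symmetric factorization $\mathrm{U}\mathrm{U}^\top \approx \mathrm{X}^\star$ with width $r' > r$ and a small initialization. The dimensions, step size, and iteration count are hypothetical illustrations, not the paper's values, and full observation of $\mathrm{X}^\star$ is assumed in place of the paper's measurement model.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r, r_over = 20, 2, 5  # hypothetical: true rank r, over-parameterized width r' >= r

# Ground-truth low-rank PSD matrix X* = U* U*^T
U_star = rng.standard_normal((d, r))
X_star = U_star @ U_star.T

# Small initialization: ||U_0|| <= eps
eps = 1e-3
U = eps * rng.standard_normal((d, r_over)) / np.sqrt(d * r_over)

eta = 0.002  # step size (illustrative choice, not from the paper)
for _ in range(5000):
    # gradient of f(U) = (1/2) * ||U U^T - X*||_F^2
    grad = 2 * (U @ U.T - X_star) @ U
    U -= eta * grad

# relative recovery error of the reconstructed matrix
err = np.linalg.norm(U @ U.T - X_star) / np.linalg.norm(X_star)
```

Despite the extra columns in $\mathrm{U}$, the small initialization keeps the excess-rank components negligible while the signal components grow, so the iterates approach $\mathrm{X}^\star$.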
1 code implementation • 24 May 2023 • Jianhao Ma, Rui Ray Chen, Yinghui He, Salar Fattahi, Wei Hu
This paper presents a simple mean estimator that overcomes both challenges under moderate conditions: it runs in near-linear time and memory (both with respect to the ambient dimension) while requiring only $\tilde O(k)$ samples to recover the true mean.
no code implementations • 21 Feb 2023 • Jianhao Ma, Salar Fattahi
In matrix completion, even with slight rank overestimation and mild noise, true solutions either emerge as non-critical or strict saddle points.
1 code implementation • 1 Oct 2022 • Jianhao Ma, Lingjun Guo, Salar Fattahi
This work analyzes the solution trajectory of gradient-based algorithms via a novel basis function decomposition.
no code implementations • 15 Jul 2022 • Jianhao Ma, Salar Fattahi
This work characterizes the effect of depth on the optimization landscape of linear regression, showing that, despite their nonconvexity, deeper models have a more desirable optimization landscape.
no code implementations • 17 Feb 2022 • Jianhao Ma, Salar Fattahi
We prove that a simple subgradient method (SubGM) with small initialization is agnostic to both over-parameterization and noise in the measurements.
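The excerpt does not specify the loss or measurement model, so the following is a hedged sketch of one standard instantiation of such a subgradient method: an $\ell_1$ loss over Gaussian linear measurements of $\mathrm{U}\mathrm{U}^\top$, with a fraction of grossly corrupted observations, an over-parameterized factor, and a small random initialization. All dimensions, the corruption rate, and the geometrically decaying step size are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(1)
d, r, r_over, m = 10, 1, 3, 400  # hypothetical sizes; r' = 3 > r = 1

U_star = rng.standard_normal((d, r))
X_star = U_star @ U_star.T

# Gaussian sensing matrices; 20% of measurements grossly corrupted (assumed model)
A = rng.standard_normal((m, d, d))
y = np.einsum('mij,ij->m', A, X_star)
outliers = rng.random(m) < 0.2
y[outliers] += 10 * rng.standard_normal(int(outliers.sum()))

# small initialization, over-parameterized factor
U = 1e-3 * rng.standard_normal((d, r_over))

for t in range(3000):
    res = np.einsum('mij,ij->m', A, U @ U.T) - y
    # subgradient of the l1 loss (1/m) * sum_i |<A_i, U U^T> - y_i|
    G = np.einsum('m,mij->ij', np.sign(res), A) / m
    step = 0.05 * (0.999 ** t)  # geometrically decaying step (illustrative schedule)
    U -= step * (G + G.T) @ U

err = np.linalg.norm(U @ U.T - X_star) / np.linalg.norm(X_star)
```

The sign-based subgradient discounts the corrupted measurements, so the iterates still recover $\mathrm{X}^\star$ to small relative error despite both the noise and the rank overestimation.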
no code implementations • ICLR 2022 • Jiaye Teng, Jianhao Ma, Yang Yuan
Generalization is one of the fundamental issues in machine learning.
no code implementations • 5 Feb 2021 • Jianhao Ma, Salar Fattahi
The restricted isometry property (RIP), which essentially states that the linear measurements are approximately norm-preserving, plays a crucial role in the study of the low-rank matrix recovery problem.
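The norm-preservation in the RIP can be checked empirically: for a Gaussian measurement operator $\mathcal{A}(\mathrm{X})_i = \langle \mathrm{A}_i, \mathrm{X}\rangle/\sqrt{m}$, the ratio $\|\mathcal{A}(\mathrm{X})\|_2^2 / \|\mathrm{X}\|_F^2$ concentrates near 1 for random low-rank matrices. The dimensions below are arbitrary illustrative choices, and this samples random test matrices rather than certifying the RIP over all rank-$r$ matrices.

```python
import numpy as np

rng = np.random.default_rng(2)
d, r, m = 15, 2, 600  # hypothetical: m measurements of d x d rank-r matrices

# Linear map A(X)_i = <A_i, X> / sqrt(m) with i.i.d. Gaussian A_i
A = rng.standard_normal((m, d, d)) / np.sqrt(m)

ratios = []
for _ in range(50):
    # random rank-r test matrix X = L R^T
    L = rng.standard_normal((d, r))
    R = rng.standard_normal((d, r))
    X = L @ R.T
    AX = np.einsum('mij,ij->m', A, X)
    ratios.append(np.linalg.norm(AX) ** 2 / np.linalg.norm(X) ** 2)

# empirical deviation from exact isometry over this sample of rank-r matrices
delta = float(np.max(np.abs(np.array(ratios) - 1.0)))
```

With $m$ sufficiently large relative to $dr$, every sampled ratio lands close to 1, which is the approximate norm-preservation the RIP formalizes.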