no code implementations • 18 Mar 2024 • Hongjie Chen, Jingqiu Ding, Tommaso d'Orsi, Yiding Hua, Chih-Hung Liu, David Steurer
We develop the first pure node-differentially-private algorithms for learning stochastic block models and for graphon estimation with polynomial running time for any constant number of blocks.
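To fix ideas about the model being learned, here is a minimal sketch of sampling from a $k$-block stochastic block model; the block count, connection probabilities, and helper names are illustrative assumptions, not part of the paper, whose contribution is the private learning algorithm itself.

```python
import numpy as np

def sample_sbm(n, B, pi, rng):
    """Sample an n-vertex stochastic block model.

    B  : (k, k) symmetric matrix of edge probabilities between blocks.
    pi : length-k vector of block membership probabilities.
    """
    k = len(pi)
    labels = rng.choice(k, size=n, p=pi)          # block label of each vertex
    P = B[labels][:, labels]                      # per-pair edge probabilities
    upper = np.triu(rng.random((n, n)) < P, 1)    # sample upper triangle once
    A = (upper | upper.T).astype(np.int8)         # symmetrize, no self-loops
    return A, labels

rng = np.random.default_rng(0)
B = np.array([[0.6, 0.1, 0.1],
              [0.1, 0.6, 0.1],
              [0.1, 0.1, 0.6]])                   # 3 assortative blocks (illustrative)
A, labels = sample_sbm(300, B, pi=np.full(3, 1 / 3), rng=rng)
```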
no code implementations • 21 Feb 2024 • Rares-Darius Buhai, Jingqiu Ding, Stefan Tiegel
In particular, we show that an improper learning algorithm for sparse linear regression can be used to solve sparse PCA problems (with a negative spike) in their Wishart form, in regimes in which efficient algorithms are widely believed to require at least $\Omega(k^2)$ samples.
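The sketch below is an illustrative instance, not the paper's construction: it samples the negative-spike Wishart model $x \sim N(0, I_d - \beta vv^\top)$ with a $k$-sparse unit spike $v$. Because the precision matrix is $\Sigma^{-1} = I + \frac{\beta}{1-\beta} vv^\top$, regressing one coordinate of $x$ on the rest has a $k$-sparse true coefficient vector, which is one standard way such samples become a sparse linear regression instance.

```python
import numpy as np

rng = np.random.default_rng(1)
d, k, beta, m = 200, 10, 0.9, 5000

# k-sparse unit spike v (illustrative choice of support and signs)
v = np.zeros(d)
support = rng.choice(d, size=k, replace=False)
v[support] = rng.choice([-1.0, 1.0], size=k) / np.sqrt(k)

# Samples from N(0, I - beta * v v^T): x = (I - alpha * v v^T) g with
# alpha = 1 - sqrt(1 - beta) gives exactly this covariance.
g = rng.standard_normal((m, d))
x = g - (1 - np.sqrt(1 - beta)) * (g @ v)[:, None] * v[None, :]

# Regress a support coordinate j on the rest: the true coefficients
# w_i = -c * v[j] * v[i] / (1 + c * v[j]**2), with c = beta / (1 - beta),
# vanish outside the support of v, so the instance is k-sparse.
j = support[0]
c = beta / (1 - beta)
w_true = -c * v[j] * np.delete(v, j) / (1 + c * v[j] ** 2)
print("nonzeros in true regression vector:", np.count_nonzero(w_true))  # k - 1
```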
no code implementations • 17 May 2023 • Jingqiu Ding, Tommaso d'Orsi, Yiding Hua, David Steurer
We study robust community detection in the node-corrupted stochastic block model, where an adversary can arbitrarily modify all edges incident to a fraction of the $n$ vertices.
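A minimal illustration of this corruption model follows; the random rewiring used here is an arbitrary stand-in, since the model allows any modification of the incident edges.

```python
import numpy as np

def corrupt_nodes(A, eps, rng):
    """Adversarially rewrite all edges incident to an eps-fraction of nodes.

    This toy 'adversary' simply rewires those rows/columns at random;
    the node-corruption model permits arbitrary changes there.
    """
    n = A.shape[0]
    bad = rng.choice(n, size=int(eps * n), replace=False)
    A = A.copy()
    for u in bad:
        new_row = (rng.random(n) < 0.5).astype(A.dtype)  # arbitrary new edges
        new_row[u] = 0                                    # no self-loop
        A[u, :] = new_row
        A[:, u] = new_row                                 # keep A symmetric
    return A, bad
```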
no code implementations • 26 Jan 2023 • Jingqiu Ding, Yiding Hua
Consider a vector planted in an otherwise random subspace: a classical question is how to recover this planted vector given a random basis of the subspace.
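Here is a sketch of that setup under illustrative assumptions (a sparse Rademacher-type planted vector and Gaussian directions spanning the rest of the subspace).

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 1000, 20

# Planted vector: sparse Rademacher-style (a common choice in this literature).
theta = 0.05                                   # sparsity level (illustrative)
v = rng.choice([-1.0, 0.0, 1.0], size=n, p=[theta / 2, 1 - theta, theta / 2])
v /= np.linalg.norm(v)

# Subspace = span(v, g_1, ..., g_{d-1}) with Gaussian g_i; mixing the basis
# by a random invertible matrix hides v, so it is not visible as a column.
W = np.column_stack([v] + [rng.standard_normal(n) for _ in range(d - 1)])
B = W @ rng.standard_normal((d, d))            # random basis of the subspace

# Algorithmic task: given only B, recover (something correlated with) v.
```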
no code implementations • 14 Feb 2022 • Jingqiu Ding, Tommaso d'Orsi, Chih-Hung Liu, Stefan Tiegel, David Steurer
We develop the first fast spectral algorithm to decompose a random third-order tensor over $\mathbb{R}^d$ of rank up to $O(d^{3/2}/\text{polylog}(d))$.
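For concreteness, here is a sketch of the input model, assuming the common symmetric form $T = \sum_{i=1}^{r} a_i^{\otimes 3}$ with i.i.d. Gaussian components; note that $r$ may exceed $d$, the overcomplete regime the result addresses.

```python
import numpy as np

rng = np.random.default_rng(3)
d, r = 20, 60                                  # r > d: overcomplete regime

# Random rank-r symmetric third-order tensor T = sum_i a_i ⊗ a_i ⊗ a_i
# with i.i.d. Gaussian components a_i (a standard input model).
A = rng.standard_normal((r, d))
T = np.einsum('ri,rj,rk->ijk', A, A, A)        # shape (d, d, d)

# Decomposition task: given T alone, recover the components a_i
# (up to permutation and sign).
print(T.shape)
```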
no code implementations • 16 Nov 2021 • Jingqiu Ding, Tommaso d'Orsi, Rajai Nasser, David Steurer
We develop an efficient algorithm for weak recovery in a robust version of the stochastic block model.
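For intuition only, the following is a textbook spectral baseline in the dense, uncorrupted two-block case, not the paper's robust algorithm: the eigenvector for the second-largest eigenvalue of the adjacency matrix already achieves weak recovery in this easy regime.

```python
import numpy as np

rng = np.random.default_rng(5)
n, p, q = 600, 0.6, 0.4                        # dense two-block SBM (illustrative)

labels = rng.integers(0, 2, size=n)            # hidden communities
P = np.where(labels[:, None] == labels[None, :], p, q)
upper = np.triu(rng.random((n, n)) < P, 1)
A = (upper | upper.T).astype(float)

# Spectral baseline: the eigenvector of the second-largest eigenvalue
# of A is correlated with the community structure.
eigvals, eigvecs = np.linalg.eigh(A)
guess = (eigvecs[:, -2] > 0).astype(int)       # sign split of 2nd eigenvector

agree = np.mean(guess == labels)
print(f"agreement: {max(agree, 1 - agree):.3f}")   # well above 1/2
```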
no code implementations • NeurIPS 2020 • Jingqiu Ding, Samuel B. Hopkins, David Steurer
For the case of Gaussian noise, the top eigenvector of the given matrix is a widely studied estimator known to achieve optimal statistical guarantees, e.g., in the sense of the celebrated BBP phase transition.
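A quick numerical sketch of this phenomenon, where the normalization $M = \lambda vv^\top + W/\sqrt{n}$ is one common convention chosen here for illustration: the top eigenvector of $M$ becomes correlated with the spike $v$ exactly when $\lambda$ crosses the BBP threshold $\lambda = 1$.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 800
v = rng.standard_normal(n)
v /= np.linalg.norm(v)                         # unit-norm spike direction

for lam in [0.5, 1.0, 2.0]:
    G = rng.standard_normal((n, n))
    W = (G + G.T) / np.sqrt(2)                 # Wigner: symmetric Gaussian noise
    M = lam * np.outer(v, v) + W / np.sqrt(n)
    eigvals, eigvecs = np.linalg.eigh(M)
    top = eigvecs[:, -1]                       # eigenvector of largest eigenvalue
    print(f"lambda={lam}: |<top, v>| = {abs(top @ v):.3f}")
```

Below the threshold the printed correlation is near zero; above it (e.g., $\lambda = 2$) it approaches $\sqrt{1 - 1/\lambda^2}$, matching the BBP prediction.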