no code implementations • NeurIPS 2021 • Emmanuel Abbe, Enric Boix-Adsera, Matthew Brennan, Guy Bresler, Dheeraj Nagaraj
This paper identifies a structural property of data distributions that enables deep neural networks to learn hierarchically.
no code implementations • 13 Sep 2020 • Matthew Brennan, Guy Bresler, Samuel B. Hopkins, Jerry Li, Tselil Schramm
Researchers currently use a number of approaches to predict and substantiate information-computation gaps in high-dimensional statistical estimation problems.
no code implementations • 16 May 2020 • Matthew Brennan, Guy Bresler
Inference problems with conjectured statistical-computational gaps are ubiquitous throughout modern statistics, computer science and statistical physics.
no code implementations • 8 Aug 2019 • Matthew Brennan, Guy Bresler
This paper develops several average-case reduction techniques to show new hardness results for three central high-dimensional statistics problems. These results imply a statistical-computational gap induced by robustness, a detection-recovery gap, and a universality principle for these gaps.
no code implementations • 20 Feb 2019 • Matthew Brennan, Guy Bresler
We also show the surprising result that weaker forms of the planted clique (PC) conjecture, up to clique size $K = o(N^\alpha)$ for any given $\alpha \in (0, 1/2]$, imply tight computational lower bounds for sparse PCA at sparsities $k = o(n^{\alpha/3})$.
no code implementations • 19 Feb 2019 • Matthew Brennan, Guy Bresler, Wasim Huleihel
In the general submatrix detection problem, the task is to detect the presence of a small $k \times k$ submatrix with entries sampled from a distribution $\mathcal{P}$ in an $n \times n$ matrix of samples from $\mathcal{Q}$.
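The definition above can be instantiated directly. The sketch below is a minimal illustration, not the paper's construction: it assumes the classical Gaussian special case $\mathcal{Q} = N(0,1)$ and $\mathcal{P} = N(\mu, 1)$ (the paper treats general distribution pairs), and pairs it with a naive sum-based detector for concreteness; the function names are hypothetical.

```python
import numpy as np

def sample_instance(n, k, mu=2.0, planted=True, rng=None):
    """Sample an n x n matrix with i.i.d. Q = N(0,1) entries; if
    `planted`, shift a hidden k x k submatrix (random row and
    column supports) so its entries are P = N(mu, 1)."""
    rng = np.random.default_rng(rng)
    M = rng.standard_normal((n, n))
    if planted:
        rows = rng.choice(n, size=k, replace=False)
        cols = rng.choice(n, size=k, replace=False)
        M[np.ix_(rows, cols)] += mu  # planted block gets mean mu
    return M

def sum_test(M, threshold):
    """Crude detector: declare 'planted' when the total entry sum
    exceeds a threshold. Under the null the sum is N(0, n^2),
    while the planted block adds about mu * k^2 in expectation."""
    return M.sum() > threshold
```

With $n = 200$, $k = 30$, $\mu = 2$, the planted block shifts the sum by about $1800$ while the null fluctuation has standard deviation $n = 200$, so even this crude statistic separates the two hypotheses; the interesting regimes in the paper are those where such simple tests fail.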