no code implementations • 22 Mar 2022 • Souvik Dhara, Julia Gaudio, Elchanan Mossel, Colin Sandon
Spectral algorithms are an important building block in machine learning and graph algorithms.
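As a rough illustration of the kind of spectral primitive meant here (a toy sketch of my own, not the algorithm studied in the paper): one can partition a graph into two groups by the sign pattern of the second eigenvector of its adjacency matrix.

```python
# Toy spectral bisection sketch (illustrative only): split the nodes of a
# symmetric adjacency matrix by the sign of its second-largest eigenvector.
import numpy as np

def spectral_bisection(A):
    eigvals, eigvecs = np.linalg.eigh(A)   # eigenvalues in ascending order
    v2 = eigvecs[:, -2]                    # eigenvector of the second-largest eigenvalue
    return (v2 >= 0).astype(int)           # group assignment by sign
```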
no code implementations • NeurIPS 2021 • Emmanuel Abbe, Pritish Kamath, Eran Malach, Colin Sandon, Nathan Srebro
With fine enough precision relative to minibatch size, namely when $b \rho$ is small enough, SGD can go beyond SQ learning and simulate any sample-based learning algorithm and thus its learning power is equivalent to that of PAC learning; this extends prior work that achieved this result for $b=1$.
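A minimal sketch of the setting described above, under my own assumptions about the update rule (the helper `grad_fn` is hypothetical): minibatch SGD of batch size $b$ whose parameter update is quantized to precision $\rho$, so the relevant resolution of a step is governed by $b\rho$.

```python
# Sketch (not the paper's construction): one SGD step on a minibatch of size b,
# with the update rounded to the nearest multiple of the precision rho.
import numpy as np

def quantized_minibatch_sgd_step(w, grad_fn, batch, lr, rho):
    g = np.mean([grad_fn(w, x, y) for x, y in batch], axis=0)  # average gradient over the minibatch
    step = lr * g
    return w - rho * np.round(step / rho)                      # quantize the update to precision rho
```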
no code implementations • 15 Jun 2021 • Ankur Moitra, Elchanan Mossel, Colin Sandon
In this work, we study the computational complexity of determining whether a machine learning model that perfectly fits the training data will generalize to unseen data.
no code implementations • 15 Jan 2021 • Ankur Moitra, Elchanan Mossel, Colin Sandon
These are Markov Random Fields where some of the nodes are censored (not observed).
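For concreteness, here is a toy example of such a censored Markov Random Field (my own illustration, with an arbitrary coupling strength): an Ising-type model on a small path graph where two nodes are observed and two are censored, with the posterior of one censored node computed by brute-force enumeration.

```python
# Toy censored MRF: Ising-type model on a 4-node path; nodes 0 and 3 are observed,
# nodes 1 and 2 are censored. We enumerate the censored spins to get a posterior.
import itertools
import numpy as np

edges = [(0, 1), (1, 2), (2, 3)]   # path graph
beta = 0.8                          # coupling strength (assumed value)
observed = {0: +1, 3: -1}

def weight(spins):
    return np.exp(beta * sum(spins[i] * spins[j] for i, j in edges))

post = {+1: 0.0, -1: 0.0}
for s1, s2 in itertools.product([+1, -1], repeat=2):   # all censored configurations
    spins = {0: observed[0], 1: s1, 2: s2, 3: observed[3]}
    post[s1] += weight(spins)

print("P(node 1 = +1 | observed) =", post[+1] / (post[+1] + post[-1]))
```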
no code implementations • NeurIPS 2020 • Emmanuel Abbe, Colin Sandon
This paper shows that deep learning, i.e., neural networks trained by SGD, can learn in polytime any function class that can be learned in polytime by some algorithm, including parities.
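For reference, here is a sketch of the parity learning task mentioned above (data generation only; the paper's actual SGD-based construction is far more involved and is not reproduced here).

```python
# Parity task sketch: labels are the parity of a hidden subset of input bits.
import numpy as np

def parity_dataset(n_samples, n_bits, support, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.integers(0, 2, size=(n_samples, n_bits))
    y = X[:, support].sum(axis=1) % 2     # parity over the hidden support
    return X, y

X, y = parity_dataset(1000, 20, support=[1, 4, 7])
```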
no code implementations • 7 Jan 2020 • Emmanuel Abbe, Colin Sandon
Deep learning therefore provides a universal learning paradigm: it was known that the approximation and estimation errors could be controlled with poly-size neural nets using ERM, which is NP-hard; this new result shows that the optimization error can also be controlled with SGD in poly-time.
no code implementations • 16 Dec 2018 • Emmanuel Abbe, Colin Sandon
As the success of deep learning extends to more domains, one would also like to envision its potential limits.
no code implementations • NeurIPS 2016 • Emmanuel Abbe, Colin Sandon
The stochastic block model (SBM) has long been studied in machine learning and network science as a canonical model for clustering and community detection.
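For context, a two-community symmetric SBM can be sampled in a few lines (a hedged sketch with the standard $a/n$ inside-community and $b/n$ across-community edge probabilities; the parameter values are up to the user, and the communities here are only approximately balanced).

```python
# Sample a symmetric two-community SBM(n, a/n, b/n).
import numpy as np

def sample_sbm(n, a, b, seed=0):
    rng = np.random.default_rng(seed)
    labels = rng.integers(0, 2, size=n)                  # community assignment
    same = labels[:, None] == labels[None, :]
    p = np.where(same, a / n, b / n)                     # a/n inside, b/n across
    A = (rng.random((n, n)) < p).astype(int)
    A = np.triu(A, 1)                                    # keep upper triangle, no self-loops
    return A + A.T, labels
```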
no code implementations • 30 Dec 2015 • Emmanuel Abbe, Colin Sandon
In a paper that initiated the modern study of the stochastic block model, Decelle et al., backed by Mossel et al., made the following conjecture: Denote by $k$ the number of balanced communities, $a/n$ the probability of connecting inside communities and $b/n$ across, and set $\mathrm{SNR}=(a-b)^2/(k(a+(k-1)b))$; for any $k \geq 2$, it is possible to detect communities efficiently whenever $\mathrm{SNR}>1$ (the KS threshold), whereas for $k\geq 4$, it is possible to detect communities information-theoretically for some $\mathrm{SNR}<1$.
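To make the threshold concrete, here is a small numerical illustration of $\mathrm{SNR}=(a-b)^2/(k(a+(k-1)b))$ with arbitrary parameter values of my own choosing (not taken from the paper).

```python
# Kesten-Stigum threshold check: SNR = (a - b)^2 / (k * (a + (k - 1) * b)).
def snr(k, a, b):
    return (a - b) ** 2 / (k * (a + (k - 1) * b))

print(snr(2, 6, 2))   # 16 / 16 = 1.0, exactly at the KS threshold
print(snr(2, 7, 2))   # 25 / 18 > 1, efficient detection is possible
```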
no code implementations • NeurIPS 2015 • Emmanuel Abbe, Colin Sandon
Most recent developments on the stochastic block model (SBM) rely on the knowledge of the model parameters, or at least on the number of communities.