no code implementations • 16 Jun 2022 • Margalit Glasgow, Colin Wei, Mary Wootters, Tengyu Ma
Nagarajan and Kolter (2019) show that in certain simple linear and neural-network settings, any uniform convergence (UC) bound will be vacuous, leaving open the question of how to prove generalization in settings where UC fails.
no code implementations • 22 Sep 2020 • Margalit Glasgow, Mary Wootters
This complexity sits squarely between the complexity $\tilde{O}\left(\left(n + \kappa\right)\log(1/\epsilon)\right)$ of SAGA \textit{without delays} and the complexity $\tilde{O}\left(\left(n + m\kappa\right)\log(1/\epsilon)\right)$ of parallel asynchronous algorithms where the delays are \textit{arbitrary} (but bounded by $O(m)$), and the data is accessible by all.
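The sandwiching of these rates can be illustrated numerically; a minimal sketch (constants and log factors hidden by the $\tilde{O}$ notation are suppressed, so the numbers are only indicative, and the parameter values are made up for illustration):

```python
import math

def saga_no_delay(n, kappa, eps):
    # ~ (n + kappa) * log(1/eps): serial SAGA without delays
    return (n + kappa) * math.log(1 / eps)

def async_arbitrary_delay(n, kappa, m, eps):
    # ~ (n + m*kappa) * log(1/eps): arbitrary delays bounded by O(m)
    return (n + m * kappa) * math.log(1 / eps)

# Illustrative values: n samples, condition number kappa, m machines
n, kappa, m, eps = 10_000, 100, 16, 1e-6
lower = saga_no_delay(n, kappa, eps)
upper = async_arbitrary_delay(n, kappa, m, eps)
# The intermediate complexity discussed above lies between these two extremes.
```

The gap between `lower` and `upper` grows with the number of machines `m`, which is exactly the regime where an intermediate delay-dependent rate is informative.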
no code implementations • 17 Jun 2020 • Margalit Glasgow, Mary Wootters
Recent work has studied approximate gradient coding, which concerns coding schemes where the replication factor of the data is too low to recover the full gradient exactly.
2 code implementations • 23 Jun 2015 • Moritz Hardt, Nimrod Megiddo, Christos Papadimitriou, Mary Wootters
Jury designs a classifier, and Contestant receives an input to the classifier, which he may change at some cost.
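A minimal sketch of the Contestant's side of this game, for a one-dimensional threshold classifier with a linear moving cost (the model parameters here are illustrative, not taken from the paper): the Contestant moves to the threshold exactly when the cost of doing so is less than the benefit of being classified positive.

```python
def best_response(x, threshold, cost_per_unit, benefit=1.0):
    """Contestant's best response to a threshold classifier.

    Moving from x up to the threshold costs cost_per_unit * distance;
    being classified positive is worth `benefit`. Toy model only.
    """
    if x >= threshold:
        return x  # already classified positive, no reason to move
    move_cost = cost_per_unit * (threshold - x)
    return threshold if move_cost < benefit else x
```

Jury, anticipating this best response, may then want to set the threshold higher than it would against non-strategic inputs, which is the strategic-classification tension the paper studies.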
no code implementations • 15 Jul 2014 • Moritz Hardt, Mary Wootters
We give the first algorithm for Matrix Completion whose running time and sample complexity are polynomial in the rank of the unknown target matrix, linear in the dimension of the matrix, and logarithmic in the condition number of the matrix.
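To make the problem concrete, here is a standard alternating-minimization sketch of low-rank matrix completion on a small synthetic instance. This is a generic baseline for illustration, not the algorithm from the paper, and it carries no guarantee about dependence on the condition number.

```python
import numpy as np

rng = np.random.default_rng(1)
d, r = 30, 2
M = rng.normal(size=(d, r)) @ rng.normal(size=(r, d))  # rank-r target
mask = rng.random((d, d)) < 0.5                         # observed entries

U = rng.normal(size=(d, r))
V = rng.normal(size=(d, r))
for _ in range(50):
    # Alternate row-wise least squares over the observed entries
    # (small ridge term added for numerical stability).
    for i in range(d):
        cols = mask[i]
        A = V[cols]
        U[i] = np.linalg.solve(A.T @ A + 1e-8 * np.eye(r), A.T @ M[i, cols])
    for j in range(d):
        rows = mask[:, j]
        A = U[rows]
        V[j] = np.linalg.solve(A.T @ A + 1e-8 * np.eye(r), A.T @ M[rows, j])

rel_err = np.linalg.norm(U @ V.T - M) / np.linalg.norm(M)
```

On well-conditioned random instances like this one, the relative error becomes small after a few dozen sweeps; the paper's contribution is an algorithm whose cost degrades only logarithmically as the target matrix becomes ill-conditioned.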