no code implementations • 25 Sep 2022 • Vinith M. Suriyakumar, Ashia C. Wilson
We study the problem of deleting user data from machine learning models trained using empirical risk minimization.
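To make the problem setting concrete, here is a minimal sketch (not the paper's method): under empirical risk minimization, deleting a user's data ideally means the model matches one retrained from scratch without that user's points. Ridge regression is used only because its ERM solution has a closed form; the function name, regularization strength, and removed indices are all illustrative.

```python
import numpy as np

def erm_ridge(X, y, lam=0.1):
    """Closed-form minimizer of (1/n) * sum (x_i^T w - y_i)^2 + lam * ||w||^2."""
    n, d = X.shape
    return np.linalg.solve(X.T @ X / n + lam * np.eye(d), X.T @ y / n)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=100)

w_full = erm_ridge(X, y)

# Exact-deletion baseline: retrain on everything except the removed rows.
remove = [3, 17, 42]                       # hypothetical user's data points
keep = np.setdiff1d(np.arange(len(y)), remove)
w_deleted = erm_ridge(X[keep], y[keep])

print(np.linalg.norm(w_full - w_deleted))  # how far deletion moved the model
```

Exact retraining is the gold standard that approximate deletion methods are measured against, which is why it makes a useful baseline in a sketch like this.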
3 code implementations • NeurIPS 2017 • Ashia C. Wilson, Rebecca Roelofs, Mitchell Stern, Nathan Srebro, Benjamin Recht
Adaptive optimization methods, which perform local optimization with a metric constructed from the history of iterates, are becoming increasingly popular for training deep neural networks.
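As a hedged illustration of "a metric constructed from the history of iterates," the sketch below contrasts plain SGD with an AdaGrad-style update that rescales each coordinate by accumulated squared gradients. The toy quadratic and step sizes are invented for this example and are not from the paper.

```python
import numpy as np

def grad(w):                      # gradient of f(w) = 0.5 * w^T diag([100, 1]) w
    return np.array([100.0, 1.0]) * w

w_sgd = np.array([1.0, 1.0])
w_ada = np.array([1.0, 1.0])
hist = np.zeros(2)                # accumulated squared gradients (the "history")

for _ in range(200):
    g = grad(w_sgd)
    w_sgd -= 0.009 * g            # fixed step size: same metric in every direction

    g = grad(w_ada)
    hist += g ** 2
    w_ada -= 0.5 * g / (np.sqrt(hist) + 1e-8)   # per-coordinate metric from history

print("SGD:    ", w_sgd)
print("AdaGrad:", w_ada)
```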
no code implementations • 14 Mar 2016 • Andre Wibisono, Ashia C. Wilson, Michael I. Jordan
We show that there is a Lagrangian functional that we call the Bregman Lagrangian which generates a large class of accelerated methods in continuous time, including (but not limited to) accelerated gradient descent, its non-Euclidean extension, and accelerated higher-order gradient methods.
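For readers unfamiliar with the construction, the following is a sketch of the functional's general form as it appears in the continuous-time acceleration literature (notation may differ from the paper): a Bregman divergence between the current point and a velocity-shifted point is weighed against the objective f, with time-dependent scalings alpha_t, beta_t, gamma_t.

```latex
\[
  \mathcal{L}(X, \dot X, t)
  = e^{\alpha_t + \gamma_t}\Big( D_h\big(X + e^{-\alpha_t}\dot X,\; X\big) - e^{\beta_t} f(X) \Big),
  \qquad
  D_h(y, x) = h(y) - h(x) - \langle \nabla h(x),\, y - x \rangle .
\]
```

Different choices of the scaling functions recover different accelerated dynamics, which is what makes a single Lagrangian able to generate the whole family.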
2 code implementations • NeurIPS 2013 • Tamara Broderick, Nicholas Boyd, Andre Wibisono, Ashia C. Wilson, Michael I. Jordan
We present SDA-Bayes, a framework for (S)treaming, (D)istributed, (A)synchronous computation of a Bayesian posterior.
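As a conceptual sketch only (not the SDA-Bayes algorithm itself): in streaming Bayesian updating, the posterior after one minibatch becomes the prior for the next, so data can be absorbed incrementally as it arrives. A conjugate Beta-Bernoulli model keeps the example exact; the batch sizes and variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, beta = 1.0, 1.0                      # Beta(1, 1) prior on a coin's bias

for _ in range(10):                         # ten minibatches arriving over time
    batch = rng.binomial(1, 0.7, size=50)   # 50 flips of a coin with true bias 0.7
    alpha += batch.sum()                    # conjugate update: add successes
    beta += len(batch) - batch.sum()        #                   add failures

print("posterior mean:", alpha / (alpha + beta))   # close to 0.7
```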