Search Results for author: Michael Friedlander

Found 3 papers, 0 papers with code

Fast convergence of stochastic subgradient method under interpolation

no code implementations • ICLR 2021 • Huang Fang, Zhenan Fan, Michael Friedlander

We prove that the stochastic subgradient method (SSGD) converges at rates $O(1/\epsilon)$ and $O(\log(1/\epsilon))$ for convex and strongly convex objectives, respectively, when interpolation holds.
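
For context, a minimal NumPy sketch of a stochastic subgradient iteration on an interpolated problem; the `ssgd` helper, its constant `step_size`, and the toy least-squares setup are illustrative assumptions, not the paper's exact algorithm or experiments.

```python
import numpy as np

def ssgd(subgrad_i, x0, n_samples, step_size, n_iters, seed=0):
    # Stochastic subgradient method: at each iteration, sample one component
    # function f_i uniformly at random and step along a subgradient of f_i.
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(n_iters):
        i = rng.integers(n_samples)          # random component index
        x -= step_size * subgrad_i(i, x)     # subgradient step on f_i
    return x

# Toy interpolated least-squares problem: every component is minimized at the
# same point x_star, so all residuals vanish there (purely for illustration).
A = np.array([[2.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
x_star = np.array([1.0, -2.0])
b = A @ x_star
subgrad = lambda i, x: (A[i] @ x - b[i]) * A[i]
print(ssgd(subgrad, np.zeros(2), n_samples=3, step_size=0.3, n_iters=2000))
```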

Satisfying Real-world Goals with Dataset Constraints

no code implementations • NeurIPS 2016 • Gabriel Goh, Andrew Cotter, Maya Gupta, Michael Friedlander

The goal of minimizing misclassification error on a training set is often just one of several real-world goals that might be defined on different datasets.

Fairness

Coordinate Descent Converges Faster with the Gauss-Southwell Rule Than Random Selection

no code implementations • 1 Jun 2015 • Julie Nutini, Mark Schmidt, Issam H. Laradji, Michael Friedlander, Hoyt Koepke

There has been significant recent work on the theory and application of randomized coordinate descent algorithms, beginning with the work of Nesterov [SIAM J. Optim., 22(2), 2012].
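
For context, a minimal NumPy sketch contrasting the Gauss-Southwell rule (greedily update the coordinate with the largest-magnitude partial derivative) with uniform random coordinate selection; the `coordinate_descent` helper, the coordinate-wise step `1/L_j`, and the toy quadratic are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def coordinate_descent(grad, lipschitz, x0, n_iters, rule="gauss-southwell", seed=0):
    # Coordinate descent with either the greedy Gauss-Southwell rule or
    # uniform random coordinate selection.
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(n_iters):
        g = grad(x)
        if rule == "gauss-southwell":
            j = int(np.argmax(np.abs(g)))    # greedy: steepest coordinate
        else:
            j = int(rng.integers(x.size))    # uniform random coordinate
        x[j] -= g[j] / lipschitz[j]          # step with coordinate-wise constant L_j
    return x

# Illustration on a quadratic f(x) = 0.5 * x^T Q x - c^T x with diagonal Q,
# where the coordinate-wise Lipschitz constants are the diagonal entries.
Q = np.diag([10.0, 1.0, 0.1])
c = np.array([1.0, 1.0, 1.0])
grad = lambda x: Q @ x - c
L = np.diag(Q)
print(coordinate_descent(grad, L, np.zeros(3), n_iters=50, rule="gauss-southwell"))
print(coordinate_descent(grad, L, np.zeros(3), n_iters=50, rule="random"))
```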
