Fast Rates for Regularized Objectives

We show that the empirical minimizer of a stochastic strongly convex objective, where the stochastic component is linear, converges to the population minimizer with rate $O(1/n)$. The result applies, in particular, to the SVM objective. We thus obtain a rate of $O(1/n)$ on the convergence of the SVM objective to its infinite-data limit, and we demonstrate how this is essential for obtaining tight oracle inequalities for SVMs. The results also extend to strong convexity with respect to other $\ell_p$ norms, and thus to objectives regularized using other norms.
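To make the setting concrete, here is a hedged illustration in the SVM case; the notation $\hat{w}$, $w^\star$, and the specific hinge-loss form below are chosen for exposition and are not taken verbatim from the paper. Writing the population objective and its empirical counterpart as

$$F(w) = \mathbb{E}_{(x,y)}\big[\max(0,\, 1 - y\langle w, x\rangle)\big] + \frac{\lambda}{2}\|w\|^2, \qquad \hat{F}(w) = \frac{1}{n}\sum_{i=1}^{n}\max(0,\, 1 - y_i\langle w, x_i\rangle) + \frac{\lambda}{2}\|w\|^2,$$

the claim is that the empirical minimizer $\hat{w} = \arg\min_w \hat{F}(w)$ satisfies $F(\hat{w}) - F(w^\star) = O(1/n)$, where $w^\star = \arg\min_w F(w)$ is the population minimizer, improving on the generic $O(1/\sqrt{n})$ rate available without strong convexity of the regularized objective.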
