Improved SVRG for quadratic functions

1 Jun 2020 · Nabil Kahale

We analyse an iterative algorithm for minimizing quadratic functions whose Hessian matrix $H$ is the expectation of a random symmetric $d\times d$ matrix. The algorithm is a variant of the stochastic variance reduced gradient (SVRG) method. In several applications, including least-squares regression, ridge regression, linear discriminant analysis and regularized linear discriminant analysis, the running time of each iteration is proportional to $d$. Under smoothness and convexity conditions, the algorithm achieves linear convergence. When applied to quadratic functions, our analysis improves the state-of-the-art performance of SVRG by up to a logarithmic factor. Furthermore, for well-conditioned quadratic problems, our analysis improves the state-of-the-art running times of accelerated SVRG, and is better than the known matching lower bound, by a logarithmic factor. Our theoretical results are backed by numerical experiments.
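
For context, the sketch below shows a plain (non-improved) SVRG baseline on a least-squares instance, where the Hessian $H = \frac{1}{n} A^\top A$ is the expectation of the random rank-one matrices $a_i a_i^\top$ over a uniformly chosen sample index, and each per-sample gradient costs $O(d)$ as noted in the abstract. This is not the paper's improved variant; the function name, step size and loop lengths are illustrative assumptions only.

```python
# Minimal sketch of vanilla SVRG on f(x) = (1/2n) * ||A x - b||^2.
# The Hessian H = (1/n) A^T A is the expectation of the rank-one
# matrices a_i a_i^T over a uniformly random sample index i.
# Baseline for illustration only, not the paper's improved algorithm.
import numpy as np

def svrg_least_squares(A, b, step=0.01, outer=20, inner=None, seed=0):
    """Minimize (1/2n)||Ax - b||^2 with plain SVRG (illustrative parameters)."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    if inner is None:
        inner = 2 * n  # a common choice for the inner-loop length
    x_snapshot = np.zeros(d)
    for _ in range(outer):
        # Full gradient at the snapshot: (1/n) A^T (A x - b), computed once per epoch.
        full_grad = A.T @ (A @ x_snapshot - b) / n
        x = x_snapshot.copy()
        for _ in range(inner):
            i = rng.integers(n)
            a_i = A[i]
            # Per-sample gradients cost O(d).
            g_i_x = a_i * (a_i @ x - b[i])
            g_i_snap = a_i * (a_i @ x_snapshot - b[i])
            # Variance-reduced gradient estimate.
            x = x - step * (g_i_x - g_i_snap + full_grad)
        x_snapshot = x
    return x_snapshot

if __name__ == "__main__":
    # Synthetic example: compare against the exact least-squares solution.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 10))
    b = rng.standard_normal(200)
    x_hat = svrg_least_squares(A, b)
    x_star = np.linalg.lstsq(A, b, rcond=None)[0]
    print("distance to exact solution:", np.linalg.norm(x_hat - x_star))
```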
