Variance Reduced Stochastic Gradient Descent with Neighbors

NeurIPS 2015 · Thomas Hofmann, Aurelien Lucchi, Simon Lacoste-Julien, Brian McWilliams

Stochastic Gradient Descent (SGD) is a workhorse in machine learning, yet its slow convergence can be a computational bottleneck. Variance reduction techniques such as SAG, SVRG and SAGA have been proposed to overcome this weakness, achieving linear convergence.
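Of the methods named in the abstract, SVRG is perhaps the simplest to sketch: it periodically stores a full gradient at a pivot point and corrects each stochastic gradient with a zero-mean term computed at that pivot. Below is a minimal illustration on a synthetic least-squares problem; the objective, step size, and epoch schedule are assumptions made for the sketch, not the paper's algorithm or experimental setup.

# Minimal SVRG-style variance reduction on a synthetic least-squares
# problem. All problem sizes and hyperparameters here are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
b = A @ x_true + 0.01 * rng.normal(size=n)

def grad_i(x, i):
    # Gradient of the i-th loss term 0.5 * (a_i^T x - b_i)^2.
    return A[i] * (A[i] @ x - b[i])

def full_grad(x):
    # Average gradient over all n data points.
    return A.T @ (A @ x - b) / n

x = np.zeros(d)
step = 0.01
for epoch in range(30):
    # Pivot point: store the iterate and its full gradient.
    x_pivot = x.copy()
    g_pivot = full_grad(x_pivot)
    for _ in range(n):
        i = rng.integers(n)
        # Variance-reduced stochastic gradient: the correction
        # grad_i(x_pivot, i) - g_pivot has zero mean over i, so v is
        # an unbiased estimate of full_grad(x) with reduced variance
        # near the pivot.
        v = grad_i(x, i) - grad_i(x_pivot, i) + g_pivot
        x -= step * v
print("final squared error:", np.linalg.norm(x - x_true) ** 2)

SAG and SAGA instead keep a per-data-point gradient memory in place of the pivot gradient; the pivot form is used here only because it needs no extra storage.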
