A Parallel SGD method with Strong Convergence

4 Nov 2013 · Dhruv Mahajan, S. Sathiya Keerthi, S. Sundararajan, Léon Bottou

This paper proposes a novel parallel stochastic gradient descent (SGD) method obtained by applying parallel sets of SGD iterations (each set operating on one node, using the data residing on it) to find the direction in each iteration of a batch descent method. The method has strong convergence properties... Experiments on datasets with high-dimensional feature spaces show the value of this method.
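The overall scheme described in the abstract can be sketched as follows. This is an illustrative simulation, not the paper's exact algorithm: the least-squares objective, the simple averaging of node proposals, the Armijo line search, and all names (`local_sgd`, step sizes, shard counts) are assumptions made for the sketch. Each "node" runs plain SGD on its own data shard starting from the current iterate; the averaged result defines a direction, and a batch-level line search chooses the step.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic least-squares problem, with rows split across K "nodes" (shards).
n, d, K = 400, 10, 4
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = X @ w_true + 0.1 * rng.standard_normal(n)
shards = np.array_split(np.arange(n), K)

def f(w):
    # Full-batch objective: mean squared residual (up to a factor of 1/2).
    r = X @ w - y
    return 0.5 * (r @ r) / n

def grad(w):
    return X.T @ (X @ w - y) / n

def local_sgd(w, idx, lr=0.01, epochs=2):
    """One node: plain SGD over its own shard, starting from the current iterate."""
    w = w.copy()
    for _ in range(epochs):
        for i in rng.permutation(idx):
            w -= lr * (X[i] @ w - y[i]) * X[i]
    return w

w = np.zeros(d)
for _ in range(30):
    # Each node proposes a new iterate via local SGD on its own data;
    # the averaged proposal, relative to w, serves as the search direction.
    proposals = [local_sgd(w, idx) for idx in shards]
    direction = np.mean(proposals, axis=0) - w
    # Batch-method step: backtracking (Armijo) line search on the full objective.
    t, g = 1.0, grad(w)
    while f(w + t * direction) > f(w) + 1e-4 * t * (g @ direction) and t > 1e-8:
        t *= 0.5
    w = w + t * direction
```

The line search over the full-batch objective is what distinguishes this from naive one-shot parameter averaging: the SGD runs only propose a direction, while the step along it is accepted under a batch descent criterion.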

