Laplacian Smoothing Gradient Descent

We propose a class of very simple modifications of gradient descent and stochastic gradient descent. We show that when applied to a large variety of machine learning problems, ranging from logistic regression to deep neural nets, the proposed surrogates can dramatically reduce the variance, allow larger step sizes, and improve the generalization accuracy.
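As a hedged sketch of the idea (not the authors' reference implementation): the modification replaces the raw gradient with a Laplacian-smoothed version, i.e. the solution of (I - sigma * Delta) u = g, where Delta is a 1-D discrete Laplacian with periodic boundary conditions. Since that operator is circulant, the system can be solved cheaply with an FFT. The function names, the 1-D periodic stencil, and the parameter `sigma` below are assumptions made for illustration.

```python
import numpy as np

def laplacian_smooth(grad, sigma=1.0):
    """Smooth a gradient vector g by solving (I - sigma * Delta) u = g.

    Delta is the 1-D discrete Laplacian (stencil [1, -2, 1]) with periodic
    boundary; the circulant system is diagonalized by the FFT, so the solve
    is a single elementwise division in frequency space.
    """
    n = grad.size
    k = np.arange(n)
    # Eigenvalues of (I - sigma * Delta) for the periodic stencil.
    denom = 1.0 + 2.0 * sigma * (1.0 - np.cos(2.0 * np.pi * k / n))
    return np.real(np.fft.ifft(np.fft.fft(grad) / denom))

def lsgd_step(w, grad_fn, lr=0.1, sigma=1.0):
    """One Laplacian-smoothing gradient descent step (illustrative sketch)."""
    return w - lr * laplacian_smooth(grad_fn(w), sigma)
```

Note that with `sigma = 0` the smoothing is the identity, so plain gradient descent is recovered; the k = 0 eigenvalue is always 1, so the smoothed gradient keeps the same mean as the raw one while damping high-frequency noise, which is one way to see why larger step sizes become tolerable.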
