
A Diffusion Approximation Theory of Momentum SGD in Nonconvex Optimization

The Momentum Stochastic Gradient Descent (MSGD) algorithm has been widely applied to many nonconvex optimization problems in machine learning, e.g., training deep neural networks and variational Bayesian inference. Despite its empirical success, there is still a lack of theoretical understanding of the convergence properties of MSGD...
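
For reference, a minimal sketch of the standard heavy-ball MSGD update the abstract refers to (v ← μv − ηg, x ← x + v). The toy objective, step size, and momentum coefficient below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def msgd_step(x, v, grad_fn, lr=0.01, momentum=0.9):
    """One heavy-ball MSGD update: v <- mu*v - lr*g, x <- x + v."""
    g = grad_fn(x)              # stochastic gradient estimate at x
    v = momentum * v - lr * g   # accumulate a momentum (velocity) term
    x = x + v                   # move parameters along the velocity
    return x, v

# Usage on a hypothetical nonconvex objective f(x) = x^4 - 3x^2,
# whose gradient 4x^3 - 6x is perturbed by Gaussian noise.
rng = np.random.default_rng(0)
grad = lambda x: 4 * x**3 - 6 * x + 0.1 * rng.standard_normal()
x, v = 2.0, 0.0
for _ in range(200):
    x, v = msgd_step(x, v, grad)
print(f"approximate stationary point: {x:.3f}")  # near sqrt(3/2) ~ 1.225
```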
