The Momentum Stochastic Gradient Descent (MSGD) algorithm has been widely applied to many nonconvex optimization problems in machine learning, e.g., training deep neural networks and variational Bayesian inference. Due to current technical limitations, however, establishing convergence properties of MSGD for these highly complicated nonconvex problems is generally infeasible...
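For concreteness, below is a minimal sketch of the standard (heavy-ball) momentum update that MSGD typically refers to; the objective, step size, and momentum value are illustrative assumptions, not the paper's specific setting.

```python
# A minimal sketch of the common heavy-ball MSGD update:
#   v <- momentum * v - lr * grad;  x <- x + v
# The toy objective and hyperparameters below are assumptions for illustration.
import numpy as np

def msgd_step(x, v, grad, lr=0.01, momentum=0.9):
    """One MSGD update: accumulate a velocity term, then move the iterate."""
    v = momentum * v - lr * grad   # velocity (momentum) update
    x = x + v                      # parameter update
    return x, v

# Toy usage on a simple nonconvex objective f(x) = x^4 - 3x^2,
# whose gradient is 4x^3 - 6x.
x, v = np.array([2.0]), np.zeros(1)
for _ in range(100):
    grad = 4 * x**3 - 6 * x
    x, v = msgd_step(x, v, grad)
```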