A New Variant of Stochastic Heavy Ball Optimization Method for Deep Learning

1 Jan 2021  ·  Zhou Shao, Tong Lin

Stochastic momentum optimization methods, also known as stochastic heavy ball (SHB) methods, are among the most popular optimization methods for deep learning. These methods accelerate stochastic gradient descent and dampen oscillations. In this paper we propose a new variant of the stochastic heavy ball method, called stochastic Euler's heavy ball (SEHB). The proposed SEHB method modifies the steepest descent direction to achieve acceleration, and incorporates Euler's method to adaptively adjust learning rates. A convergence analysis of the regret bound is given under the online convex optimization framework. Furthermore, we conduct experiments on various popular datasets and deep learning models. Empirical results demonstrate that SEHB achieves generalization performance comparable to or better than that of state-of-the-art optimization methods such as SGD and Adam.
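
For reference, the classical SHB baseline that SEHB builds on takes the form v_{t+1} = beta * v_t - eta * g_t, theta_{t+1} = theta_t + v_{t+1}, where g_t is a stochastic gradient, beta the momentum coefficient, and eta the learning rate. The sketch below illustrates only this standard SHB update in Python; the abstract does not specify SEHB's modified descent direction or its Euler's-method learning-rate adjustment, so the function name, hyperparameter defaults, and toy objective here are illustrative assumptions rather than the authors' algorithm.

    import numpy as np

    def shb_step(theta, velocity, grad, lr=0.01, momentum=0.9):
        # Classical stochastic heavy ball (SHB) update:
        #   v_{t+1}     = momentum * v_t - lr * g_t
        #   theta_{t+1} = theta_t + v_{t+1}
        # SEHB additionally modifies the descent direction and adapts
        # the learning rate via Euler's method (details not given in
        # the abstract, so they are omitted here).
        velocity = momentum * velocity - lr * grad
        theta = theta + velocity
        return theta, velocity

    # Toy usage on f(x) = 0.5 * ||x||^2, whose gradient is x itself.
    theta = np.array([5.0, -3.0])
    velocity = np.zeros_like(theta)
    for _ in range(100):
        grad = theta  # exact gradient of the quadratic objective
        theta, velocity = shb_step(theta, velocity, grad)
    print(theta)  # approaches the minimizer at the origin

The momentum buffer accumulates past gradients, which is what yields the acceleration and oscillation damping described above; SEHB's contribution, per the abstract, is to alter the descent direction and adapt lr per step rather than keeping it fixed as in this baseline.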
