Nostalgic Adam: Weighting more of the past gradients when designing the adaptive learning rate

19 May 2018 · Haiwen Huang, Chang Wang, Bin Dong

First-order optimization algorithms have proven prominent in deep learning. In particular, algorithms such as RMSProp and Adam are extremely popular...
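The abstract is truncated here, but the title describes the core idea: give *more* weight to past gradients when building the adaptive (second-moment) learning-rate term, rather than discounting them geometrically as Adam does with a fixed β₂. The sketch below is an assumption-laden illustration of that idea, not the paper's verbatim algorithm: it uses a time-varying β₂,ₜ = Bₜ₋₁/Bₜ with Bₜ = Σₖ k^(−γ) (a hyperharmonic weighting; the function name `nosadam_sketch` and all hyperparameter defaults are illustrative choices).

```python
import numpy as np

def nosadam_sketch(grad, x0, lr=0.1, beta1=0.9, gamma=0.1, steps=200, eps=1e-8):
    """Illustrative sketch of a 'nostalgic' Adam variant.

    Unlike Adam's fixed beta2, the second-moment coefficient here is
    beta2_t = B_{t-1} / B_t with B_t = sum_{k<=t} k**(-gamma), which
    decays old gradients more slowly, i.e. weights the past more.
    """
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)   # first-moment (momentum) estimate
    v = np.zeros_like(x)   # second-moment estimate
    B = 0.0
    for t in range(1, steps + 1):
        g = grad(x)
        B_prev, B = B, B + t ** (-gamma)
        beta2_t = B_prev / B          # grows toward 1 as t increases
        m = beta1 * m + (1 - beta1) * g
        v = beta2_t * v + (1 - beta2_t) * g ** 2
        x = x - lr * m / (np.sqrt(v) + eps)
    return x

# Usage: minimize f(x) = x^2 (gradient 2x) starting from x = 3
x_star = nosadam_sketch(lambda x: 2 * x, x0=[3.0])
```

Because β₂,ₜ → 1, the effective average over squared gradients becomes increasingly long-memoried, which is the "nostalgic" behavior the title refers to.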
