AMSGrad is a stochastic optimization method that seeks to fix a convergence issue with Adam-based optimizers. AMSGrad uses the maximum of past squared gradients $v_{t}$, rather than their exponential moving average, to update the parameters:
$$m_{t} = \beta_{1}m_{t-1} + \left(1-\beta_{1}\right)g_{t}$$
$$v_{t} = \beta_{2}v_{t-1} + \left(1-\beta_{2}\right)g_{t}^{2}$$
$$\hat{v}_{t} = \max\left(\hat{v}_{t-1}, v_{t}\right)$$
$$\theta_{t+1} = \theta_{t} - \frac{\eta}{\sqrt{\hat{v}_{t}} + \epsilon}m_{t}$$
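The update above can be sketched as a minimal NumPy implementation; the function name, default hyperparameters, and state-passing convention are my own choices, not from the source:

```python
import numpy as np

def amsgrad_update(theta, g, m, v, v_hat,
                   lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One AMSGrad step (hypothetical helper, not a library API).

    theta: parameters; g: gradient at theta;
    m, v: first/second moment estimates; v_hat: running max of v.
    """
    m = beta1 * m + (1 - beta1) * g          # m_t
    v = beta2 * v + (1 - beta2) * g**2       # v_t
    v_hat = np.maximum(v_hat, v)             # key difference from Adam: max, not v_t itself
    theta = theta - lr * m / (np.sqrt(v_hat) + eps)
    return theta, m, v, v_hat

# Usage sketch: minimize f(x) = x^2, whose gradient is 2x.
theta = np.array([1.0])
m = np.zeros(1); v = np.zeros(1); v_hat = np.zeros(1)
for _ in range(200):
    g = 2 * theta
    theta, m, v, v_hat = amsgrad_update(theta, g, m, v, v_hat, lr=0.1)
```

Because $\hat{v}_{t}$ is non-decreasing, the effective per-coordinate step size never grows, which is what restores the convergence guarantee that plain Adam can violate.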
Source: *On the Convergence of Adam and Beyond*

| Task | Papers | Share |
| --- | --- | --- |
| Time Series | 2 | 18.18% |
| Click-Through Rate Prediction | 1 | 9.09% |
| Quantization | 1 | 9.09% |
| Atari Games | 1 | 9.09% |
| Image Classification | 1 | 9.09% |
| Machine Translation | 1 | 9.09% |
| Natural Language Understanding | 1 | 9.09% |
| Image Categorization | 1 | 9.09% |
| Time Series Prediction | 1 | 9.09% |