# Accelerating Rescaled Gradient Descent: Fast Optimization of Smooth Functions

Ashia C. Wilson, Lester Mackey, Andre Wibisono

We present a family of algorithms, called descent algorithms, for optimizing convex and non-convex functions. We also introduce a new first-order algorithm, called rescaled gradient descent (RGD), and show that RGD achieves a faster convergence rate than gradient descent provided the function is strongly smooth, a natural generalization of the standard smoothness assumption on the objective function.
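To make the idea concrete, here is a minimal sketch of what a rescaled gradient step could look like. The update rule `x ← x − η ∇f(x) / ‖∇f(x)‖^((p−2)/(p−1))` is an assumption drawn from the standard form of rescaled gradient methods of order `p`, not a statement of the paper's exact algorithm; the test function `f(x) = ‖x‖⁴` and step size are illustrative choices.

```python
import numpy as np

def rescaled_gradient_step(x, grad_f, eta=0.1, p=4):
    """One hypothetical rescaled gradient step of order p:
    x <- x - eta * g / ||g||^((p-2)/(p-1)), where g = grad_f(x).
    For p = 2 this reduces to plain gradient descent."""
    g = grad_f(x)
    norm = np.linalg.norm(g)
    if norm == 0.0:
        return x  # already at a stationary point
    return x - eta * g / norm ** ((p - 2) / (p - 1))

# Illustrative objective: f(x) = ||x||^4, with gradient 4 ||x||^2 x.
grad_f = lambda x: 4.0 * np.linalg.norm(x) ** 2 * x

x = np.array([1.0, -2.0])
for _ in range(200):
    x = rescaled_gradient_step(x, grad_f, eta=0.1, p=4)
```

On this homogeneous objective the rescaling makes the per-step contraction factor independent of scale, whereas a fixed-step gradient method would need its step size tuned to the local gradient magnitude.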