A Robust Accelerated Optimization Algorithm for Strongly Convex Functions

13 Oct 2017 · Saman Cyrus, Bin Hu, Bryan Van Scoy, Laurent Lessard

This work proposes an accelerated first-order algorithm we call the Robust Momentum Method for optimizing smooth strongly convex functions. The algorithm has a single scalar parameter that can be tuned to trade off robustness to gradient noise against worst-case convergence rate. At one extreme, the algorithm is faster than Nesterov's Fast Gradient Method by a constant factor but more fragile to noise; at the other extreme, it reduces to the Gradient Method and is very robust to noise. The design technique is inspired by methods from classical control theory, and the resulting algorithm has a simple analytical form. Algorithm performance is verified in a series of numerical simulations, in both the noise-free and relative gradient noise cases.
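The abstract refers to the method's simple analytical form; a minimal Python sketch of that iteration is given below, assuming the two-step recursion x_{k+1} = x_k + beta*(x_k - x_{k-1}) - alpha*grad_f(y_k) with y_k = x_k + gamma*(x_k - x_{k-1}) and the parameter formulas in alpha, beta, gamma as I recall them from the paper (kappa = L/m). The function name, defaults, and example problem are illustrative, not taken from the paper, so treat this as a sketch to check against the original.

```python
import numpy as np

def robust_momentum(grad_f, x0, m, L, rho=None, iters=500):
    """Sketch of the Robust Momentum Method (Cyrus, Hu, Van Scoy, Lessard, 2017).

    grad_f : callable returning the gradient of an L-smooth, m-strongly
             convex function (kappa = L/m > 1 is assumed).
    rho    : tuning parameter in [1 - 1/sqrt(kappa), 1 - 1/kappa];
             smaller rho gives a faster rate, larger rho gives more
             robustness to gradient noise.
    """
    kappa = L / m
    if rho is None:
        rho = 1.0 - 1.0 / np.sqrt(kappa)  # fast, less robust extreme
    # Parameter formulas as reported in the paper (assumption here).
    alpha = kappa * (1 - rho) ** 2 * (1 + rho) / L
    beta = kappa * rho ** 3 / (kappa - 1)
    gamma = rho ** 3 / ((kappa - 1) * (1 - rho) ** 2 * (1 + rho))

    x_prev = x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        y = x + gamma * (x - x_prev)                        # extrapolated point
        x_next = x + beta * (x - x_prev) - alpha * grad_f(y)
        x_prev, x = x, x_next
    return x

# Example: strongly convex quadratic f(x) = 0.5 * x @ A @ x, whose
# eigenvalues give m = 1 and L = 10 (kappa = 10).
A = np.diag([1.0, 10.0])
x_min = robust_momentum(lambda y: A @ y, [5.0, 5.0], m=1.0, L=10.0)
```

With these formulas, setting rho = 1 - 1/kappa makes the extrapolated iterates y_k follow gradient descent with step size 1/L, matching the robust extreme described in the abstract, while the default above targets the fast extreme.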

Categories

Optimization and Control · Systems and Control