Natasha: Faster Non-Convex Stochastic Optimization Via Strongly Non-Convex Parameter

ICML 2017 · Zeyuan Allen-Zhu

Given a nonconvex function that is an average of $n$ smooth functions, we design stochastic first-order methods to find its approximate stationary points. The convergence of our new methods depends on the smallest (negative) eigenvalue $-\sigma$ of the Hessian, a parameter that describes how nonconvex the function is. Our methods outperform known results for a range of the parameter $\sigma$, and can be used to find approximate local minima. Our result implies an interesting dichotomy: there exists a threshold $\sigma_0$ such that the currently fastest methods behave differently on the two sides of it: for $\sigma>\sigma_0$ their complexity scales with $n^{2/3}$, while for $\sigma<\sigma_0$ it scales with $n^{3/4}$.
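
The abstract does not spell out the algorithm, but the role of $\sigma$ can be illustrated. If the Hessian satisfies $\nabla^2 f(x) \succeq -\sigma I$ everywhere, then the regularized subproblem $g(y) = f(y) + \sigma\|y-\hat{x}\|^2$ has Hessian $\succeq \sigma I$ and is therefore $\sigma$-strongly convex. Below is a minimal, hypothetical Python sketch of this "regularize, then solve" idea on a toy double-well objective; it illustrates how the parameter makes subproblems convex, and is not the paper's Natasha method (all names are illustrative).

```python
import numpy as np

# Hypothetical sketch (not the paper's Natasha method): when
# Hess f(x) >= -sigma * I everywhere, the regularized subproblem
#     g(y) = f(y) + sigma * ||y - x_hat||^2
# has Hessian >= sigma * I, i.e. it is sigma-strongly convex, so it can
# be solved cheaply; re-anchoring at each subproblem's solution is a
# proximal-point-style outer loop that drives ||grad f|| toward zero.

def grad_f(x):
    # Toy double-well objective f(x) = sum((x_i^2 - 1)^2) / 4.
    # Its Hessian is diag(3 x_i^2 - 1) >= -I, so sigma = 1 works.
    return x**3 - x

sigma = 1.0

def solve_subproblem(x_hat, lr=0.1, steps=100):
    # Plain gradient descent on the strongly convex g; a fast stochastic
    # convex solver could be substituted here.
    y = x_hat.copy()
    for _ in range(steps):
        y -= lr * (grad_f(y) + 2.0 * sigma * (y - x_hat))
    return y

x = np.array([0.2, -0.5, 1.3])   # arbitrary starting point
for _ in range(30):              # outer loop: re-anchor and re-solve
    x = solve_subproblem(x)

print("approximate stationary point:", x)           # each entry near +/-1
print("gradient norm:", np.linalg.norm(grad_f(x)))  # near 0
```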

