Projection-Free Algorithms in Statistical Estimation

20 May 2018 · Yan Li, Chao Qu, Huan Xu

The Frank-Wolfe algorithm (FW) and its variants have gained a surge of interest in the machine learning community due to their projection-free property. Recently, the gradient evaluation complexity of FW-type algorithms has been reduced to $\log(\frac{1}{\epsilon})$ for smooth and strongly convex objectives. This complexity result is especially significant in learning problems, as the overwhelming data size makes even a single gradient evaluation computationally expensive. However, in high-dimensional statistical estimation problems, the objective is typically not strongly convex, and is sometimes even non-convex. In this paper, we extend the state-of-the-art FW-type algorithms to large-scale, high-dimensional estimation problems. We show that as long as the objective satisfies {\em restricted strong convexity}, and we are not optimizing beyond the statistical limit of the model, the $\log(\frac{1}{\epsilon})$ gradient evaluation complexity can still be attained.
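
For readers unfamiliar with the projection-free idea mentioned in the abstract, below is a minimal sketch of the classic Frank-Wolfe (conditional gradient) iteration over an $\ell_1$-ball. This is an illustration only: the sparse least-squares objective, the radius, and the step-size rule are placeholder assumptions, and this is not the accelerated FW variant analyzed in the paper.

```python
import numpy as np

def frank_wolfe_l1(grad_f, x0, radius=1.0, n_iters=100):
    """Classic Frank-Wolfe over the l1-ball {x : ||x||_1 <= radius}.

    grad_f : callable returning the gradient of f at a point
    x0     : starting point inside the l1-ball
    """
    x = x0.copy()
    for t in range(n_iters):
        g = grad_f(x)
        # Linear minimization oracle for the l1-ball: the minimizer of <g, s>
        # over the ball puts all mass (+/- radius) on the coordinate where
        # |g| is largest. No projection step is ever required.
        i = np.argmax(np.abs(g))
        s = np.zeros_like(x)
        s[i] = -radius * np.sign(g[i])
        # Standard diminishing step size; the iterate remains a convex
        # combination of feasible points, hence always feasible.
        gamma = 2.0 / (t + 2.0)
        x = (1.0 - gamma) * x + gamma * s
    return x

# Hypothetical usage: sparse least squares, min_x 0.5 * ||A x - b||^2
# constrained to the l1-ball of radius 2.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)
x_hat = frank_wolfe_l1(lambda x: A.T @ (A @ x - b), np.zeros(20), radius=2.0)
```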
