Communication-Efficient Projection-Free Algorithm for Distributed Optimization

20 May 2018 · Yan Li, Chao Qu, Huan Xu

Distributed optimization has gained a surge of interest in recent years. In this paper we propose a distributed projection-free algorithm, Distributed Conditional Gradient Sliding (DCGS). Compared to the state-of-the-art distributed Frank-Wolfe algorithm, our algorithm attains the same communication complexity under much more realistic assumptions. In contrast to consensus-based algorithms, DCGS builds on a primal-dual framework, yielding a modular analysis that can be exploited to improve the linear oracle complexity whenever the centralized Frank-Wolfe method can be improved. We demonstrate this advantage and show that the linear oracle complexity can be reduced to almost the same order of magnitude as the communication complexity when the feasible set is polyhedral. Finally, we present experimental results on Lasso and matrix completion, demonstrating significant performance improvement over the existing distributed Frank-Wolfe algorithm.
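To make the "linear oracle" terminology concrete, the sketch below shows a plain centralized Frank-Wolfe (conditional gradient) loop over an l1 ball, the feasible set relevant to the Lasso experiments mentioned above. The oracle only solves a linear problem over the constraint set, which is why such methods are called projection-free. This is a generic illustration under assumed names (`lmo_l1_ball`, `frank_wolfe`), not the paper's DCGS algorithm, which additionally handles the distributed primal-dual updates.

```python
import numpy as np

def lmo_l1_ball(grad, radius=1.0):
    """Linear minimization oracle over the l1 ball of the given radius.

    Returns argmin_{||s||_1 <= radius} <grad, s>, which is a signed,
    scaled coordinate vector -- a closed-form solution that makes
    projection-free methods cheap on this feasible set.
    """
    s = np.zeros_like(grad)
    i = np.argmax(np.abs(grad))
    s[i] = -radius * np.sign(grad[i])
    return s

def frank_wolfe(grad_f, x0, radius=1.0, iters=100):
    """Centralized Frank-Wolfe iteration (illustrative sketch only)."""
    x = x0.copy()
    for t in range(iters):
        g = grad_f(x)
        s = lmo_l1_ball(g, radius)        # one linear oracle call per iteration
        gamma = 2.0 / (t + 2.0)           # standard diminishing step size
        x = (1 - gamma) * x + gamma * s   # convex combination stays feasible
    return x

# Hypothetical usage on a small Lasso-style objective 0.5*||Ax - b||^2:
# A, b = np.random.randn(50, 20), np.random.randn(50)
# x_hat = frank_wolfe(lambda x: A.T @ (A @ x - b), np.zeros(20), radius=5.0)
```

In a distributed setting, the communication cost (exchanging gradients or dual variables across the network) and the linear oracle cost (calls like `lmo_l1_ball` above) are counted separately; the abstract's claim is that DCGS keeps the former low while allowing the latter to be driven down on polyhedral sets.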
