Dykstra's Algorithm, ADMM, and Coordinate Descent: Connections, Insights, and Extensions

NeurIPS 2017  ·  Ryan J. Tibshirani

We study connections between Dykstra's algorithm for projecting onto an intersection of convex sets, the alternating direction method of multipliers (ADMM), and block coordinate descent. We prove that coordinate descent for a regularized regression problem, in which the (separable) penalty functions are seminorms, is exactly equivalent to Dykstra's algorithm applied to the dual problem. ADMM applied to the dual problem is also equivalent in the special case of two sets, one of which is a linear subspace. These connections, aside from being interesting in their own right, suggest new ways of analyzing and extending coordinate descent. For example, from existing convergence theory on Dykstra's algorithm over polyhedra, we discern that coordinate descent for the lasso problem converges at an (asymptotically) linear rate. We also develop two parallel versions of coordinate descent, based on the Dykstra and ADMM connections.
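As a concrete illustration of the stated equivalence, below is a minimal NumPy sketch (not code from the paper): cyclic coordinate descent for the lasso, min_b (1/2)||y - Xb||^2 + lam*||b||_1, is run alongside Dykstra's algorithm projecting y onto the intersection of the dual slabs {u : |X_i^T u| <= lam}. The problem data X, y, the penalty level lam, and the helpers soft_threshold and project_slab are illustrative assumptions; the check at the end verifies that the coordinate-descent residual matches Dykstra's iterate and that Dykstra's increments equal X_i * beta_i.

```python
# Illustrative sketch (assumed setup, not the paper's code): coordinate descent
# for the lasso vs. Dykstra's algorithm on the dual projection problem.
import numpy as np

rng = np.random.default_rng(0)
n, d, lam = 50, 10, 0.5
X = rng.standard_normal((n, d))   # design matrix (illustrative)
y = rng.standard_normal(n)        # response (illustrative)

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

# Cyclic coordinate descent: min_b 0.5*||y - X b||^2 + lam*||b||_1
beta = np.zeros(d)
r = y.copy()                      # residual y - X beta
col_norms = np.sum(X**2, axis=0)
for it in range(20):
    for i in range(d):
        r_partial = r + X[:, i] * beta[i]                      # partial residual
        beta[i] = soft_threshold(X[:, i] @ r_partial, lam) / col_norms[i]
        r = r_partial - X[:, i] * beta[i]

# Dykstra's algorithm on the dual: project y onto the intersection of the
# slabs C_i = {u : |X_i^T u| <= lam}, cycling through the sets.
def project_slab(z, a, lam):
    t = a @ z
    if abs(t) <= lam:
        return z
    return z - a * (t - np.sign(t) * lam) / (a @ a)

u = y.copy()
q = np.zeros((d, n))              # one Dykstra increment per set
for it in range(20):
    for i in range(d):
        z = u + q[i]
        u_new = project_slab(z, X[:, i], lam)
        q[i] = z - u_new          # increment equals X_i * beta_i
        u = u_new

# After the same number of cycles the iterates coincide (up to round-off).
print(np.max(np.abs((y - X @ beta) - u)))    # ~ 0: residual matches Dykstra iterate
print(np.max(np.abs(q - (X * beta).T)))      # ~ 0: increments match X_i * beta_i
```

Under these assumptions the two loops produce identical iterates cycle by cycle, which is the equivalence the abstract describes for seminorm penalties specialized to the lasso.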


Categories

Computation · Optimization and Control
