Understanding Limitation of Two Symmetrized Orders by Worst-case Complexity

Update order is one of the major design choices of block decomposition algorithms. There are at least two classes of deterministic update orders: nonsymmetric orders (e.g. the cyclic order) and symmetric orders (e.g. Gaussian back substitution or symmetric Gauss-Seidel). Recently, Coordinate Descent (CD) with the cyclic order was shown to be $O(n^2)$ times slower than its randomized versions in the worst case. A natural question arises: can the symmetrized orders achieve faster convergence rates than the cyclic order, or even get close to the randomized versions? In this paper, we give a negative answer to this question. We show that both Gaussian back substitution (GBS) and symmetric Gauss-Seidel (sGS) suffer from the same slow convergence issue as the cyclic order in the worst case. In particular, we prove that for unconstrained problems, both GBS-CD and sGS-CD can be $O(n^2)$ times slower than R-CD. Beyond unconstrained problems, we also study linearly constrained problems with a quadratic objective: we empirically demonstrate that GBS-ADMM and sGS-ADMM can converge roughly $O(n^2)$ times slower than randomly permuted ADMM.
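The contrast between the cyclic and randomly permuted orders can be illustrated with a small coordinate-descent sketch on an unconstrained quadratic $f(x) = \frac{1}{2}x^\top A x - b^\top x$. This is a minimal sketch, not the paper's experiment: the matrix $A = \delta I + (1-\delta)\mathbf{1}\mathbf{1}^\top$ and all parameter values below are assumptions, chosen to resemble the equal-off-diagonal structure behind known slow cases for cyclic CD.

```python
import numpy as np

def cd_quadratic(A, b, order_fn, n_epochs=50):
    """Exact coordinate minimization on f(x) = 0.5 x^T A x - b^T x.
    order_fn(n) yields the coordinate visiting order for each epoch."""
    n = len(b)
    x = np.zeros(n)
    for _ in range(n_epochs):
        for i in order_fn(n):
            # Minimize f along coordinate i: zero out the i-th partial derivative.
            x[i] -= (A[i] @ x - b[i]) / A[i, i]
    return x

rng = np.random.default_rng(0)
n = 20
# Assumed test matrix: delta*I + (1-delta)*ones, positive definite for delta in (0, 1).
delta = 0.1
A = delta * np.eye(n) + (1 - delta) * np.ones((n, n))
b = np.ones(n)
x_star = np.linalg.solve(A, b)

cyclic = lambda n: range(n)                 # fixed cyclic order every epoch
random_perm = lambda n: rng.permutation(n)  # fresh random permutation per epoch

err_cyc = np.linalg.norm(cd_quadratic(A, b, cyclic) - x_star)
err_rp = np.linalg.norm(cd_quadratic(A, b, random_perm) - x_star)
print(f"cyclic error: {err_cyc:.4f}, random-permutation error: {err_rp:.4f}")
```

On matrices of this kind, the gap between the two errors tends to widen as $n$ grows, in line with the $O(n^2)$ worst-case separation discussed above.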
