An Equivalent Circuit Approach to Distributed Optimization
Distributed optimization is an essential paradigm for solving large-scale optimization problems in modern applications where big data and high dimensionality create a computational bottleneck. Distributed optimization algorithms that exhibit fast convergence allow us to fully utilize computing resources and effectively scale to larger optimization problems in a myriad of areas ranging from machine learning to power systems. In this work, we introduce a new centralized distributed optimization algorithm (ECADO) inspired by an equivalent circuit model of the distributed problem. The equivalent circuit (EC) model provides a physical analogy from which we derive new insights to develop a fast-convergent algorithm. The main contributions of this approach are: 1) a weighting scheme based on a circuit-inspired aggregate sensitivity analysis, and 2) an adaptive step-sizing derived from stable Backward-Euler numerical integration. We demonstrate that ECADO exhibits faster convergence compared to state-of-the-art distributed optimization methods and provably converges for nonconvex problems. We leverage ECADO's features to solve convex and nonconvex optimization problems with large datasets, such as distributing data for logistic regression, training a deep neural network model for classification, and solving a high-dimensional security-constrained optimal power flow problem. Compared to state-of-the-art centralized methods, including ADMM, centralized gradient descent, and DANE, this new ECADO approach is shown to converge in fewer iterations.
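To illustrate why the abstract singles out Backward-Euler integration as the source of stable, adaptive step-sizing, consider gradient flow dx/dt = -∇f(x). Forward (explicit) Euler reduces to gradient descent and diverges for large steps, while Backward (implicit) Euler remains stable for any step size. The following minimal sketch is illustrative only and is not the paper's implementation; it uses a quadratic objective f(x) = ½xᵀAx, for which the implicit update has the closed form x_{k+1} = (I + hA)⁻¹x_k (for general objectives an inner Newton or fixed-point solve would be required):

```python
import numpy as np

# Illustrative quadratic objective f(x) = 0.5 * x^T A x, gradient A x.
# Backward-Euler on gradient flow: x_{k+1} = x_k - h * grad f(x_{k+1}),
# which for this quadratic solves to x_{k+1} = (I + h A)^{-1} x_k.
A = np.array([[3.0, 0.5],
              [0.5, 1.0]])
I = np.eye(2)
x = np.array([1.0, -2.0])

h = 10.0  # deliberately large step: forward Euler would diverge here,
          # since |1 - h*lambda_max(A)| > 1, but backward Euler contracts.
for _ in range(20):
    x = np.linalg.solve(I + h * A, x)  # implicit update, stable for any h > 0

print(np.linalg.norm(x))  # iterate has contracted toward the minimizer 0
```

The unconditional stability of the implicit update is what makes an aggressive, adaptive choice of h safe, which is the intuition the abstract attributes to ECADO's step-sizing scheme.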