COLA: Communication-censored Linearized ADMM for Decentralized Consensus Optimization

Citation:

Weiyu Li, Yaohua Liu, Zhi Tian, and Qing Ling. 2019. "COLA: Communication-censored Linearized ADMM for Decentralized Consensus Optimization." In ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 5237-5241.

Abstract:

This paper proposes a communication- and computation-efficient algorithm to solve a convex consensus optimization problem defined over a decentralized network. A remarkable existing algorithm for this problem is the alternating direction method of multipliers (ADMM), in which at every iteration every node updates its local variable by combining neighboring variables and solving an optimization subproblem. The proposed algorithm, called communication-censored linearized ADMM (COLA), leverages a linearization technique to reduce the iteration-wise computation cost of ADMM and uses a communication-censoring strategy to alleviate the communication cost. To be specific, COLA introduces successive linearization approximations to the local cost functions such that the resultant computation is first-order and lightweight. Since the linearization technique slows down the convergence speed, COLA further adopts the communication-censoring strategy to avoid transmissions of less informative messages. A node is allowed to transmit only if the distance between its current local variable and its previously transmitted one is larger than a censoring threshold. We establish the convergence of COLA, as well as its sublinear and linear rates of convergence, and demonstrate its satisfactory communication-computation tradeoff with numerical experiments.
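The censoring rule described above can be sketched in a few lines. The following is a minimal illustration, not the paper's exact formulation: the function name, the decaying threshold schedule `tau0 * rho**k`, and the parameter values are all illustrative assumptions.

```python
import numpy as np

def should_transmit(x_current, x_last_sent, k, tau0=1.0, rho=0.9):
    """Sketch of a communication-censoring check: a node transmits only
    when its local variable has moved far enough from the copy it last
    sent to its neighbors.

    The decaying threshold tau0 * rho**k is an illustrative choice
    (a geometrically shrinking censoring threshold), not necessarily
    the schedule used in the paper.
    """
    threshold = tau0 * rho ** k
    return np.linalg.norm(x_current - x_last_sent) > threshold

# A node whose variable barely changed stays silent, saving communication;
# a node whose variable moved substantially broadcasts the update.
quiet = should_transmit(np.array([0.05]), np.array([0.0]), k=0)   # small change
active = should_transmit(np.array([2.0]), np.array([0.0]), k=0)   # large change
```

As the threshold decays over iterations, even small updates are eventually transmitted, which is what allows the algorithm to still converge while skipping less informative messages early on.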
Last updated on 06/25/2022