Variance Reduction-Boosted Byzantine Robustness in Decentralized Stochastic Optimization

Citation:

Jie Peng, Weiyu Li, and Qing Ling. 2022. “Variance Reduction-Boosted Byzantine Robustness in Decentralized Stochastic Optimization.” In ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 4283-4287.

Abstract:

We consider the Byzantine-robust decentralized stochastic optimization problem, where every agent periodically communicates with its neighbors to exchange local models and then updates its own local model by stochastic gradient descent. However, an unknown number of the agents are Byzantine and behave adversarially during the optimization process. Few works have considered this challenging scenario, and an existing method termed DECEMBER is unable to simultaneously achieve a linear convergence speed and a small learning error because of stochastic noise. To eliminate the negative effect of the stochastic noise, we introduce two variance reduction methods, the stochastic average gradient algorithm (SAGA) and the loopless stochastic variance-reduced gradient (LSVRG) method, into Byzantine-robust decentralized stochastic optimization. The two resulting methods, DECEMBER-SAGA and DECEMBER-LSVRG, enjoy both linear convergence speeds and small learning errors. Numerical experiments demonstrate their effectiveness.
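To make the two ingredients named in the abstract concrete, here is a minimal sketch of a SAGA-style variance-reduced gradient estimator combined with a generic Byzantine-robust aggregation of neighbor models. The coordinate-wise trimmed mean, the function names, and the step size are illustrative assumptions for exposition only; they are not the paper's DECEMBER-SAGA update rule.

```python
import numpy as np

def saga_gradient(grad_fn, x, table, i):
    """SAGA-style variance-reduced stochastic gradient for local sample i.

    grad_fn(x, i) returns the gradient of the i-th local sample at model x;
    `table` (shape: n_samples x dim) stores the last gradient computed per sample.
    """
    g_new = grad_fn(x, i)
    g_vr = g_new - table[i] + table.mean(axis=0)   # variance-reduced estimate
    table[i] = g_new                               # refresh the stored gradient
    return g_vr

def trimmed_mean(models, b):
    """Coordinate-wise trimmed mean: discard the b largest and b smallest
    values per coordinate, then average (a generic robust aggregator)."""
    stacked = np.sort(np.stack(models), axis=0)
    return stacked[b:len(models) - b].mean(axis=0)

def local_step(x, neighbor_models, grad_fn, table, rng, lr=0.1, b=1):
    """One illustrative robust decentralized update: robustly aggregate the
    received neighbor models, then descend along a SAGA-style gradient."""
    x_agg = trimmed_mean([x] + list(neighbor_models), b)
    i = rng.integers(len(table))                   # draw a local sample index
    return x_agg - lr * saga_gradient(grad_fn, x_agg, table, i)
```

In this sketch, each regular agent would call local_step once per iteration with the models received from its neighbors; the variance-reduced estimator shrinks the stochastic-gradient noise over time, which is the effect the abstract credits for achieving both linear convergence and a small learning error.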