Automatica, Vol.83, 162-169, 2017
Convergence rate analysis of distributed optimization with projected subgradient algorithm
In this paper, we revisit the consensus-based projected subgradient algorithm proposed for a common set constraint. We show that the commonly adopted non-summable and square-summable diminishing step sizes of subgradients can be relaxed to be merely non-summable, provided the constrained optimum set is bounded. More importantly, for a strongly convex aggregate cost with different types of step sizes, we provide a systematic analysis to derive the asymptotic upper bound of the convergence rates in terms of the optimum residual, and select the best step sizes accordingly. Our result shows that a convergence rate of O(1/√k) can be achieved with a step size O(1/k). (C) 2017 Elsevier Ltd. All rights reserved.
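To make the setting concrete, below is a minimal numerical sketch (not the paper's exact setup or analysis) of a consensus-based projected subgradient iteration x_i(k+1) = P_X(Σ_j W_ij x_j(k) − α_k g_i(k)) with the non-summable diminishing step size α_k = 1/k suggested by the O(1/k) choice. The local costs f_i(x) = (x − a_i)², the constraint set X = [−1, 1], and the uniform mixing matrix W are illustrative assumptions chosen so the strongly convex aggregate cost has a closed-form constrained minimizer.

```python
import numpy as np

# Illustrative setup (assumed, not from the paper): n agents cooperatively
# minimize f(x) = sum_i (x - a_i)^2 over the common constraint set X = [-1, 1].
rng = np.random.default_rng(0)
n = 5
a = rng.uniform(-2.0, 2.0, size=n)   # local data; f_i(x) = (x - a_i)^2
W = np.full((n, n), 1.0 / n)         # doubly stochastic mixing matrix

def project(x):
    # Euclidean projection P_X onto X = [-1, 1]
    return np.clip(x, -1.0, 1.0)

x = rng.uniform(-1.0, 1.0, size=n)   # one scalar iterate per agent
for k in range(1, 5001):
    alpha = 1.0 / k                  # diminishing, non-summable step size
    grad = 2.0 * (x - a)             # local (sub)gradients g_i(k)
    x = project(W @ x - alpha * grad)  # mix with neighbors, descend, project

# Minimizer of the aggregate cost on X: the projected average of the a_i
x_star = project(a.mean())
print(x, x_star)
```

With the uniform W every mixing step drives the agents to their average immediately, so the sketch mainly illustrates the step-size/projection mechanics; the paper's interest is in general connected networks, where consensus and optimization errors must be bounded jointly.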