SIAM Journal on Control and Optimization, Vol. 57, No. 4, pp. 2821-2842, 2019
DISTRIBUTED SUBGRADIENT-FREE STOCHASTIC OPTIMIZATION ALGORITHM FOR NONSMOOTH CONVEX FUNCTIONS OVER TIME-VARYING NETWORKS
In this paper we consider a distributed stochastic optimization problem without gradient/subgradient information for the local objective functions and subject to local convex constraints. The objective functions may be nonsmooth and are observed with stochastic noise, and the network underlying the distributed design is time-varying. By adding stochastic dithers to the local objective functions and constructing randomized differences motivated by the Kiefer-Wolfowitz algorithm, we propose a distributed subgradient-free algorithm that finds the global minimizer using only local observations. Moreover, we prove that consensus of the estimates and global minimization are achieved with probability one over the time-varying network, and we obtain the convergence rate of the mean average of the estimates as well. Finally, we give numerical examples to illustrate the performance of the proposed algorithm.
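To make the randomized-difference idea concrete, the following sketch shows a single-agent, two-sided randomized-difference update in the Kiefer-Wolfowitz spirit: the unavailable subgradient is replaced by a finite-difference estimate along a random perturbation direction. This is only an illustrative sketch of the estimator, not the paper's algorithm; the distributed method additionally involves dithers, noisy observations, projection onto local constraint sets, and consensus over a time-varying network. All function and variable names here (`randomized_difference_step`, step sizes `a`, `c`) are illustrative assumptions.

```python
import numpy as np

def randomized_difference_step(f, x, c, a, rng):
    """One subgradient-free update using a two-sided randomized
    difference (Kiefer-Wolfowitz spirit): perturb x along a random
    +/-1 direction, difference the two function values, and take a
    descent step. Illustrative single-agent sketch only."""
    delta = rng.choice([-1.0, 1.0], size=x.size)  # random perturbation direction
    diff = f(x + c * delta) - f(x - c * delta)    # two-sided difference
    g = diff / (2.0 * c) * delta                  # randomized-difference estimate
    return x - a * g                              # descent step with gain a

# Usage: minimize the nonsmooth f(x) = ||x - 1||_1 starting from the origin,
# with decaying step size a_k and perturbation size c_k.
rng = np.random.default_rng(0)
f = lambda x: np.abs(x - 1.0).sum()
x = np.zeros(3)
for k in range(1, 2001):
    x = randomized_difference_step(f, x, c=0.5 / k**0.25, a=0.5 / k, rng=rng)
```

With decaying gains, the iterates drift toward the minimizer at 1 despite the nonsmoothness of the L1 objective; in expectation the estimator points along a subgradient direction, which is why no explicit subgradient is ever computed.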
Keywords: distributed stochastic optimization; gradient-/subgradient-free algorithm; nonsmoothness; randomized differences