IEEE Transactions on Automatic Control, Vol. 63, No. 12, pp. 4126-4139, 2018
Tight Global Linear Convergence Rate Bounds for Operator Splitting Methods
In this paper, we establish necessary and sufficient conditions for global linear convergence rate bounds in operator splitting methods for a general class of convex optimization problems for which the associated fixed-point operator is strongly quasi-nonexpansive. We also provide a tight bound on the achievable convergence rate. Most existing results establishing global linear convergence for such methods require restrictive strong convexity and smoothness assumptions on the constituent functions in the optimization problem. However, several examples in the literature show that linear convergence is possible even when these properties do not hold. We provide a unifying analysis method for establishing global linear convergence based on linear regularity and show that many existing results are special cases of our approach. Moreover, we propose a novel linearly convergent splitting method for linear programming.
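The phenomenon described in the abstract can be illustrated with a minimal numerical sketch (not taken from the paper): Douglas-Rachford splitting applied to a two-set feasibility problem in the plane. The indicator functions of the two lines are neither strongly convex nor smooth, yet the iterates converge linearly because the pair of sets is linearly regular, intersecting transversally at a single point. The specific lines, starting point, and iteration count below are illustrative choices.

```python
import numpy as np

def proj_line(z, a, b):
    """Project z onto the line {x : a^T x = b}, assuming ||a|| = 1."""
    return z - (a @ z - b) * a

# C = {x : x1 = 1}, D = {x : x1 + x2 = 2}; the unique intersection is (1, 1).
a1 = np.array([1.0, 0.0]); b1 = 1.0
a2 = np.array([1.0, 1.0]) / np.sqrt(2.0); b2 = 2.0 / np.sqrt(2.0)

z = np.array([5.0, -3.0])          # arbitrary starting point
for k in range(60):
    x = proj_line(z, a1, b1)       # prox of the indicator of C (a projection)
    y = proj_line(2 * x - z, a2, b2)  # prox of the indicator of D at the reflection
    z = z + y - x                  # Douglas-Rachford update of the driver sequence

print(np.round(x, 4))              # shadow sequence approaches (1, 1)
```

For two lines meeting at angle θ, the iteration contracts linearly at rate cos θ (here θ = 45°, so roughly a factor 0.707 per step), even though neither indicator function is strongly convex or smooth, consistent with the linear-regularity viewpoint of the abstract.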
Keywords: Douglas-Rachford splitting; linear convergence; operator splitting methods; optimization algorithms