Multi-block separable convex problems have recently received considerable attention. This class of optimization problems minimizes a separable convex objective function subject to linear constraints. The algorithmic challenge comes from the fact that the classic alternating direction method of multipliers (ADMM) for this problem is not necessarily convergent. However, it is observed that ADMM numerically outperforms many of its variants that have guaranteed theoretical convergence. The goal of this paper is to develop convergent and computationally efficient algorithms for solving multi-block separable convex problems. We first characterize the solutions of these optimization problems by the proximity operators of the convex functions involved in their objective function. We then design a two-step fixed-point iterative scheme for solving the problems based on this characterization. We further prove convergence of the iterative scheme and show that it has an O(1/k) convergence rate in the ergodic sense and in the sense of the partial primal-dual gap, where k denotes the iteration number. Moreover, we derive specific two-step fixed-point proximity algorithms (2SFPPA) from the proposed iterative scheme and establish their global convergence. Numerical experiments on the sparse MRI problem demonstrate the numerical efficiency of the proposed 2SFPPA.
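For concreteness, the multi-block separable convex problem described above (and referred to as problem (1.1) in the introduction) presumably takes the standard form

\[
\min_{x_1,\ldots,x_s} \ \sum_{i=1}^{s} f_i(x_i) \quad \text{subject to} \quad \sum_{i=1}^{s} A_i x_i = b,
\]

where each f_i is a proper lower semicontinuous convex function, the A_i are given matrices, and b is a given vector; this is an assumed sketch of the formulation rather than a quotation of (1.1).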
1 Introduction

The alternating direction method of multipliers (ADMM) [13] was originally proposed for solving problem (1.1) with s = 2, and has recently been widely used in image processing [4,14,30,35]. Since ADMM requires inner iterations to solve its subproblems, a linearized version (LADMM) was proposed and has been used successfully in applications [12]. When s ≥ 3, one can directly extend the original ADMM (LADMM) to problem (1.1). Without additional assumptions, however, it was recently shown in [8] that this direct extension of ADMM to multi-block convex problems is not necessarily convergent, although it may work well in practice. Very recently, there have been investigations [10,17,18,22] of the convergence of the direct extension of ADMM under additional assumptions. Some researchers have devoted themselves to modifying ADMM or LADMM to make them convergent. For instance, a Jacobian-type ADMM was proposed in [11] for parallel computing, the semi-proximal ADMM proposed in [18,32] is designed for convex quadratic programming and conic programming, and the Gaussian back substitution technique was proposed in [15,16] to make ADMM and LADMM converge. The attractiveness of the Gaussian back substitution technique for the theoretical convergence analysis of ADMM-type algorithms was demonstrated in [15,16]. However, numerical results show that the correction step is time consuming and that ADMM (LADMM) with Gaussian back substitution may require more iterations than the direct extension of ADMM (LADMM) to achieve the same objective function value. Therefore, in this paper we are dedicated to establishing convergent and efficient algorithms.

As shown in [1,19,21,24], the notion of proximity operators provides a useful tool for the...
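For reference, the proximity operator appealed to here is presumably the standard one: for a proper lower semicontinuous convex function f on a real Hilbert space \mathcal{H} with norm \|\cdot\|, it is defined, for x \in \mathcal{H}, by

\[
\mathrm{prox}_{f}(x) := \operatorname*{argmin}_{u \in \mathcal{H}} \left\{ \tfrac{1}{2}\|u - x\|^2 + f(u) \right\},
\]

which is single-valued and firmly nonexpansive. Fixed-point characterizations of solutions in terms of such operators underlie the two-step iterative scheme summarized in the abstract.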