2015
DOI: 10.1007/s40305-015-0084-0
Convergence Analysis of L-ADMM for Multi-block Linear-Constrained Separable Convex Minimization Problem

Abstract: We focus on the convergence analysis of the extended linearized alternating direction method of multipliers (L-ADMM) for solving convex minimization problems with three or more separable blocks in the objective function. Previous convergence analyses of the L-ADMM needed to reduce multi-block convex minimization problems to two blocks by grouping the variables. Moreover, there has been no rate-of-convergence analysis for the L-ADMM. In this paper, we construct a counterexample to show the failure of conve…
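The model behind the abstract is the multi-block linearly constrained problem min over x_1, ..., x_m of sum_i f_i(x_i) subject to sum_i A_i x_i = b. As a reading aid, here is a minimal NumPy sketch of a directly extended linearized ADMM (Gauss-Seidel) sweep for that model, with each f_i taken as the l1-norm so the linearized subproblem reduces to soft-thresholding; the step sizes, prox choice, and all names here are illustrative assumptions, not the paper's exact scheme.

import numpy as np

def soft_threshold(v, t):
    """Prox of t*||.||_1; stands in for the prox of an illustrative f_i."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def l_admm(A_blocks, b, beta=1.0, n_iter=500):
    """Directly extended linearized ADMM (Gauss-Seidel sweep) for
    min sum_i ||x_i||_1  s.t.  sum_i A_i x_i = b.
    Linearizing the augmented quadratic term turns each block update
    into a single soft-thresholding step."""
    xs = [np.zeros(A.shape[1]) for A in A_blocks]
    lam = np.zeros(b.shape[0])
    # tau_i > beta*||A_i||_2^2 keeps the linearization a valid majorizer
    taus = [1.01 * beta * np.linalg.norm(A, 2) ** 2 for A in A_blocks]
    for _ in range(n_iter):
        for i, (A, tau) in enumerate(zip(A_blocks, taus)):
            # residual uses the freshest iterates (Gauss-Seidel order)
            r = sum(Aj @ xj for Aj, xj in zip(A_blocks, xs)) - b
            grad = A.T @ (beta * r - lam)  # gradient of the smooth part at xs[i]
            xs[i] = soft_threshold(xs[i] - grad / tau, 1.0 / tau)
        lam -= beta * (sum(Aj @ xj for Aj, xj in zip(A_blocks, xs)) - b)
    return xs

Grouping the m blocks into two recovers the setting of the earlier two-block analyses mentioned in the abstract; the counterexample the abstract announces concerns running the sweep directly over m >= 3 blocks as sketched here.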


Year published: 2016–2023
Cited by 4 publications (7 citation statements)
References 12 publications (13 reference statements)
“…The global convergence of the iterative sequence (1.5) is further proved (see [8]). Under strongly convex conditions, Yao and Cheng obtained the global convergence of the above algorithm (1.5) under the condition m ≥ 3 (see [9]).…”
Section: Introduction (mentioning)
Confidence: 92%
“…[7] proves that the algorithm is convergent. In [8], Feng et al. directly extended the linearized alternating direction method of multipliers of the iterative algorithm (1.4) to the case of three separable operators to solve problem (1.1). The iterative sequence (DLADMM) of this direct extension is written in the following iterative form…”
Section: Introduction (mentioning)
Confidence: 99%
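The iterative form that this quote announces did not survive extraction. For orientation only, a plausible shape of the three-block direct extension (DLADMM) for min f(x) + g(y) + h(z) subject to Ax + By + Cz = b, written from the standard linearized-ADMM template (notation assumed, not quoted from [8]):

\begin{aligned}
x^{k+1} &= \operatorname{prox}_{f/\tau_1}\!\Bigl(x^{k} - \tfrac{1}{\tau_1}\,A^{\top}\bigl(\beta(Ax^{k}+By^{k}+Cz^{k}-b)-\lambda^{k}\bigr)\Bigr),\\
y^{k+1} &= \operatorname{prox}_{g/\tau_2}\!\Bigl(y^{k} - \tfrac{1}{\tau_2}\,B^{\top}\bigl(\beta(Ax^{k+1}+By^{k}+Cz^{k}-b)-\lambda^{k}\bigr)\Bigr),\\
z^{k+1} &= \operatorname{prox}_{h/\tau_3}\!\Bigl(z^{k} - \tfrac{1}{\tau_3}\,C^{\top}\bigl(\beta(Ax^{k+1}+By^{k+1}+Cz^{k}-b)-\lambda^{k}\bigr)\Bigr),\\
\lambda^{k+1} &= \lambda^{k} - \beta\bigl(Ax^{k+1}+By^{k+1}+Cz^{k+1}-b\bigr).
\end{aligned}

Each block is updated using the freshest earlier blocks (Gauss-Seidel order), and the multiplier is updated last, matching the two-block L-ADMM pattern extended verbatim to three operators.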
“…However, the Gauss-Seidel update requires additional assumptions for convergence, though it can empirically perform better than the Jacobi method if it happens to converge. Counterexamples are constructed in [4,12] to show possible divergence of Gauss-Seidel block coordinate update for linearly constrained problems. To guarantee convergence of the Gauss-Seidel update, many existing works assume strong convexity on the objective or part of it.…”
Section: Jacobian and Gauss-Seidel Block Coordinate Update (mentioning)
Confidence: 99%
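To make the Jacobi-versus-Gauss-Seidel distinction concrete, here is a toy sketch on an unconstrained separable least-squares problem (deliberately simpler than the linearly constrained setting under discussion; all names are hypothetical). Jacobi updates every block from the same iterate x^k, while Gauss-Seidel lets block i already see the updated blocks 1, ..., i-1.

import numpy as np

# Toy problem: min ||A1 x1 + A2 x2 + A3 x3 - b||^2, one block at a time.
rng = np.random.default_rng(0)
A = [rng.standard_normal((20, 5)) for _ in range(3)]
b = rng.standard_normal(20)

def block_update(i, xs):
    """Exact minimization over block i with the other blocks fixed."""
    r = b - sum(A[j] @ xs[j] for j in range(3) if j != i)
    return np.linalg.lstsq(A[i], r, rcond=None)[0]

def jacobi_sweep(xs):
    old = [x.copy() for x in xs]        # every block sees the same x^k
    return [block_update(i, old) for i in range(3)]

def gauss_seidel_sweep(xs):
    for i in range(3):                  # block i sees x_1^{k+1}, ..., x_{i-1}^{k+1}
        xs[i] = block_update(i, xs)
    return xs

xs = [np.zeros(5) for _ in range(3)]
for _ in range(50):
    xs = gauss_seidel_sweep(xs)         # swap in jacobi_sweep to compare

On this unconstrained convex toy both sweeps behave well; the point of the counterexamples in [4,12] is that with linear constraints and three or more blocks the Gauss-Seidel variant can diverge.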
“…Empirically, the latter usually performs better. Theoretically, however, the latter case may fail to converge when m ≥ 3, as shown in [4,12]. Therefore, we will design a mixing matrix W between the above two choices to theoretically guarantee convergence while maintaining good practical performance.…”
Section: Algorithm (mentioning)
Confidence: 99%
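Purely to illustrate the idea of sitting between the two extremes, the sketch below (reusing block_update from the previous sketch) blends how much freshly updated information earlier blocks contribute. This convex combination is illustrative only; the mixing matrix W of the quoted work is a specific structured matrix derived from its convergence analysis.

def mixed_sweep(xs, block_update, theta=0.5):
    """theta=0 reproduces a Jacobi sweep, theta=1 a Gauss-Seidel sweep:
    block i sees theta*x_j^{k+1} + (1-theta)*x_j^k for j < i.
    Hypothetical interpolation, not the W construction of the cited paper."""
    old = [x.copy() for x in xs]
    for i in range(len(xs)):
        seen = [theta * xs[j] + (1 - theta) * old[j] if j < i else old[j]
                for j in range(len(xs))]
        xs[i] = block_update(i, seen)
    return xs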