2016
DOI: 10.1007/s10107-016-1034-2

On the linear convergence of the alternating direction method of multipliers

Abstract: We analyze the convergence rate of the alternating direction method of multipliers (ADMM) for minimizing the sum of two or more nonsmooth convex separable functions subject to linear constraints. Previous analysis of the ADMM typically assumes that the objective function is the sum of only two convex functions defined on two separable blocks of variables even though the algorithm works well in numerical experiments for three or more blocks. Moreover, there has been no rate of convergence analysis for the ADMM …
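To make the setting concrete, the sketch below implements the classical two-block ADMM on a toy lasso problem, minimizing 0.5·||Ax − b||² + λ||z||₁ subject to x − z = 0. This is a standard illustrative instance, not the multi-block algorithm analyzed in the paper; the function name, step parameter rho, and iteration count are illustrative choices.

```python
import numpy as np

def admm_lasso(A, b, lam, rho=1.0, iters=200):
    """Two-block ADMM for lasso: min 0.5*||Ax-b||^2 + lam*||z||_1  s.t. x - z = 0.

    Illustrative sketch only; rho and iters are ad hoc, not tuned.
    """
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    # The x-update solves a quadratic subproblem; cache its (fixed) system inverse.
    Q = np.linalg.inv(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    for _ in range(iters):
        x = Q @ (Atb + rho * (z - u))        # x-update: ridge-like least squares
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)  # z-update: soft-threshold
        u = u + x - z                        # dual update on the residual x - z
    return z
```

Each iteration alternates two cheap block minimizations followed by a dual ascent step, which is the per-block structure whose convergence rate the paper studies.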

Cited by 499 publications (417 citation statements)
References 52 publications (67 reference statements)
“…The convergence of the augmented Lagrangian algorithm is on the order of O(1/N), similar to other primal-dual approaches [12]. This was shown by Hong and Luo [25] for multi-block alternating direction method of multipliers algorithms such as our algorithm. This convergence relies on the decomposition of the cost function and constraints across each block having particular, restrictive properties, which are upheld for all inter-node flows…”
Section: Augmented Lagrangian Approach (supporting)
confidence: 74%
“…This guarantees linear convergence under weaker restrictions [25] and provides a theoretical justification for the use of only a single Chambolle iteration per update step.…”
Section: Augmented Lagrangian Approach (mentioning)
confidence: 84%
“…It appears that linear convergence is possible when at least one of the block functions is strongly convex. A recent study by Hong and Luo [48] made new progress in relaxing these assumptions and extending them to more than two blocks. Their separable model extends the M-model to include the S-model used in the former section.…”
Section: Algorithm 8 ADMM (mentioning)
confidence: 99%
“…Multi-block variants of ADMM may eliminate such a need. However, their convergence is either not guaranteed or requires additional assumptions [30], [31], [32]. Validating these assumptions for specific storage control problem instances may lead to a simpler algorithm with similar convergence properties.…”
Section: Scalability (mentioning)
confidence: 99%