2016
DOI: 10.1007/s10915-016-0318-2
Parallel Multi-Block ADMM with o(1/k) Convergence

Abstract: This paper introduces a parallel and distributed extension of the alternating direction method of multipliers (ADMM) for solving convex problems. The algorithm decomposes the original problem into N smaller subproblems and solves them in parallel at each iteration. This Jacobian-type algorithm is well suited for distributed computing and is particularly attractive for solving certain large-scale problems. The paper presents several novel results. Firstly, it shows that extending ADMM straightforwardly …
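The Jacobi-type scheme described in the abstract can be sketched on a toy separable quadratic program. Everything below is an assumed illustration, not the paper's exact algorithm: the objective f_i(x_i) = ½‖x_i − c_i‖², the penalty ρ = 1, the damped dual step γ = 0.5, and the proximal weights τ_i = ρ(N−1)‖A_i‖² are conservative choices in the spirit of the paper's proximal Jacobian variant, whose precise conditions on the proximal matrices and step size are given in the paper itself.

```python
import numpy as np

def prox_jacobi_admm(A, c, b, rho=1.0, gamma=0.5, iters=20000):
    """Sketch of a proximal Jacobi ADMM for
        min sum_i 0.5 * ||x_i - c_i||^2   s.t.   sum_i A_i x_i = b.
    All parameter choices here are illustrative assumptions."""
    N, n, m = len(A), c[0].size, b.size
    # Conservative proximal weights; the paper states the actual condition.
    tau = [rho * (N - 1) * np.linalg.norm(Ai, 2) ** 2 for Ai in A]
    x = [np.zeros(n) for _ in range(N)]
    lam = np.zeros(m)
    for _ in range(iters):
        Ax = sum(Ai @ xi for Ai, xi in zip(A, x))
        # Jacobi sweep: every block uses the same x^k, so the N subproblem
        # solves are independent and could run on N machines in parallel.
        x = [np.linalg.solve(
                 (1 + tau[i]) * np.eye(n) + rho * A[i].T @ A[i],
                 c[i] - A[i].T @ lam
                 - rho * A[i].T @ (Ax - A[i] @ x[i] - b)   # sum_{j != i} A_j x_j^k - b
                 + tau[i] * x[i])
             for i in range(N)]
        Ax = sum(Ai @ xi for Ai, xi in zip(A, x))
        lam = lam + gamma * rho * (Ax - b)                 # damped dual update
    return x, lam
```

The per-block linear solves inside one sweep share only the previous iterate x^k, which is exactly what makes the Jacobian scheme distribute; a Gauss-Seidel sweep, by contrast, chains the blocks sequentially.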

Cited by 389 publications (368 citation statements)
References 42 publications
“…For the multi-block separable convex problems where K ≥ 3, it is known that the original ADMM can diverge for certain pathological problems [26]. Therefore, most research effort in this direction has been focused on either analyzing problems with additional conditions, or showing convergence for variants of the ADMM; see for example [26][27][28][29][30][31][32][33][34]. It is worth mentioning that when the objective function is not separable across the variables (e.g., the coupling function ℓ(·) appears in the objective), the convergence of the ADMM is still open, even in the case where K = 2 and f (·) is convex.…”
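The divergence mentioned in the excerpt above can be reproduced numerically with the well-known 3×3 counterexample of Chen, He, Ye, and Yuan: minimize 0 subject to A₁x₁ + A₂x₂ + A₃x₃ = 0 with columns A₁ = (1,1,1), A₂ = (1,1,2), A₃ = (1,2,2), whose unique solution is x = 0. The penalty β = 1 and the starting point below are arbitrary illustrative choices; the divergence holds for any β > 0.

```python
import numpy as np

# Columns of the counterexample matrix; Ax = 0 has the unique solution x = 0.
a1, a2, a3 = (np.array(v, dtype=float) for v in ([1, 1, 1], [1, 1, 2], [1, 2, 2]))
beta = 1.0                       # any beta > 0 exhibits the divergence

x1 = x2 = x3 = 1.0               # arbitrary nonzero start
lam = np.zeros(3)
res = []
for _ in range(500):
    # Direct 3-block (Gauss-Seidel) ADMM sweep on the augmented Lagrangian
    # (beta/2)||a1 x1 + a2 x2 + a3 x3||^2 - lam . (a1 x1 + a2 x2 + a3 x3):
    # each scalar block minimizes with the other two blocks held fixed.
    x1 = a1 @ (lam / beta - a2 * x2 - a3 * x3) / (a1 @ a1)
    x2 = a2 @ (lam / beta - a1 * x1 - a3 * x3) / (a2 @ a2)
    x3 = a3 @ (lam / beta - a1 * x1 - a2 * x2) / (a3 @ a3)
    r = a1 * x1 + a2 * x2 + a3 * x3          # constraint residual, 0 at the solution
    lam = lam - beta * r
    res.append(np.linalg.norm(r))
```

The constraint residual never settles to zero, in contrast with the two-block case, which is exactly the pathology that motivates the modified multi-block variants surveyed in the excerpt.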
“…We compare it with the splitting method for separable convex programming (denoted by HTY) in [30], the full Jacobian decomposition of the augmented Lagrangian method (denoted by FJDALM) in [13], and the fully parallel ADMM (denoted by PADMM) in [14]. Through these numerical experiments, we want to illustrate that (1) suitable proximal terms can enhance PFPSM's numerical efficiency in practice, (2) larger values of the Glowinski relaxation factor can often accelerate PFPSM's convergence speed, and (3) compared with the other three ADMM-type methods the dynamically updated step size defined in (27) can accelerate the convergence of PFPSM.…”
Section: Numerical Results
“…Compared to the iteration method in [13], the new iteration method PFPSM extends the scope of the Glowinski relaxation factor from 1 to the interval (√3/2, 2/√3), and larger values of this factor are often preferred in practice; see Figure 2. Furthermore, compared to the iteration method in [14], the new iteration method PFPSM removes the restriction imposed on the three regularized matrices; see condition (2.10) in [14].…”
Section: Remark