2017
DOI: 10.1007/s11464-017-0631-6
Convergence of ADMM for multi-block nonconvex separable optimization models


Cited by 41 publications (24 citation statements)
References 26 publications
“…where the second equality follows from the definition of in (27), and the second inequality comes from (29) and (33). Then, assertion (52) is thus proved.…”
Section: Lemma 10 (mentioning)
Confidence: 73%
“…We compare it with the splitting method for separable convex programming (denoted by HTY) in [30], the full Jacobian decomposition of the augmented Lagrangian method (denoted by FJDALM) in [13], and the fully parallel ADMM (denoted by PADMM) in [14]. Through these numerical experiments, we want to illustrate that (1) suitable proximal terms can enhance PFPSM's numerical efficiency in practice, (2) larger values of the Glowinski relaxation factor can often accelerate PFPSM's convergence speed, and (3) compared with the other three ADMM-type methods, the dynamically updated step size defined in (27) can accelerate the convergence of PFPSM. All codes were written in Matlab R2010a and run on a ThinkPad notebook with a Pentium(R) Dual-Core CPU T4400 @ 2.2 GHz and 4 GB of memory.…”
Section: Numerical Results (mentioning)
Confidence: 99%
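The citation statements above concern multi-block ADMM-type splitting methods. As a rough illustration of the alternating scheme such convergence analyses study, here is a minimal sketch of a three-block Gauss-Seidel ADMM on a toy strongly convex problem; the quadratic objective, the single linear constraint, and all parameter values are illustrative assumptions, not taken from the cited paper or its comparison methods (HTY, FJDALM, PADMM):

```python
import numpy as np

def three_block_admm(c, b, rho=1.0, iters=500):
    """Illustrative 3-block ADMM for: min sum_i (x_i - c_i)^2  s.t.  x_1 + x_2 + x_3 = b.

    Each block subproblem is a scalar quadratic, so every update is closed-form.
    This toy problem and the parameter choices are assumptions for illustration.
    """
    x = np.zeros(3)
    y = 0.0  # scalar Lagrange multiplier for the single linear constraint
    for _ in range(iters):
        # Gauss-Seidel sweep: each block uses the latest values of the others
        for i in range(3):
            rest = x.sum() - x[i]
            # argmin_{x_i} (x_i - c_i)^2 + y * (x_i + rest - b)
            #              + (rho / 2) * (x_i + rest - b)^2
            x[i] = (2.0 * c[i] - y - rho * (rest - b)) / (2.0 + rho)
        # dual ascent step on the constraint residual
        y += rho * (x.sum() - b)
    return x, y

x, y = three_block_admm(c=np.array([1.0, 2.0, 3.0]), b=3.0)
# KKT solution of this instance: x_i = c_i + (b - sum(c)) / 3, i.e. [0, 1, 2], with y = 2
```

Note that directly extending two-block ADMM to three or more blocks is not guaranteed to converge in general, which is exactly the gap that multi-block convergence analyses address; this strongly convex toy instance does converge.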