2017
DOI: 10.1007/s10915-017-0612-7
Global Convergence of Unmodified 3-Block ADMM for a Class of Convex Minimization Problems

Abstract: The alternating direction method of multipliers (ADMM) has been successfully applied to solve structured convex optimization problems due to its superior practical performance. The convergence properties of the 2-block ADMM have been studied extensively in the literature. Specifically, it has been proven that the 2-block ADMM globally converges for any penalty parameter γ > 0. In this sense, the 2-block ADMM allows the parameter to be free, i.e., there is no need to restrict the value for the parameter when im…
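To make the 2-block setting concrete, here is a minimal sketch of 2-block ADMM applied to the lasso problem, min_x 0.5‖Ax − b‖² + λ‖z‖₁ subject to x − z = 0. The problem instance, penalty value `gamma`, and iteration count are illustrative assumptions, not taken from the paper; the point is only that the two-block scheme runs for an arbitrary penalty γ > 0, as the abstract states.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (elementwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_lasso(A, b, lam, gamma=1.0, iters=200):
    """2-block ADMM for min 0.5||Ax-b||^2 + lam*||z||_1 s.t. x = z.

    gamma is the penalty parameter; the 2-block scheme converges for
    any gamma > 0 (the 'free parameter' property discussed above).
    """
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    # Factor once: the x-update solves (A^T A + gamma I) x = A^T b + gamma (z - u)
    L = np.linalg.cholesky(A.T @ A + gamma * np.eye(n))
    Atb = A.T @ b
    for _ in range(iters):
        rhs = Atb + gamma * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))  # block 1: quadratic
        z = soft_threshold(x + u, lam / gamma)             # block 2: l1 prox
        u = u + x - z                                      # dual update
    return x, z

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((30, 10))
    x_true = np.zeros(10)
    x_true[:3] = [1.0, -2.0, 0.5]
    b = A @ x_true
    x, z = admm_lasso(A, b, lam=0.1, gamma=5.0)
    print(np.max(np.abs(x - z)))  # primal residual shrinks toward 0
```

The 3-block case studied by the paper adds a third primal block to this alternating sweep, which is exactly where the classical 2-block convergence theory stops applying.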

Cited by 43 publications (17 citation statements)
References 40 publications
“…For applications of the sharing problem, see [13,34,40,41]. Our proximal ADMM-g for solving (1.1) with A N = I is described in Algorithm 2.…”
Section: Proximal Gradient-based ADMM (Proximal ADMM-g), mentioning
Confidence: 99%
“…Proof : See Appendix A. Once θ t+1 is obtained, updates (17c) and (17d) can be performed in parallel and we have effectively transformed a 3-block ADMM in (11) to a 2-block ADMM, which converges under more general settings [20].…”
Section: Algorithmic Description and Implementation, mentioning
Confidence: 99%
“…As it can be seen in Figure 1 and 2, the BFGS-ADMM method demonstrates significant speed up in both datasets compared to other methods. Note that our first order variant also shows advantages compared to other first order methods, due to the fact that in our first order approximation, step size selection also takes into account the number of neighbors each agent has as in (20).…”
Section: Convergence Analysis, mentioning
Confidence: 99%
“…The global convergence of IALM when applied for optimizing a convex problem with at most two blocks has been theoretically proven [40], [45], [46]. However, there are three blocks in Algorithm 1, and to the best of our knowledge, the theoretical global convergence of the IALM solver with three or more blocks is still unsolved [45].…”
Section: Discriminative Low-Rank Representation, 1) Problem Formulation, mentioning
Confidence: 99%