2022
DOI: 10.48550/arxiv.2205.01951
Preprint

Proximal ADMM for Nonconvex and Nonsmooth Optimization

Abstract: By enabling nodes or agents to solve small subproblems to achieve coordination, distributed algorithms are favored by many networked systems for efficient and scalable computation. While substantial distributed algorithms are available for convex problems, results for the broader nonconvex counterparts remain scarce. This paper develops a distributed algorithm for a class of nonconvex and nonsmooth problems featured by i) a nonconvex objective formed by both separate and composite obj…

Cited by 1 publication (6 citation statements)
References 31 publications
“…When problem (P2) is nonconvex (i.e., the f_i are nonconvex), the other multi-block ADMM variants are not applicable in general. To address a nonconvex multi-block (P2), [110] proposed a proximal ADMM with a global convergence guarantee. Proximal ADMM takes the iterative scheme Proximal ADMM: Primal update:…”
Section: Inexact Dual Consensus ADMM
confidence: 99%
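The quoted iterative scheme is cut off in the excerpt. As a hedged sketch only (notation assumed here, not taken from [110]), a proximal ADMM applied to the consensus form of (P2), min Σᵢ fᵢ(xᵢ) subject to xᵢ = z, would iterate:

```latex
\begin{aligned}
x_i^{k+1} &= \arg\min_{x_i}\; f_i(x_i) + \langle y_i^k,\, x_i - z^k\rangle
             + \tfrac{\rho}{2}\|x_i - z^k\|^2 + \tfrac{\beta}{2}\|x_i - x_i^k\|^2,\\
z^{k+1}   &= \tfrac{1}{N}\textstyle\sum_i \bigl(x_i^{k+1} + \tfrac{1}{\rho}\, y_i^k\bigr),\\
y_i^{k+1} &= (1-\tau)\, y_i^k + \rho\,\bigl(x_i^{k+1} - z^{k+1}\bigr).
\end{aligned}
```

The β-proximal term in the primal step is what distinguishes proximal ADMM from vanilla ADMM, and the (1 − τ) factor in the dual step is the discounted dual update that the excerpts below attribute to [110].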
“…The major difference lies in the dual update, where a discount factor (1 − τ) is imposed. As argued in [110], the discounted dual update scheme is critical to ensure the convergence of the method for solving nonconvex (P2). Specifically, it ensures the boundedness of the Lagrangian multipliers and makes it possible to identify a sufficiently decreasing and lower-bounded Lyapunov function, which is the key step in establishing convergence of ADMM and its variants in the nonconvex setting.…”
Section: Inexact Dual Consensus ADMM
confidence: 99%
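The discounted dual update described above can be illustrated on a toy consensus problem. The sketch below is an assumption-laden illustration, not the paper's algorithm: the local objectives fᵢ are hypothetical quadratics (chosen so the proximal primal step has a closed form; the cited paper targets nonconvex fᵢ, where this step would be a local subproblem), and all parameter values are made up.

```python
import numpy as np

# Toy consensus problem: min sum_i f_i(x_i)  s.t.  x_i = z for all i.
# Hypothetical quadratics f_i(x) = a_i/2 * (x - b_i)^2 give a closed-form
# proximal primal update; the paper itself addresses nonconvex f_i.
a = np.array([1.0, 2.0, 4.0])
b = np.array([0.0, 3.0, 1.0])

rho, beta, tau = 1.0, 0.1, 0.01  # penalty, proximal weight, dual discount
x = np.zeros(3)                  # local primal variables x_i
y = np.zeros(3)                  # dual variables (multipliers)
z = 0.0                          # consensus variable

for _ in range(2000):
    # Proximal primal step: argmin of f_i(x) + y_i*x + rho/2 (x - z)^2
    # + beta/2 (x - x_prev)^2; closed form for the quadratic f_i above.
    x = (a * b - y + rho * z + beta * x) / (a + rho + beta)
    # Consensus step: minimizer of sum_i [ -y_i*z + rho/2 (x_i - z)^2 ].
    z = np.mean(x + y / rho)
    # Discounted dual step: the (1 - tau) factor keeps the multipliers
    # bounded, which is what the Lyapunov-function argument relies on.
    y = (1.0 - tau) * y + rho * (x - z)

residual = np.max(np.abs(x - z))
```

With τ = 0 this reduces to a standard ADMM dual ascent; with τ > 0 the fixed point satisfies τ·yᵢ = ρ(xᵢ − z), so the multipliers stay bounded at the cost of a small O(τ) consensus bias, which is visible in `residual` above.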