2020
DOI: 10.1287/moor.2019.1008

The Proximal Alternating Direction Method of Multipliers in the Nonconvex Setting: Convergence Analysis and Rates

Abstract: We propose two numerical algorithms in the fully nonconvex setting for the minimization of the sum of a smooth function and the composition of a nonsmooth function with a linear operator. The iterative schemes are formulated in the spirit of the proximal alternating direction method of multipliers and its linearized variant, respectively. The proximal terms are introduced via variable metrics, a fact that allows us to derive new proximal splitting algorithms for nonconvex structured optimization problems, as p…
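From this description, the problem class presumably has the standard composite form (the notation $F$, $H$, $A$ follows the citing excerpts quoted below; the statement here is our reconstruction, not a verbatim formula from the paper):

$$\min_{x} \; H(x) + F(Ax),$$

where $H$ is smooth (Fréchet differentiable with Lipschitz continuous gradient), $F$ is nonsmooth but with an accessible proximal map, $A$ is a linear operator, and neither function is assumed convex.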

Cited by 54 publications (11 citation statements)
References 43 publications
“…An essential tool for deriving the rates of convergence is the following lemma, the proof of which can be found in Reference 50, Lemma 15.…”
Section: An Optimization Model With Convergence Guarantees (mentioning)
confidence: 99%
“…Splitting algorithms for solving problems of the form (1.2) have been considered in [19], under the assumption that H is twice continuously differentiable with bounded Hessian, in [25], under the assumption that one of the summands is convex and continuous on its effective domain, and in [13], as a particular case of a general nonconvex proximal ADMM algorithm. We would like to mention in this context also [10] for the case when A is nonlinear.…”
Section: Problem Formulation and Motivation (mentioning)
confidence: 99%
“…Remark 1. (i) In case $G(y) = 0$ and $H(x, y) = H(x)$ for any $(x, y) \in \mathbb{R}^m \times \mathbb{R}^q$, where $H : \mathbb{R}^m \to \mathbb{R}$ is a Fréchet differentiable function with Lipschitz continuous gradient, Algorithm 1 gives rise to an iterative scheme which has been proposed in [13] for solving the optimization problem (1.2). This reads for any $n \geq 0$:
$$z^{n+1} \in \operatorname{prox}_{\beta^{-1} F}\big(Ax^{n} + \beta^{-1} u^{n}\big),$$
$$x^{n+1} := x^{n} - \tau^{-1}\big(\nabla H(x^{n}) + A^{T} u^{n} + \beta A^{T}(Ax^{n} - z^{n+1})\big),$$
$$u^{n+1} := u^{n} + \sigma \beta \big(Ax^{n+1} - z^{n+1}\big).…”
Section: The Algorithm (mentioning)
confidence: 99%
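To make the quoted iterative scheme concrete, here is a minimal NumPy sketch. The specific choice $F = \|\cdot\|_1$ (whose proximal map is soft-thresholding), the quadratic $H$, and all parameter values are illustrative assumptions on our part, not taken from the cited papers; in practice $\tau$, $\beta$, and $\sigma$ must satisfy the step-size conditions established in the convergence analysis.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal map of t * ||.||_1 (assumed choice of F for this sketch).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def linearized_proximal_admm(A, grad_H, x0, beta, tau, sigma, n_iter=500):
    """Sketch of the scheme quoted in Remark 1:
        z^{n+1} in prox_{F/beta}(A x^n + u^n / beta)
        x^{n+1}  = x^n - (1/tau)(grad_H(x^n) + A^T u^n + beta A^T (A x^n - z^{n+1}))
        u^{n+1}  = u^n + sigma * beta * (A x^{n+1} - z^{n+1})
    F = ||.||_1 is an illustrative assumption, not prescribed by the paper."""
    x = x0.copy()
    u = np.zeros(A.shape[0])
    for _ in range(n_iter):
        z = soft_threshold(A @ x + u / beta, 1.0 / beta)
        x = x - (grad_H(x) + A.T @ u + beta * A.T @ (A @ x - z)) / tau
        u = u + sigma * beta * (A @ x - z)
    return x, z, u

# Usage: minimize H(x) + ||A x||_1 with the smooth choice H(x) = 0.5 * ||x - c||^2,
# so grad_H(x) = x - c. Problem data and step sizes are made up for illustration.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
c = rng.standard_normal(10)
x_sol, z_sol, u_sol = linearized_proximal_admm(
    A, lambda x: x - c, np.zeros(10), beta=1.0, tau=100.0, sigma=1.0
)
```

The sketch keeps the dual step size $\sigma$ separate from the penalty parameter $\beta$, mirroring the $\sigma\beta$ factor in the quoted dual update.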