2011
DOI: 10.1007/s10444-011-9254-8

A splitting algorithm for dual monotone inclusions involving cocoercive operators

Cited by 435 publications (520 citation statements)
References 20 publications
“…Also, since the proofs in Section 4 are derived in the general framework of monotone and nonexpansive operators, it would be straightforward to adapt the study to solve more general monotone inclusions, where, in (6), the subgradients would be replaced by arbitrary maximally monotone operators, the gradients by cocoercive operators and the proximity operators by resolvents. This more general framework has been considered in the later work [24]. The intention of the author was to keep the study accessible to the practitioner interested in the optimization problem (1).…”
Section: Problem Formulation (mentioning)
confidence: 99%
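The adaptation sketched in this excerpt amounts to a forward-backward iteration for a monotone inclusion 0 ∈ Ax + Bx, in which the resolvent of a maximally monotone operator A plays the role of the proximity operator and B is cocoercive. The Python sketch below is only an illustration under assumed choices (A = ∂(λ‖·‖₁), B: x ↦ x − b, and all function names are mine), not code from the cited works:

import numpy as np

def soft_threshold(v, t):
    # resolvent J_{gamma*A} of A = subdifferential of lam*||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward(b, lam=0.5, gamma=1.0, n_iter=100):
    # iterates x <- J_{gamma*A}(x - gamma*B(x)) for 0 in Ax + Bx,
    # with B(x) = x - b, which is 1-cocoercive; gamma must lie in (0, 2)
    x = np.zeros_like(b)
    for _ in range(n_iter):
        x = soft_threshold(x - gamma * (x - b), gamma * lam)
    return x

b = np.array([3.0, -0.2, 1.5, 0.1])
print(forward_backward(b))     # fixed point: soft-thresholding of b at level lam
print(soft_threshold(b, 0.5))  # closed-form solution, for comparison

With these assumed choices the fixed point is exactly the soft-thresholding of b, which gives a simple correctness check.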
“…In return, this quest of practitioners for efficient minimization methods has caused a renewed interest among mathematicians around splitting methods in monotone and nonexpansive operator theory, as can be judged from the numerous recent contributions, e.g. [13][14][15][16][17][18][19][20][21][22][23][24]. The most classical operator splitting methods to minimize the sum of two convex functions are the forward-backward method, proposed in [2] and further developed in [3,4,7,[25][26][27][28], and the Douglas-Rachford method [3,6,7,22].…”
Section: Introduction (mentioning)
confidence: 99%
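For comparison with the forward-backward sketch above, the Douglas-Rachford method mentioned in this excerpt uses only the two resolvents (proximity operators in the convex case) and requires no cocoercivity. Below is a minimal sketch on the toy problem min_x λ‖x‖₁ + ½‖x − b‖²; the problem instance, step size, and names are assumptions made for illustration:

import numpy as np

def prox_l1(v, t):
    # proximity operator of t*||.||_1 (soft-thresholding)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_quad(v, b, gamma):
    # proximity operator of gamma*g with g(x) = 0.5*||x - b||^2
    return (v + gamma * b) / (1.0 + gamma)

def douglas_rachford(b, lam=0.5, gamma=1.0, n_iter=200):
    # z <- z + prox_{gamma*f}(2*x - z) - x  with  x = prox_{gamma*g}(z);
    # x converges to a minimizer of f + g for any gamma > 0
    z = np.zeros_like(b)
    for _ in range(n_iter):
        x = prox_quad(z, b, gamma)
        z = z + prox_l1(2.0 * x - z, gamma * lam) - x
    return prox_quad(z, b, gamma)

b = np.array([3.0, -0.2, 1.5, 0.1])
print(douglas_rachford(b))   # approaches soft-thresholding of b at level 0.5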
“…We compare the proposed algorithm with state-of-the-art optimization methods. The first one is the proximal primal-dual method first proposed in [13] and further extended in [24,40] (here designated by CPCV). The second one is another proximal primal-dual approach, namely the Monotone+Lipschitz Forward-Backward-Forward (M+LFBF) algorithm which was proposed in [23] (see also [8] for extensions).…”
Section: Simulation Results (mentioning)
confidence: 99%
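As a rough illustration of what a primal-dual method of this family looks like (not the authors' implementation, and not necessarily the exact CPCV variant used in the comparison), the sketch below runs an iteration of the type commonly attributed to Condat and Vũ on an assumed problem min_x ½‖x − b‖² + λ‖x‖₁ + μ‖Dx‖₁: the smooth term enters through its gradient, the other two through proximity operators. The operator D, the step sizes, and all names are illustrative choices:

import numpy as np

def prox_l1(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def condat_vu(b, D, lam=0.1, mu=0.5, n_iter=500):
    # primal-dual iteration of Condat-Vu type for
    #   min_x 0.5*||x - b||^2 + lam*||x||_1 + mu*||D x||_1;
    # gradient of the smooth term is x - b (1-Lipschitz); prox of sigma*h*
    # for h = mu*||.||_1 is clipping to [-mu, mu]
    L2 = np.linalg.norm(D, 2) ** 2       # squared spectral norm of D
    sigma = 1.0
    tau = 0.9 / (0.5 + sigma * L2)       # so that 1/tau - sigma*||D||^2 > 1/2
    x = np.zeros_like(b)
    y = np.zeros(D.shape[0])
    for _ in range(n_iter):
        x_new = prox_l1(x - tau * ((x - b) + D.T @ y), tau * lam)
        y = np.clip(y + sigma * (D @ (2.0 * x_new - x)), -mu, mu)
        x = x_new
    return x

n = 50
b = np.concatenate([np.zeros(25), 2.0 * np.ones(25)])
D = np.eye(n - 1, n, k=1) - np.eye(n - 1, n)   # first-difference operator
print(condat_vu(b, D)[20:30])                  # roughly piecewise-constant output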
“…Before we come to the generalization of CP to Banach spaces, we would also like to refer to other generalizations of this method: in addition to the above-mentioned preconditioned versions, there exist extended variants for solving monotone inclusion problems ([14,78]) and extensions to the case of nonlinear operators T ([80]). Recently, Lorenz and Pock ([55]) proposed a quite general forward-backward algorithm for monotone inclusion problems with CP as a special case.…”
Section: Chambolle-Pock's First-Order Primal-Dual Algorithm (mentioning)
confidence: 99%
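For reference, the Chambolle-Pock iteration discussed in this excerpt alternates a proximal ascent step in the dual with a proximal descent step in the primal, followed by an extrapolation; in contrast with the previous sketch, a quadratic data term is handled through its proximity operator rather than a gradient step. A minimal sketch on an assumed TV-denoising-type problem min_x ½‖x − b‖² + μ‖Dx‖₁ follows; the problem instance, step sizes, and names are illustrative, not from the cited works:

import numpy as np

def chambolle_pock(b, D, mu=0.5, n_iter=500):
    # y-step: prox of sigma*h* for h = mu*||.||_1 (clipping to [-mu, mu]);
    # x-step: prox of tau*g for g(x) = 0.5*||x - b||^2 (closed form);
    # then extrapolation with theta = 1
    Lnorm = np.linalg.norm(D, 2)     # spectral norm of D
    tau = sigma = 0.99 / Lnorm       # step sizes with tau*sigma*||D||^2 < 1
    x = np.zeros_like(b)
    x_bar = x.copy()
    y = np.zeros(D.shape[0])
    for _ in range(n_iter):
        y = np.clip(y + sigma * (D @ x_bar), -mu, mu)
        x_new = (x - tau * (D.T @ y) + tau * b) / (1.0 + tau)
        x_bar = 2.0 * x_new - x
        x = x_new
    return x

n = 50
b = np.concatenate([np.zeros(25), 2.0 * np.ones(25)])
D = np.eye(n - 1, n, k=1) - np.eye(n - 1, n)   # first-difference operator
print(chambolle_pock(b, D)[20:30])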
“…Now, in particular, if the corresponding Tikhonov-type functional that has to be minimized is nonsmooth, the proposed generalization CP-BS is an attractive method for this purpose. We also refer to [14,55,62,78,80] for other generalizations and extensions of CP.…”
Section: Introduction (mentioning)
confidence: 99%