2016
DOI: 10.1007/978-3-319-41589-5_4
Convergence Rate Analysis of Several Splitting Schemes

Abstract: Splitting schemes are a class of powerful algorithms that solve complicated monotone inclusions and convex optimization problems that are built from many simpler pieces. They give rise to algorithms in which the simple pieces of the decomposition are processed individually. This leads to easily implementable and highly parallelizable algorithms, which often obtain nearly state-of-the-art performance. In the first part of this paper, we analyze the convergence rates of several general splitting algorithms and pr…
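To make the splitting idea in the abstract concrete, here is a minimal sketch of Douglas–Rachford splitting, one of the scheme families the paper analyzes. The specific objective, step size, and iteration count below are illustrative assumptions, not taken from the paper; the point is only that each "simple piece" is processed individually through its own proximal map.

```python
# Minimal Douglas-Rachford splitting sketch: minimize f(x) + g(x),
# touching each function only through its proximal map.
# Illustrative toy problem (an assumption, not from the paper):
#   f(x) = 0.5 * (x - 3)^2   -> prox_f(v) = (v + 3) / 2   (unit step size)
#   g = indicator of [0, 1]  -> prox_g(v) = clip v to [0, 1]
# The constrained minimizer is x* = 1.

def prox_f(v):
    return (v + 3.0) / 2.0

def prox_g(v):
    return min(max(v, 0.0), 1.0)

def douglas_rachford(iters=50):
    z = 0.0
    for _ in range(iters):
        x = prox_f(z)            # process the smooth piece alone
        y = prox_g(2.0 * x - z)  # process the constraint alone
        z = z + y - x            # combine via the reflected update
    return x

x = douglas_rachford()  # approaches x* = 1
```

Each sub-step uses only one piece of the objective, which is what makes schemes of this kind easy to implement and to parallelize across many pieces.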

Cited by 163 publications (244 citation statements)
References 58 publications
“…In this case, ADMM is known to converge under very mild conditions; see [7] and [8]. Under the same conditions, several recent works [20][21][22] have shown that the ADMM converges with the sublinear rate of O( …”
mentioning
confidence: 96%
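To make the quoted sublinear-rate discussion concrete, here is a minimal ADMM sketch on an assumed toy consensus problem; the objective, penalty parameter, and iteration count are illustrative assumptions, not drawn from the cited works.

```python
# Minimal ADMM sketch for: minimize f(x) + g(z) subject to x = z,
# with illustrative choices (assumptions, not from the cited works):
#   f(x) = 0.5 * (x - 3)^2,  g = indicator of [0, 1],  penalty rho = 1.
# The solution is x* = z* = 1.

def admm(iters=100, rho=1.0):
    z = u = 0.0
    for _ in range(iters):
        # x-update: argmin_x f(x) + (rho/2)(x - z + u)^2, in closed form
        x = (3.0 + rho * (z - u)) / (1.0 + rho)
        # z-update: projection onto [0, 1]
        z = min(max(x + u, 0.0), 1.0)
        # dual update on the consensus constraint x = z
        u = u + x - z
    return x, z, u

x, z, u = admm()  # x and z approach 1, the constrained minimizer
```

The mild conditions under which such iterations converge, and the precise sublinear rates, are exactly what the quoted references analyze.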
“…Moreover, it was shown in [26] that the sequence ‖x_t − x*‖ is nonincreasing and that ‖Sx_t − x_t‖² = o(1/t), assuming only that the sequence τ_t = ρ_t(1 − ρ_t) is bounded away from 0. Conveniently, compositions of averaged operators are easily seen to be averaged.…”
Section: Forward and Backward Steps
mentioning
confidence: 99%
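The quoted fixed-point-residual behavior can be illustrated numerically. The operator below is an assumed toy nonexpansive map, not taken from [26]: a rotation of the complex plane, whose averaged iteration shrinks the distance to the fixed point monotonically while the residual vanishes.

```python
import cmath

# Numerical illustration of the quoted result: for an averaged operator
# T = (1 - rho)*I + rho*S with S nonexpansive, the iterates x_{t+1} = T x_t
# keep the distance to the fixed point nonincreasing while the fixed-point
# residual |S x_t - x_t| vanishes.

def S(x):
    # Assumed toy nonexpansive map: rotate the complex plane by 0.5 rad
    # (unique fixed point 0).
    return x * cmath.exp(0.5j)

rho = 0.5  # T = (1 - rho)*I + rho*S is rho-averaged; tau = rho*(1 - rho) > 0

x = 1.0 + 0.0j
dists, residuals = [], []
for t in range(1000):
    dists.append(abs(x))             # distance to the fixed point x* = 0
    residuals.append(abs(S(x) - x))  # fixed-point residual
    x = (1 - rho) * x + rho * S(x)   # averaged (relaxed) iteration
```

Because τ_t here is a constant bounded away from 0, the recorded distances never increase and the residuals decay, matching the qualitative behavior the quote describes.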
“…On the other hand, most users of (DRA) have fixed γ = 1 to focus on the estimation of the scaling parameter λ as seen below. More insight on relaxed versions of (FB), (DRA) and (PRA) and their theoretical rates of convergence may be found in the recent study by Davis and Yin and companion papers [26].…”
Section: Algorithmic Enhancements
mentioning
confidence: 99%
“…However, it was shown in [6] that the scheme (1.6) is not necessarily convergent. The convergence rate of ADMM and its extensions is analysed in [9,28,30,33,32].…”
Section: LI and X M Yuan
mentioning
confidence: 99%