2017
DOI: 10.1007/s10851-017-0724-6

Accelerated Alternating Descent Methods for Dykstra-Like Problems

Abstract: This paper extends recent results by the first author and T. Pock (ICG, TU Graz, Austria) on the acceleration of alternating minimization techniques for quadratic plus nonsmooth objectives depending on two variables. We discuss here the strongly convex situation, and how "fast" methods can be derived by adapting the overrelaxation strategy of Nesterov for projected gradient descent. We also investigate slightly more general alternating descent methods, where several descent steps in each variable are alternati…
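To make the accelerated alternating-minimization idea concrete, below is a minimal sketch assuming a generic Dykstra-like splitting E(x, y) = ½‖x + y − z‖² + g(x) + h(y), in which each partial minimization reduces to a proximal map. The function names, the FISTA-style extrapolation rule, and the toy data are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def accelerated_alt_min(z, prox_g, prox_h, n_iter=200):
    """Alternating exact minimization in x and y with a Nesterov-style
    overrelaxation of the y variable (illustrative sketch only)."""
    x = np.zeros_like(z)
    y = np.zeros_like(z)
    y_bar = y.copy()                      # overrelaxed copy of y
    t = 1.0                               # extrapolation parameter
    for _ in range(n_iter):
        # argmin_x 0.5*||x + y_bar - z||^2 + g(x)  ==  prox_g(z - y_bar)
        x = prox_g(z - y_bar)
        # argmin_y 0.5*||x + y - z||^2 + h(y)      ==  prox_h(z - x)
        y_new = prox_h(z - x)
        # Nesterov/FISTA-type overrelaxation of y
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y_bar = y_new + ((t - 1.0) / t_new) * (y_new - y)
        y, t = y_new, t_new
    return x, y

# Toy run: two box projections stand in for the proximal maps of g and h.
z = np.array([2.0, -3.0, 0.5])
x, y = accelerated_alt_min(z,
                           lambda v: np.clip(v, -1.0, 1.0),
                           lambda v: np.clip(v, 0.0, 2.0))
```

In the strongly convex setting discussed in the abstract, a constant overrelaxation parameter derived from the strong-convexity and smoothness moduli would typically replace the FISTA rule above.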

Cited by 14 publications (24 citation statements)
References 22 publications
“…This can severely impact the required computational load and the convergence time to a meaningful result. To mitigate this issue, in the following section we propose to use a few reasonable approximations to turn the minimization of (22) into an optimization problem separable in A_C and A_D. This will remove the interdependency between the different image pixels, and allow the extension of the ALS strategy to consider the optimization w.r.t.…”
Section: Optimizing With Respect To A At The i-th Iteration (mentioning)
confidence: 99%
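On the separability argument in the statement above: once the pixel-coupling terms are neglected, an ALS-style update of A_C and A_D decouples pixel by pixel. The sketch below is only an illustration under an assumed linear model Y ≈ M_C A_C + M_D A_D; the matrices M_C and M_D, the function name, and the data layout are hypothetical and not taken from the cited work.

```python
import numpy as np

def als_separable_update(Y, M_C, M_D, n_sweeps=10):
    """ALS-style sweeps over A_C and A_D when the cost separates across pixels.
    Y: (bands, pixels) observations; M_C, M_D: (bands, r) dictionaries (hypothetical)."""
    r, n_pix = M_C.shape[1], Y.shape[1]
    A_C = np.zeros((r, n_pix))
    A_D = np.zeros((r, n_pix))
    for _ in range(n_sweeps):
        for p in range(n_pix):  # pixels decouple: each column is solved independently
            A_C[:, p] = np.linalg.lstsq(M_C, Y[:, p] - M_D @ A_D[:, p], rcond=None)[0]
            A_D[:, p] = np.linalg.lstsq(M_D, Y[:, p] - M_C @ A_C[:, p], rcond=None)[0]
    return A_C, A_D
```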
“…Hypothesis A1 consists of assuming that the inner product ⟨RE_C, RE_D⟩ between the residuals/reconstruction errors RE_C and RE_D in the coarse and detail image scales is comparatively small when compared to the first two terms of the cost function (22). To illustrate the validity of this claim, we compare here the values of ⟨RE_C, RE_D⟩ with those of the first two terms of the cost function. The results are presented below in Table IV.…”
Section: Hypothesis (mentioning)
confidence: 99%
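A small synthetic check of the kind of comparison described above (purely illustrative; the residuals in the cited work come from their fusion model, not from random data):

```python
import numpy as np

rng = np.random.default_rng(0)
RE_C = rng.normal(size=10_000)   # stand-in coarse-scale residual
RE_D = rng.normal(size=10_000)   # stand-in detail-scale residual

cross = abs(RE_C @ RE_D)         # |<RE_C, RE_D>|, the neglected cross term
print(cross, RE_C @ RE_C, RE_D @ RE_D)
# For weakly correlated residuals the cross term is typically orders of magnitude
# smaller than the squared norms, which is what Hypothesis A1 assumes.
```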
“…We will show in this paper that such a generalization can be carried out in the framework introduced in [31]. Besides, the convergence speed issue has already been successfully tackled in convex optimization by using inertia [40,23,5,15,22] and for some nonconvex proximal schemes [32,39]. We will show in this paper that ASAP and its Bregman-distance-based generalization can also incorporate such a speed-up strategy, which leads to our proposed method.…”
Section: Biconvex Optimization (mentioning)
confidence: 88%
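For readers unfamiliar with the inertia device mentioned in the statement above, here is a generic inertial proximal-gradient step (a hypothetical sketch, not ASAP itself); applied alternately to each block of a biconvex problem, it gives accelerated schemes of the type cited:

```python
import numpy as np

def inertial_prox_grad_step(x, x_prev, grad_f, prox_g, step, beta):
    """One generic inertial forward-backward step (illustrative sketch):
    extrapolate with the previous iterate, then take a proximal-gradient step."""
    y = x + beta * (x - x_prev)                  # inertial extrapolation
    x_new = prox_g(y - step * grad_f(y), step)   # forward-backward update
    return x_new, x                              # new iterate and stored previous iterate

# Tiny usage: minimize 0.5*(x - 3)^2 + |x| (the prox of |.| is soft-thresholding).
grad_f = lambda v: v - 3.0
prox_l1 = lambda v, s: np.sign(v) * np.maximum(np.abs(v) - s, 0.0)
x, x_prev = 0.0, 0.0
for _ in range(100):
    x, x_prev = inertial_prox_grad_step(x, x_prev, grad_f, prox_l1, step=1.0, beta=0.3)
# x converges to 2.0, the minimizer of 0.5*(x - 3)^2 + |x|.
```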
“…If α_k^2 = 0, (15) and (16) are the same, so (16) holds whenever α_k^2 ≥ 0. Now, since b_X is L_X-convex, one can lower-bound the previous inequality using (4), which gives:…”
Section: Convergence Analysis (mentioning)
confidence: 99%
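The inequality labelled (4) is not visible in this excerpt. Assuming "L_X-convex" means that b_X is strongly convex with modulus L_X, the standard lower bound that such steps invoke would read as follows (a hedged reconstruction, not necessarily the cited paper's (4)):

$$
b_X(y) \;\ge\; b_X(x) + \langle \nabla b_X(x),\, y - x \rangle + \frac{L_X}{2}\,\|y - x\|^2
\qquad \text{for all } x,\, y .
$$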