2012
DOI: 10.1007/s10589-012-9525-4

A cyclic projected gradient method

Abstract: In recent years, convex optimization methods were successfully applied to various image processing tasks, and a large number of first-order methods were designed to minimize the corresponding functionals. Interestingly, it was shown recently in [24] that the simple idea of so-called "superstep cycles" leads to very efficient schemes for time-dependent (parabolic) image enhancement problems as well as for steady-state (elliptic) image compression tasks. The "superstep cycles" approach is similar to the nonstati…
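The super-step idea referenced in the abstract amounts to cycling through a short, precomputed list of step sizes, some of them far larger than the usual stability limit, inside an otherwise standard projected gradient iteration. The sketch below is a minimal illustration of that pattern, not the paper's algorithm: the quadratic objective, the box constraint, the cycle length, and the Chebyshev-node step sizes built from assumed spectral bounds are all choices made for the example.

```python
import numpy as np

# Minimal sketch of a projected gradient method with a cyclic
# ("super-step") step-size schedule.  Everything here is illustrative:
# the quadratic objective f(x) = 0.5*x'Ax - b'x, the box projection,
# and the Chebyshev-node step-size cycle are assumptions, not the
# paper's actual scheme.

def chebyshev_cycle(lmin, lmax, n):
    """Cyclic step sizes 1/lambda_k, where lambda_k are the n Chebyshev
    nodes on [lmin, lmax] (the classical cyclic Richardson choice)."""
    k = np.arange(n)
    nodes = 0.5 * (lmax + lmin) + 0.5 * (lmax - lmin) * np.cos(np.pi * (2 * k + 1) / (2 * n))
    # The largest step, roughly 1/lmin, exceeds the usual 2/lmax limit
    # whenever lmax > 2*lmin -- these are the "super steps".
    return 1.0 / nodes

def project_box(x, lo=0.0, hi=1.0):
    """Projection onto the box [lo, hi]^d (stand-in for the feasible set)."""
    return np.clip(x, lo, hi)

def cyclic_projected_gradient(A, b, x0, n_cycle=8, n_cycles=50):
    lmin, lmax = np.linalg.eigvalsh(A)[[0, -1]]   # spectral bounds of the Hessian
    taus = chebyshev_cycle(lmin, lmax, n_cycle)
    x = x0.copy()
    for _ in range(n_cycles):
        for tau in taus:                          # one super-step cycle
            grad = A @ x - b
            x = project_box(x - tau * grad)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((20, 20))
    A = M.T @ M + np.eye(20)                      # SPD Hessian
    b = rng.standard_normal(20)
    x = cyclic_projected_gradient(A, b, np.zeros(20))
    # fixed-point residual of the projected gradient map
    print("residual:", np.linalg.norm(project_box(x - (A @ x - b)) - x))
```

On an unconstrained quadratic, one full cycle of these Chebyshev-node steps reproduces a Chebyshev acceleration of the Richardson iteration; how such aggressive steps behave once a projection is interleaved is the kind of question the cited work addresses.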

Cited by 7 publications (5 citation statements)
References 37 publications
“…Line search strategies can be incorporated [83,87,120]. Finally, we mention Barzilai-Borwein step size rules [11] based on a quasi-Newton approach and relatives, see [74] for an overview, and the cyclic proximal gradient algorithm related to the cyclic Richardson algorithm [158].…”
Section: Accelerated Algorithms
mentioning confidence: 99%
“…where Δr_{k−1} = r_k − r_{k−1} and Δx_{k−1} = x_k − x_{k−1}, with initial guesses r_0 = 0 and x_0 = 0. Nowadays, improvements of the original BBM have been developed in many works in the literature [24][25][26][27][28][29][30] to treat different ill-posed and inverse problems.…”
Section: A Generalized SDM and Its Optimization
mentioning confidence: 99%
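For context on the Barzilai-Borwein rule mentioned in the quoted passages, the step length is computed from the most recent increments of the iterate and of the gradient (or residual). Below is a minimal sketch of the standard BB1 variant, τ_k = (Δxᵀ Δx)/(Δxᵀ Δg), applied to a least-squares problem; the problem choice, the denominator safeguard, and all names are assumptions made for illustration and are not taken from the cited works.

```python
import numpy as np

# Illustrative Barzilai-Borwein (BB1) gradient iteration for
# min_x 0.5*||Ax - b||^2.  The variant and the safeguards are
# assumptions for the example, not a specific cited method.

def bb_gradient(A, b, x0, iters=100, tau0=1e-2):
    x = x0.copy()
    g = A.T @ (A @ x - b)           # gradient of 0.5*||Ax - b||^2
    tau = tau0                      # first step: fixed initial guess
    for _ in range(iters):
        x_new = x - tau * g
        g_new = A.T @ (A @ x_new - b)
        dx, dg = x_new - x, g_new - g
        denom = dx @ dg
        # BB1 step from the latest increments; fall back if denom ~ 0
        tau = (dx @ dx) / denom if denom > 1e-16 else tau0
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((50, 30))
    b = rng.standard_normal(50)
    x = bb_gradient(A, b, np.zeros(30))
    print("gradient norm:", np.linalg.norm(A.T @ (A @ x - b)))
```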
“…Upon comparing (53) and (30) we can derive (37) with the step length being positive. Moreover, as a direct result of (30) and (37) we have r ⋅ u > 0.…”
Section: Journal of Applied Mathematics
mentioning confidence: 99%
“…In [3], numerical evidence has also been provided indicating a remarkable gain in the convergence rate over the classical Barzilai-Borwein (BB) step-length rule [4]. Since promising image reconstruction algorithms have been designed in recent years by exploiting BB-based rules within gradient methods [5,6,7,8,9,10,11], it is worthwhile to investigate whether useful acceleration can be achieved with the new step-length selection idea. In particular, we focus on the algorithm for image deconvolution in microscopy provided by the Scaled Gradient Projection (SGP) method recently developed in [12], which can be appropriately modified to manage the step-length rule proposed in [3].…”
Section: Introduction
mentioning confidence: 99%