2018
DOI: 10.1007/s10589-018-0011-5
A block coordinate variable metric linesearch based proximal gradient method

Cited by 25 publications (13 citation statements)
References 48 publications
“…, I_k. Nevertheless, the approximation function (20) may be easier to optimize than (21), as the component functions are separable and each component function is a scalar function. This reflects the universal tradeoff between the number of iterations and the complexity per iteration.…”
Section: Algorithm
confidence: 99%
“…On the convergence speed. The mild assumptions on the approximation functions allow us to design an approximation function that exploits the original problem's structure (such as the partial convexity in (20)-(21)), and this leads to faster convergence. The use of line search also contributes to faster convergence than the decreasing stepsizes used in the literature, for example [15,18].…”
Section: Algorithm
confidence: 99%
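The contrast drawn in the excerpt above, between a backtracking line search and a prescribed decreasing stepsize, can be illustrated with a minimal sketch of one proximal gradient step with an Armijo backtracking search along the proximal direction. This is not the paper's variable metric method; the l1 regularizer and all function names here are illustrative assumptions:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad_step_ls(x, f, grad_f, lam, alpha=1.0, sigma=1e-4, beta=0.5, max_bt=50):
    """One step on F(x) = f(x) + lam*||x||_1: compute the proximal point,
    then backtrack along the proximal direction with an Armijo condition
    instead of accepting a prescribed decreasing stepsize."""
    g = grad_f(x)
    y = soft_threshold(x - alpha * g, alpha * lam)   # proximal (forward-backward) point
    d = y - x                                        # search direction
    # Predicted decrease: <grad f, d> + lam*(||y||_1 - ||x||_1), negative unless x is stationary
    h = g @ d + lam * (np.abs(y).sum() - np.abs(x).sum())
    Fx = f(x) + lam * np.abs(x).sum()
    step = 1.0
    for _ in range(max_bt):
        xn = x + step * d
        if f(xn) + lam * np.abs(xn).sum() <= Fx + sigma * step * h:
            break                                    # sufficient decrease achieved
        step *= beta                                 # backtrack
    return xn
```

On a least-squares term f(x) = 0.5*||Ax - b||^2, repeated application of this step monotonically decreases the composite objective without tuning a stepsize schedule.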
“…This algorithm uses updates along the gradient direction in place of the computation of the orthogonal projection, which reduces the computational complexity of greedy pursuit algorithms. Its successors include the Newton pursuit (NP) [19] algorithm, the conjugate gradient pursuit (CGP) [20] algorithm, the approximate conjugate gradient pursuit (ACGP) [21] algorithm and the variable metric method-based gradient pursuit (VMMGP) [22] algorithm. These methods reduce the computational complexity and storage requirements of traditional greedy algorithms on large-scale recovery problems, but the reconstruction performance still requires improvement.…”
Section: Introduction
confidence: 99%
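The gradient-pursuit idea mentioned in this excerpt, replacing the orthogonal (least-squares) projection of classical greedy pursuit with a cheap gradient update on the active support, can be sketched as follows. This is an illustrative toy, not the cited NP/CGP/ACGP/VMMGP algorithms:

```python
import numpy as np

def gradient_pursuit(A, b, k):
    """Greedy sparse recovery sketch: at each step, add the column of A
    most correlated with the residual, then update the active coefficients
    with one exact line-search gradient step on 0.5*||b - A_S x_S||^2
    instead of solving the full least-squares projection."""
    m, n = A.shape
    x = np.zeros(n)
    support = []
    r = b.copy()
    for _ in range(k):
        c = A.T @ r
        j = int(np.argmax(np.abs(c)))        # most correlated atom
        if j not in support:
            support.append(j)
        S = np.array(support)
        g = A[:, S].T @ r                    # steepest-descent direction on the support
        Ag = A[:, S] @ g
        denom = Ag @ Ag
        if denom == 0:
            break                            # residual orthogonal to the support
        step = (r @ Ag) / denom              # exact line search
        x[S] += step * g
        r = b - A @ x                        # refresh the residual
    return x
```

Each iteration costs only matrix-vector products with the active columns, which is the complexity saving the excerpt refers to; the price is a less accurate coefficient update than a full orthogonal projection.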
“…In [1], using the theory of functions satisfying the Kurdyka-Łojasiewicz assumption, a block coordinate approach is proposed for minimizing the sum of a differentiable, not necessarily convex, function and non-smooth block-separable terms. This problem arises, for example, in fluorescence microscopy, emission tomography and optical astronomy, where one has to perform blind deconvolution of images corrupted by Poisson noise and both the point spread function and the image must be recovered.…”
confidence: 99%
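The block coordinate scheme this excerpt describes, a differentiable term plus a non-smooth block-separable term minimized block by block with a proximal gradient update, can be sketched as follows. This is a minimal cyclic variant with an l1 term and fixed per-block stepsizes, not the paper's variable metric linesearch method; all names are illustrative:

```python
import numpy as np

def bcpg(x0, blocks, grad_f_block, lam, steps, n_iter=100):
    """Cyclic block coordinate proximal gradient for
    F(x) = f(x) + lam * ||x||_1, whose l1 term separates across blocks.
    `blocks` is a list of index arrays partitioning the variables,
    `grad_f_block(x, idx)` returns the partial gradient of f on block idx,
    and `steps[i]` is a stepsize for block i (e.g. the inverse of the
    block Lipschitz constant)."""
    x = x0.copy()
    for _ in range(n_iter):
        for i, idx in enumerate(blocks):
            a = steps[i]
            v = x[idx] - a * grad_f_block(x, idx)                # forward (gradient) step
            x[idx] = np.sign(v) * np.maximum(np.abs(v) - a * lam, 0.0)  # block prox (soft threshold)
    return x
```

Because the non-smooth term is block-separable, each inner update needs only the proximal operator of one block, which is what makes the block coordinate decomposition well defined.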