2018
DOI: 10.48550/arxiv.1803.02919
Preprint
Proximal Activation of Smooth Functions in Splitting Algorithms for Convex Image Recovery

Cited by 1 publication (2 citation statements)
References: 0 publications
“…denotes the proximal operator of $f$. For a nonempty closed and convex set $C \subseteq H$, one has that $P_C = \operatorname{prox}_{\delta_C}$, where $\delta_C$ denotes the indicator function of $C$. The strongly convergent forward-backward algorithm with variable step sizes for solving (12) that we propose in this section will be formulated as a particular case of (10) for $R_n = J_{\gamma_n A}(\operatorname{Id} - \gamma_n B)$, for $n \ge 0$, where $\inf_{n \ge 0} \gamma_n > 0$. To this end we will prove that $(R_n)_{n \ge 0}$ fulfills the asymptotic condition (11) (see [14, Corollary 17]).…”
Section: A Strongly Convergent Forward-Backward Algorithm With Variable Step Sizes (mentioning)
confidence: 99%
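To make the quoted construction concrete, the following is a minimal sketch of a forward-backward iteration with variable step sizes in the special case $A = \partial g$ and $B = \nabla f$, so that the resolvent $J_{\gamma_n A}$ reduces to $\operatorname{prox}_{\gamma_n g}$. The solver name, the toy LASSO-type problem, and the step-size rule below are illustrative assumptions, not the cited paper's exact scheme.

```python
import numpy as np

# Minimal sketch of x_{n+1} = R_n(x_n) with R_n = prox_{gamma_n g}(Id - gamma_n grad_f),
# i.e. the special case A = subdifferential of g and B = grad f of the quoted
# operator R_n = J_{gamma_n A}(Id - gamma_n B).  The step sizes vary with n but
# stay bounded away from zero, matching the requirement inf_n gamma_n > 0.

def forward_backward(grad_f, prox_g, x0, step_size, n_iter=200):
    """Run the forward-backward iteration with variable step sizes gamma_n."""
    x = np.asarray(x0, dtype=float)
    for n in range(n_iter):
        gamma = step_size(n)                      # variable step size gamma_n
        x = prox_g(x - gamma * grad_f(x), gamma)  # backward step after the forward step
    return x

# Toy usage: minimize 0.5*||x - b||^2 + lam*||x||_1.
b, lam = np.array([3.0, -0.5, 0.2]), 1.0
grad_f = lambda x: x - b                                                  # gradient of the smooth part
prox_g = lambda v, g: np.sign(v) * np.maximum(np.abs(v) - g * lam, 0.0)   # soft-thresholding
step_size = lambda n: 0.5 + 0.4 / (n + 1)                                 # inf_n gamma_n = 0.5 > 0
x_star = forward_backward(grad_f, prox_g, np.zeros(3), step_size)
```

The step-size rule here is only meant to show $\gamma_n$ varying while staying bounded below; the strong-convergence argument in the quoted section relies on that lower bound.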
“…$f$ is identically zero and $g$ is the function in the objective); (2) to divide the objective into two parts and evaluate one of the two smooth functions via its proximal operator, hence ending up with a proximal-gradient scheme. We pursued both approaches, taking into account also [12], which suggests that evaluating a smooth objective via its proximal operator may be advantageous in terms of computational performance compared to evaluating the whole objective through its gradient.…”
Section: A Variational Minimization Problem (mentioning)
confidence: 99%
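As a hedged illustration of the "proximal activation" idea in this statement: for a smooth quadratic $f(x) = \tfrac{1}{2}x^\top Q x - b^\top x$ the proximal operator has the closed form $\operatorname{prox}_{\gamma f}(v) = (I + \gamma Q)^{-1}(v + \gamma b)$, so one can take an implicit (backward) step instead of the explicit gradient step $v - \gamma(Qv - b)$. The matrix, vector, and step size below are made-up toy values.

```python
import numpy as np

# Hedged sketch: evaluating a smooth function via its proximal operator
# (implicit/backward step) versus via its gradient (explicit/forward step).
# For f(x) = 0.5*x'Qx - b'x the prox is prox_{gamma f}(v) = (I + gamma*Q)^{-1}(v + gamma*b).

Q = np.array([[2.0, 0.5],
              [0.5, 1.0]])
b = np.array([1.0, -1.0])
gamma, v = 0.8, np.zeros(2)

grad_step = v - gamma * (Q @ v - b)                                 # explicit gradient step
prox_step = np.linalg.solve(np.eye(2) + gamma * Q, v + gamma * b)   # implicit proximal step

print(grad_step, prox_step)  # both move from v toward the minimizer Q^{-1} b
```

The proximal step amounts to solving a small linear system rather than a single matrix-vector product, which is the kind of cost/performance trade-off the quoted passage (and [12]) weighs against a plain gradient evaluation.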