2016
DOI: 10.1017/s096249291600009x

An introduction to continuous optimization for imaging

Abstract: A large number of imaging problems reduce to the optimization of a cost function, with typical structural properties. The aim of this paper is to describe the state of the art in continuous optimization methods for such problems, and present the most successful approaches and their interconnections. We place particular emphasis on optimal first-order schemes that can deal with typical non-smooth and large-scale objective functions used in imaging problems. We illustrate and compare the different algorithms usi…
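The typical structure the abstract refers to is a sum of a smooth data-fidelity term and a non-smooth regularizer. As a point of reference, here is a minimal sketch of forward-backward splitting (proximal gradient descent) on an l1-regularized least-squares toy problem; the problem, data, and step-size choice are illustrative assumptions, not taken from the paper.

```python
# A minimal sketch (illustrative, not from the paper) of forward-backward
# splitting for the composite objective
#     min_x  (1/2) ||A x - b||^2 + lam * ||x||_1,
# the prototypical smooth + non-smooth structure discussed in the survey.
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward(A, b, lam, n_iter=500):
    x = np.zeros(A.shape[1])
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the smooth part's gradient
    tau = 1.0 / L                      # step size tau <= 1/L guarantees convergence
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)       # forward (gradient) step on the smooth term
        x = soft_threshold(x - tau * grad, tau * lam)  # backward (proximal) step
    return x

# Tiny usage example on random data with a sparse ground truth.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
b = A @ (rng.standard_normal(100) * (rng.random(100) < 0.1))
x_hat = forward_backward(A, b, lam=0.1)
```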

Cited by 467 publications (468 citation statements)
References: 318 publications
“…Since alternating minimizations are a variant of Forward-Backward splitting methods, it is clear that one can expect good convergence rates by adapting standard methods [3,19,12]. This is what we establish in Theorem 1, extending a result of [11] in the non-strongly convex case.…”
Section: The Problem (supporting)
confidence: 70%
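To make the relation concrete, the following is a toy sketch of two-block alternating minimization; the objective and parameters are my own illustrative assumptions, not the problem of the cited works. Each step exactly minimizes the joint objective over one block while the other block is held fixed, which is the scheme the statement relates to forward-backward splitting.

```python
# Two-block alternating minimization on a toy coupled quadratic
# (illustrative assumption, not the cited works' problem):
#     F(x, y) = 0.5||x - a||^2 + 0.5||y - b||^2 + (mu/2)||x - y||^2.
import numpy as np

def alternating_minimization(a, b, mu, n_iter=100):
    x, y = np.zeros_like(a), np.zeros_like(b)
    for _ in range(n_iter):
        x = (a + mu * y) / (1.0 + mu)   # exact argmin over x with y fixed (closed form)
        y = (b + mu * x) / (1.0 + mu)   # exact argmin over y with x fixed (closed form)
    return x, y

x, y = alternating_minimization(np.array([1.0, 2.0]), np.array([0.0, 0.0]), mu=1.0)
```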
“…2.2.2] will provide efficient convergence rates. A derivation from an equality similar to (14) is provided in [12, Appendix B]. We provide here an adaptation of that proof to our particular situation (only the parameters differ slightly, so we sketch most of the arguments).…”
Section: Accelerated Alternating Minimization (mentioning)
confidence: 99%
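In forward-backward-type methods, the acceleration in question is usually obtained through an extrapolation (momentum) step. Below is a hedged sketch using the standard FISTA parameter schedule t_{k+1} = (1 + sqrt(1 + 4 t_k^2)) / 2 of Beck and Teboulle; as the statement notes, the parameters in the adapted proof differ slightly, so this is only indicative.

```python
# A sketch of FISTA-style acceleration (standard parameter schedule,
# assumed here; the cited proof's parameters differ slightly).
import numpy as np

def fista(grad, prox, x0, tau, n_iter=500):
    x, z, t = x0.copy(), x0.copy(), 1.0
    for _ in range(n_iter):
        x_new = prox(z - tau * grad(z), tau)            # forward-backward step at the extrapolated point
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        z = x_new + ((t - 1.0) / t_new) * (x_new - x)   # momentum / extrapolation step
        x, t = x_new, t_new
    return x

# Usage on a small quadratic + l1 toy problem (illustrative data).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
grad = lambda v: A.T @ (A @ v - b)                       # gradient of 0.5||Av - b||^2
prox = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - 0.1 * t, 0.0)  # prox of t * 0.1 * ||.||_1
x_star = fista(grad, prox, np.zeros(2), tau=1.0 / np.linalg.norm(A, 2) ** 2)
```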
“…An algorithm that can cope well with the aforementioned challenges is the primal-dual hybrid gradient algorithm (PDHG) [3-6], which can decompose the problem into a sequence of easy operations such as matrix-vector products and proximal operators that have closed-form solutions. However, as the PDHG updates are parallel, every iteration of PDHG is computationally demanding for the large variable sizes (easily exceeding 100 million) encountered with modern PET scanners.…”
Section: Introduction (mentioning)
confidence: 99%
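As an illustration of the decomposition described in this statement, here is a minimal PDHG sketch for one-dimensional total-variation denoising; the problem, finite-difference operator, and step sizes are illustrative assumptions and not the PET setting of the citing paper. Each iteration uses only matrix-vector products with the operator K and two closed-form proximal maps.

```python
# PDHG sketch (illustrative 1-D TV denoising, my own parameter choices)
# for the saddle-point form of  min_x 0.5||x - b||^2 + lam * ||K x||_1,
# with K a forward-difference operator.
import numpy as np

def pdhg_tv_1d(b, lam, n_iter=300):
    n = b.size
    K = np.eye(n - 1, n, k=1) - np.eye(n - 1, n)   # forward differences
    L = np.linalg.norm(K, 2)                       # operator norm of K
    tau = sigma = 0.95 / L                         # step sizes with tau * sigma * L^2 <= 1
    x = np.zeros(n)
    x_bar = np.zeros(n)
    y = np.zeros(n - 1)
    for _ in range(n_iter):
        y = np.clip(y + sigma * (K @ x_bar), -lam, lam)        # prox of the dual of lam * ||.||_1
        x_new = (x - tau * (K.T @ y) + tau * b) / (1.0 + tau)  # prox of 0.5||. - b||^2
        x_bar = 2.0 * x_new - x                                # extrapolation with theta = 1
        x = x_new
    return x

# Usage on a noisy smooth signal (illustrative data).
rng = np.random.default_rng(1)
signal = np.sin(np.linspace(0, 3, 64)) + 0.1 * rng.standard_normal(64)
x_denoised = pdhg_tv_1d(signal, lam=0.2)
```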