2011
DOI: 10.1109/tip.2010.2103083

A Majorize–Minimize Strategy for Subspace Optimization Applied to Image Restoration

Abstract: This paper proposes accelerated subspace optimization methods in the context of image restoration. Subspace optimization methods belong to the class of iterative descent algorithms for unconstrained optimization. At each iteration of such methods, a stepsize vector allowing the best combination of several search directions is computed through a multidimensional search. It is usually obtained by an inner iterative second-order method ruled by a stopping criterion that guarantees the convergence of the outer algorithm…
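The abstract describes replacing the inner second-order stepsize search with a closed-form majorize-minimize (MM) update of the multidimensional stepsize. Below is a minimal, hypothetical Python sketch of that idea on a toy penalized least-squares problem: a memory-gradient subspace (negative gradient plus previous direction) combined with a single MM stepsize update built from the classical Geman & Reynolds quadratic majorant. The penalty, operators, and function names are assumptions for illustration, not the paper's exact algorithm.

```python
import numpy as np

def mm_memory_gradient_step(A, b, V, x, d_prev, lam=0.1, delta=1e-2):
    """One memory-gradient subspace step with an MM stepsize (illustrative sketch).

    Toy objective (an assumption, not the paper's exact criterion):
        f(x) = 0.5 * ||A x - b||^2 + lam * sum_i ( sqrt([V x]_i^2 + delta^2) - delta )
    The step is taken in the 2D subspace spanned by the negative gradient and the
    previous direction, with the stepsize given by one MM update based on the
    Geman & Reynolds quadratic tangent majorant of the penalty.
    """
    r = A @ x - b
    u = V @ x
    grad = A.T @ r + lam * (V.T @ (u / np.sqrt(u**2 + delta**2)))

    # Memory-gradient subspace: current steepest-descent direction + previous step.
    D = np.column_stack([-grad, d_prev])

    # Geman & Reynolds weights: curvature of a quadratic majorant of the penalty.
    w = 1.0 / np.sqrt(u**2 + delta**2)

    # Majorant curvature restricted to the subspace: D^T (A^T A + lam V^T W V) D.
    AD, VD = A @ D, V @ D
    B = AD.T @ AD + lam * (VD.T @ (w[:, None] * VD))

    # Closed-form multidimensional stepsize (one MM update of the stepsize vector).
    s = np.linalg.solve(B + 1e-12 * np.eye(2), -D.T @ grad)

    step = D @ s
    return x + step, step


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 64
    A = rng.standard_normal((n, n)) / np.sqrt(n)   # toy measurement/blur operator
    x_true = np.zeros(n); x_true[::8] = 1.0
    b = A @ x_true + 0.01 * rng.standard_normal(n)
    V = np.eye(n, k=1) - np.eye(n)                 # first-order differences
    x, d = np.zeros(n), np.zeros(n)
    for _ in range(100):
        x, d = mm_memory_gradient_step(A, b, V, x, d, lam=0.05)
    print("residual:", np.linalg.norm(A @ x - b))
```

Because the quadratic majorant is tangent to the objective at the current iterate, each such step cannot increase the objective, which is why no inner stopping criterion is needed.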

Cited by 53 publications (125 citation statements)
References 42 publications
“…where x_k and x_{k+1} respectively stand for the estimates at the kth and (k+1)th iterations. An overview of existing subspace optimization methods [17] shows that D_k usually includes a descent direction (e.g. gradient, Newton, truncated Newton direction) and a short history of previous directions.…”
Section: Subspace Optimization Methods in Hilbert Spaces (mentioning)
confidence: 99%
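For context, the generic subspace iteration the quote refers to can be written (notation follows the quote; the precise formulation in [17] may differ) as
\[
x_{k+1} = x_k + D_k s_k, \qquad D_k = [d_k^1, \dots, d_k^I], \qquad s_k \approx \operatorname*{arg\,min}_{s \in \mathbb{R}^I} f(x_k + D_k s),
\]
where the columns of D_k are the search directions and s_k is the multidimensional stepsize computed at iteration k.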
“…Chouzenoux et al. [17] have discussed the choice of the subspace dimension through simulation results on several image restoration problems. It is shown that in a Hilbert space, for a super memory gradient subspace (9), taking I = 2, i.e.…”
Section: Subspace Optimization Methods in Hilbert Spaces (mentioning)
confidence: 99%
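As a reference point, the standard form of the super memory gradient (SMG) subspace stacks the current negative gradient with the most recent update directions (equation (9) of the citing paper may differ in its exact details):
\[
D_k = \bigl[\, -\nabla f(x_k),\; d_{k-1},\; \dots,\; d_{k-I+1} \,\bigr],
\]
so that I = 2 corresponds to the classical memory gradient subspace spanned by the negative gradient and the previous direction.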
“…This strategy can be generalized with sequential subspace optimization (SESOP) [32], [33] for further acceleration, e.g. PCD-SESOP [11], [33] and PCD-SESOP-MM [34]. The variable splitting technique [35], [36] (also known as separable surrogate functionals (SSF) [33]) provides yet another powerful tool for minimizing functions that consist of the sum of two terms of a different nature, e.g.…”
Section: B. Approaches to Solve the Problem (mentioning)
confidence: 99%
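To make the last point concrete, a common instance of the separable surrogate / variable splitting idea for F(x) = f(x) + g(x), with f smooth (L-Lipschitz gradient) and g possibly nonsmooth, majorizes f by a separable quadratic at x_k, which gives the proximal gradient update
\[
x_{k+1} = \operatorname{prox}_{g/L}\!\bigl( x_k - \tfrac{1}{L} \nabla f(x_k) \bigr),
\]
i.e. a gradient step on the smooth term followed by a proximal (shrinkage-type) step on the other term; the exact SSF construction in [33] differs in its details.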