2002
DOI: 10.1088/0266-5611/18/4/314
Efficient determination of multiple regularization parameters in a generalized L-curve framework

Abstract: The selection of multiple regularization parameters is considered in a generalized L-curve framework. Multidimensional extensions of the L-curve for selecting multiple regularization parameters are introduced, and a minimum distance function (MDF) is developed for approximating the regularization parameters corresponding to the generalized corner of the L-hypersurface. For the single-parameter (i.e. L-curve) case, it is shown through a model that the regularization parameters minimizing the MDF essentially…
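For the single-parameter case, one common reading of the minimum-distance idea is: trace the L-curve (log residual norm versus log solution norm) over a grid of regularization parameters, then pick the parameter whose point lies closest to a reference "origin". The sketch below uses the componentwise minima of the curve as that reference point; the test problem and that choice of origin are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

# Build a mildly ill-conditioned test problem (assumption: synthetic data,
# chosen only to make the L-curve exhibit a visible corner).
rng = np.random.default_rng(0)
n = 50
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = 10.0 ** np.linspace(0, -6, n)            # decaying singular values
A = U @ np.diag(s) @ V.T
x_true = V[:, 0] + 0.5 * V[:, 1]
b = A @ x_true + 1e-4 * rng.standard_normal(n)

# Sample the L-curve over a logarithmic grid of regularization parameters.
lambdas = 10.0 ** np.linspace(-8, 1, 200)
pts = []
for lam in lambdas:
    # Tikhonov solution: argmin ||A x - b||^2 + lam^2 ||x||^2
    x = np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ b)
    pts.append((np.log(np.linalg.norm(A @ x - b)),
                np.log(np.linalg.norm(x))))
pts = np.array(pts)

# Minimum-distance criterion: distance from each L-curve point to a
# reference origin (here, the componentwise minima -- an assumption).
origin = pts.min(axis=0)
dist = np.linalg.norm(pts - origin, axis=1)
lam_star = lambdas[np.argmin(dist)]
```

The selected `lam_star` sits near the corner where the residual norm and the solution norm are jointly small; minimizing a scalar distance function avoids curvature computations along the curve.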

Cited by 208 publications (162 citation statements). References 21 publications.
“…A linear combination of operators (each with its own regularization parameter λ_i) can be used as a side constraint (e.g., the Sobolev norm (14): Σ_i λ_i² ‖L_i r‖²). In such cases, a multidimensional extension to the L-curve method is required to optimize all the parameters simultaneously (the L-hypersurface was recently introduced to such effect (22)). …”
Section: Discussion
confidence: 99%
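A side constraint of that form leads to the regularized normal equations (AᵀA + Σ_i λ_i² L_iᵀL_i) r = Aᵀb. A minimal sketch, assuming illustrative operators (identity and a first-difference matrix) rather than the ones used in the cited work:

```python
import numpy as np

def multi_tikhonov(A, b, Ls, lams):
    """Solve min ||A r - b||^2 + sum_i lams[i]^2 ||Ls[i] r||^2
    via the regularized normal equations (dense, for illustration)."""
    M = A.T @ A
    for L, lam in zip(Ls, lams):
        M = M + lam**2 * (L.T @ L)
    return np.linalg.solve(M, A.T @ b)

# Illustrative problem and operators (assumptions, not from the source).
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 20))
b = rng.standard_normal(30)
L0 = np.eye(20)                              # penalize the solution norm
L1 = np.diff(np.eye(20), axis=0)             # penalize first differences
r = multi_tikhonov(A, b, [L0, L1], [1e-2, 1e-1])
```

With two parameters (λ_0, λ_1), the L-curve generalizes to an L-hypersurface: a 2-D surface of (log residual, log ‖L_0 r‖, log ‖L_1 r‖) points whose generalized corner must be located, which is exactly the setting the cited paper's MDF addresses.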
“…The truncation is expected to keep important components, since the directions removed from X_{k+ℓ+1} are perpendicular to the current best approximation x_{k+1}, and also to the previous best approximations x_k, x_{k−1}, …, x_1. If the rotation and truncation are combined in one step, then the computational cost of the method is O((ℓ + 1)(n + m + p_1 + ⋯ + p_ℓ)), which quickly becomes smaller than the (re)orthogonalization cost as k grows.…”
Section: Subspace Expansion for Multiparameter Tikhonov
confidence: 99%
“…See for example [1,2,6,14,16,20]. In particular, there is no obvious multiparameter extension of the discrepancy principle.…”
Section: A Multiparameter Selection Strategy
confidence: 99%
“…Methods for determining suitable regularization parameters for this minimization problem are discussed in [2,3,9,18].…”
Section: Multi-parameter Tikhonov Regularization
confidence: 99%