1993
DOI: 10.1007/bf01385687
Asymptotic optimality of generalized cross-validation for choosing the regularization parameter

Abstract: Summary. Let f_λ be the regularized solution of a general linear operator equation, Kf_0 = g, from discrete, noisy data y_i = g(x_i) + e_i, i = 1, …, n, where the e_i are uncorrelated random errors. We consider the prominent method of generalized cross-validation (GCV) for choosing the crucial regularization parameter λ. The practical GCV estimate λ̂_V and its "expected" counterpart λ_V are defined as the minimizers of the GCV function V(λ) and EV(λ), respectively, where E denotes expectation. We investigate the asymp…
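The GCV estimate described in the abstract, the minimizer of V(λ), can be sketched for the special case of Tikhonov (ridge) regularization, where V(λ) = n‖(I − A(λ))y‖² / tr(I − A(λ))², with A(λ) = X(XᵀX + λI)⁻¹Xᵀ the influence matrix. A minimal sketch via the SVD, assuming a finite-dimensional ridge setup with a grid search (the function name, grid, and simulated data are illustrative, not from the paper):

```python
import numpy as np

def gcv_select(X, y, lams):
    """Choose a ridge parameter by generalized cross-validation.

    Minimizes V(lam) = n * ||(I - A(lam)) y||^2 / tr(I - A(lam))^2 over a
    grid, where A(lam) = X (X'X + lam I)^{-1} X' is the influence matrix,
    computed via the thin SVD of X for efficiency.
    """
    n = len(y)
    U, s, _ = np.linalg.svd(X, full_matrices=False)
    uty = U.T @ y
    # component of y orthogonal to the column space of X (part of the residual)
    ortho = y @ y - uty @ uty
    scores = []
    for lam in lams:
        shrink = lam / (s**2 + lam)        # residual factor on each singular direction
        rss = ortho + np.sum((shrink * uty) ** 2)
        trace = n - np.sum(s**2 / (s**2 + lam))
        scores.append(n * rss / trace**2)
    scores = np.asarray(scores)
    return lams[int(np.argmin(scores))], scores

# usage on a simulated ill-conditioned design (values are illustrative)
rng = np.random.default_rng(0)
X = rng.standard_normal((80, 10)) @ np.diag(1.0 / np.arange(1, 11))
y = X @ np.ones(10) + 0.5 * rng.standard_normal(80)
lam_hat, scores = gcv_select(X, y, np.logspace(-6, 2, 50))
```

The SVD form avoids refactorizing XᵀX + λI for each grid point, since both the residual sum of squares and the trace depend on λ only through the singular values.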

Cited by 43 publications (109 citation statements)
References 26 publications
“…The regularization parameter, λ, is a scalar for properly weighting the first term (data fidelity cost) against the second term (regularization cost). Generally speaking, choosing λ could be either done manually, using visual inspection, or automatically using methods like Generalized Cross-Validation [28], [29], L-curve [30], or other techniques. How to choose such regularization parameters is in itself a vast topic, which we will not treat in the present paper.…”
Section: B. MAP Approach to Multi-frame Image Reconstruction
confidence: 99%
“…For Tikhonov regularization, it is known [57,95,98] that, with uncorrelated errors, GCV is asymptotically optimal with respect to the prediction risk as the number of data points m → ∞, i.e. the inefficiency goes to 1.…”
Section: Generalized Cross-Validation
confidence: 99%
“…the inefficiency goes to 1. In addition, if the unknown solution x is not too smooth relative to the operator, then GCV is order optimal for the X-norm risk E‖x_n^δ − x‖² [98, 139, 143]. The condition required for this here is ν ≤ µ + 1/2, and otherwise GCV is order sub-optimal.…”
Section: Generalized Cross-Validation
confidence: 99%
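The asymptotic-optimality claim quoted above (the inefficiency of GCV tends to 1 with respect to the prediction risk) can be probed empirically. A minimal Monte Carlo sketch, assuming a ridge instance with a diagonally ill-conditioned design; the setup, grid, and noise level are all illustrative choices, not the paper's:

```python
import numpy as np

# Simulated ridge problem: the simulator knows the true mean mu, so the
# oracle prediction risk ||A(lam) y - mu||^2 / n can be evaluated exactly.
rng = np.random.default_rng(1)
n, p = 200, 20
X = rng.standard_normal((n, p)) @ np.diag(1.0 / np.arange(1, p + 1))
mu = X @ np.ones(p)                      # true mean, known only to the simulator
y = mu + 0.5 * rng.standard_normal(n)

U, s, _ = np.linalg.svd(X, full_matrices=False)
uty, utmu = U.T @ y, U.T @ mu
lams = np.logspace(-8, 2, 200)

risks, gcvs = [], []
for lam in lams:
    w = s**2 / (s**2 + lam)              # shrinkage on each singular direction
    # oracle prediction risk (mu lies in the column space of X here)
    risks.append(np.sum((w * uty - utmu) ** 2) / n)
    # GCV score: n * RSS / tr(I - A(lam))^2
    rss = y @ y - uty @ uty + np.sum(((1 - w) * uty) ** 2)
    gcvs.append(n * rss / (n - np.sum(w)) ** 2)

# inefficiency = risk at the GCV choice / minimal risk on the grid (>= 1)
inefficiency = risks[int(np.argmin(gcvs))] / min(risks)
```

For moderate n the ratio is typically close to 1, consistent with the cited asymptotic result, though any single draw only illustrates the tendency.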
“…Two important characteristics of (1.1)-(1.2) are that (1) the operator A is unknown and must be estimated from the data and (2) the distribution … A variety of ways to choose regularization parameters are known in mathematics and numerical analysis. Engl, Hanke, and Neubauer (1996), Mathé and Pereverzev (2003), Bauer and Hohage (2005), Wahba (1977), and Lukas (1993, 1998) describe many. Most of these methods assume that A is known and that the "data" are deterministic, or that Var(Y | X = x) is known and independent of x.…”
Section: Review of Related Mathematics and Statistics Literature
confidence: 99%