1986
DOI: 10.1080/00207178608933575
Model-structure selection by cross-validation

Cited by 105 publications (41 citation statements)
References 52 publications
“…Model selection criteria are often established on the basis of estimates of prediction errors, by inspecting how the identified model performs on future (never used) data sets. One general routine for model selection, which tries to avoid or reduce any possible bias introduced by relying on any particular test data set, is cross-validation (Stone 1974, Stoica et al. 1986). Cross-validation has a number of variations; two commonly used variants are the leave-one-out (LOO) method, also called the predicted sum of squares (PRESS) (Allen 1974), and generalised cross-validation (GCV) (Craven and Wahba 1979, Golub et al. 1979).…”
Section: Model Length Determination (mentioning, confidence: 99%)
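The LOO/PRESS idea described in this excerpt can be sketched for linear-in-parameters model structures, where the well-known hat-matrix identity makes leave-one-out scoring cheap. The data, candidate polynomial orders, and function names below are illustrative, not taken from the cited works.

```python
import numpy as np

def press(X, y):
    """Leave-one-out prediction error sum of squares (PRESS) for the
    least-squares fit of y on X, using the hat-matrix identity
    e_i / (1 - h_ii) for the i-th deleted residual."""
    theta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ theta                             # ordinary residuals
    h = np.diag(X @ np.linalg.pinv(X.T @ X) @ X.T)    # leverages h_ii
    return float(np.sum((resid / (1.0 - h)) ** 2))

# synthetic data from a first-order (linear) model
rng = np.random.default_rng(0)
t = np.linspace(-1.0, 1.0, 40)
y = 1.0 + 2.0 * t + 0.1 * rng.standard_normal(t.size)

# score candidate polynomial orders 1..4 and keep the PRESS minimiser
scores = {k: press(np.vander(t, k + 1, increasing=True), y)
          for k in range(1, 5)}
best_order = min(scores, key=scores.get)

# sanity check: the shortcut agrees with explicit refitting (order 2)
X2 = np.vander(t, 3, increasing=True)
loo = 0.0
for i in range(t.size):
    keep = np.arange(t.size) != i
    th, *_ = np.linalg.lstsq(X2[keep], y[keep], rcond=None)
    loo += float((y[i] - X2[i] @ th) ** 2)
```

The hat-matrix shortcut avoids refitting the model n times, which is what makes PRESS practical as a routine structure-selection score.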
“…This, however, would require a priori knowledge of the process. A simpler approach instead relies on the cross-validation principle [65,64] to automate this selection. In relation to PCA, cross-validation has been proposed as a technique for determining the number of retained principal components by Wold [79] and Krzanowski [39].…”
Section: Disjunct Regions (mentioning, confidence: 99%)
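One way to make the PCA use case concrete is an entry-wise cross-validation in the spirit of Wold's approach: hold out a random subset of matrix entries, impute them crudely, reconstruct with a rank-k truncated SVD, and score the held-out entries. This is a rough single-pass sketch under that assumption (Wold's original procedure is iterative); all names and data are illustrative.

```python
import numpy as np

def cv_press_rank(Y, ranks, n_folds=5, seed=0):
    """Entry-wise cross-validation for choosing the number of principal
    components: mask one fold of entries, impute them with column means,
    reconstruct with a rank-k SVD, and accumulate squared errors on the
    masked entries."""
    n, p = Y.shape
    rng = np.random.default_rng(seed)
    folds = rng.integers(0, n_folds, size=(n, p))   # fold label per entry
    press = {k: 0.0 for k in ranks}
    for f in range(n_folds):
        mask = folds == f
        Z = Y.copy()
        col_means = np.nanmean(np.where(mask, np.nan, Y), axis=0)
        Z[mask] = np.take(col_means, np.where(mask)[1])  # crude imputation
        mu = Z.mean(axis=0)
        U, s, Vt = np.linalg.svd(Z - mu, full_matrices=False)
        for k in ranks:
            Yhat = (U[:, :k] * s[:k]) @ Vt[:k] + mu
            press[k] += float(np.sum((Y[mask] - Yhat[mask]) ** 2))
    return press

# synthetic data with two dominant components plus noise
rng = np.random.default_rng(1)
Y = (rng.standard_normal((100, 2)) @ rng.standard_normal((2, 8))
     + 0.2 * rng.standard_normal((100, 8)))

press_scores = cv_press_rank(Y, ranks=range(1, 7))
best_k = min(press_scores, key=press_scores.get)
```

Naive row-holdout does not work here, because projecting a full held-out row onto more components always lowers its reconstruction error; holding out individual entries is what gives the score a minimum at a sensible rank.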
“…To the best of our knowledge, no such criterion has been rigorously developed, and adaptation of existing solutions that were derived for 1-D data models may be misleading. Stoica et al. [22] proposed a cross-validation selection rule and demonstrated its asymptotic equivalence to the generalized Akaike information criterion (GAIC). The suggested criterion is not derived for any specific model.…”
Section: Introduction (mentioning, confidence: 99%)
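For context, the generalized Akaike information criterion referred to in this excerpt has the standard penalized-likelihood form (the notation here is generic, not taken from the cited paper):

GAIC(k) = −2 ln L(θ̂_k) + ν·k,

where θ̂_k is the maximum-likelihood estimate for the candidate structure with k free parameters and ν is a penalty weight: ν = 2 recovers the ordinary AIC, while letting ν grow with the sample size gives consistent BIC-type rules. The asymptotic-equivalence result means that the cross-validation score and GAIC select the same model structure as the number of data points tends to infinity.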