1996
DOI: 10.1016/0098-1354(95)00013-r
Cross-validated structure selection for neural networks

Cited by 56 publications (30 citation statements). References 7 publications.
“…With weight decay, an additional term is added to the error function that is proportional to the sizes of the weights associated with each factor entering the models. Early stopping is a popular alternative to weight decay that is often employed when the number of parameters/number of samples ratio is significantly greater than unity (3,20,47,54,60). Early stopping is a nonconvergent technique that terminates training before the ANN is finished fitting the training data.…”
Section: Discussion
confidence: 99%
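
As a hedged illustration of the two regularisation techniques named in the statement above, the following minimal Python sketch uses scikit-learn's MLPRegressor (our choice of tooling, not something taken from the cited work): the alpha parameter adds an L2 weight-decay penalty to the training loss, and early_stopping=True holds out part of the training data and stops training once the validation score stops improving.

# Minimal sketch (assumes scikit-learn and NumPy are available).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                               # toy inputs
y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=200)     # toy targets

# Weight decay: 'alpha' scales an L2 penalty on the weights that is added
# to the error function, shrinking large weights during training.
decay_net = MLPRegressor(hidden_layer_sizes=(10,), alpha=1e-2,
                         max_iter=2000, random_state=0).fit(X, y)

# Early stopping: training halts when the score on an internal validation
# split (10% of the data here) stops improving, i.e. before the network
# has finished fitting the training data.
early_net = MLPRegressor(hidden_layer_sizes=(10,), alpha=0.0,
                         early_stopping=True, validation_fraction=0.1,
                         n_iter_no_change=10, max_iter=2000,
                         random_state=0).fit(X, y)

print("weight-decay training R^2  :", decay_net.score(X, y))
print("early-stopping training R^2:", early_net.score(X, y))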
“…For this study, the values of α selected for nirS, nirK, dsrAB1, and dsrAB2 were 1.0, 0.35, 1.0, and 0.001, respectively. K-fold cross-validation is a well-established method of using an entire data set for both training and testing (7,54,58). We performed onefold, also known as leave-one-out, cross-validation in which one sample was withheld from training and used to test the model fitted to the remaining data.…”
Section: Molecular Methods
confidence: 99%
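
A short sketch of the k-fold and leave-one-out cross-validation scheme described above (scikit-learn assumed, with a simple ridge regression standing in for the fitted model): each fold, or each single sample in the leave-one-out case, is withheld in turn, the model is fitted to the remaining data, and the withheld data is used for testing.

# Minimal sketch (assumes scikit-learn and NumPy are available).
import numpy as np
from sklearn.model_selection import KFold, LeaveOneOut
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=30)

def cv_mse(splitter):
    """Mean squared test error over the splits produced by `splitter`."""
    errors = []
    for train_idx, test_idx in splitter.split(X):
        model = Ridge().fit(X[train_idx], y[train_idx])    # fit on the rest
        pred = model.predict(X[test_idx])                  # test on held-out data
        errors.append(np.mean((pred - y[test_idx]) ** 2))
    return float(np.mean(errors))

print("5-fold CV MSE        :", cv_mse(KFold(n_splits=5, shuffle=True, random_state=0)))
print("leave-one-out CV MSE :", cv_mse(LeaveOneOut()))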
“…The leave-one-out cross-validation procedure [36] was adopted to test the performance of the network in a reliable manner, taking into account the limited number of cases available in the classes, and at the same time achieving an acceptable generalization in the classification and avoiding overtraining.…”
confidence: 99%
“…In a comparison of CV with two other MLP architecture selection strategies in a recent paper [20], CV was found to be the best at choosing the optimal network architecture, at least on the data sets tested. However, the comparison was based on only a single type of artificial data and did not look at any real-world problem domains.…”
Section: Cross Validation (CV)
confidence: 99%
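
Since the cited paper concerns selecting a network structure by cross-validation, a final hedged sketch shows that idea in miniature: a handful of hypothetical hidden-layer widths are compared by their cross-validated scores (scikit-learn assumed), and the width with the best score is selected.

# Minimal sketch (assumes scikit-learn and NumPy are available): choosing
# the number of hidden units by cross-validated score.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
X = rng.normal(size=(150, 4))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + 0.1 * rng.normal(size=150)

candidate_sizes = [2, 5, 10, 20]       # hypothetical architectures to compare
scores = {}
for h in candidate_sizes:
    net = MLPRegressor(hidden_layer_sizes=(h,), max_iter=3000, random_state=0)
    # Mean 5-fold cross-validated R^2; higher suggests better generalization.
    scores[h] = cross_val_score(net, X, y, cv=5).mean()

best = max(scores, key=scores.get)
print("CV scores by hidden-layer size:", scores)
print("selected width:", best, "hidden units")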