1992
DOI: 10.1109/72.165600
Improving generalization performance using double backpropagation

Abstract: In order to generalize from a training set to a test set, it is desirable that small changes in the input space of a pattern do not change the output components. This can be done by forcing this behavior as part of the training algorithm. This is done in double backpropagation by forming an energy function that is the sum of the normal energy term found in backpropagation and an additional term that is a function of the Jacobian. Significant improvement is shown with different architectures and different test …
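The combined energy described in the abstract can be sketched for a single sigmoid unit. This is a toy stand-in, not the paper's method: the network, the values, the penalty weight `lam`, and the finite-difference update are all illustrative assumptions (the paper derives an analytic double-backpropagation update for full networks):

```python
import numpy as np

def forward(w, b, x):
    # one sigmoid unit; the paper's architectures are full networks
    return 1.0 / (1.0 + np.exp(-(np.dot(w, x) + b)))

def energy(w, b, x, t, lam):
    y = forward(w, b, x)
    err = (y - t) ** 2                       # normal backpropagation energy term
    dy_dx = y * (1.0 - y) * w                # Jacobian of the unit w.r.t. the input
    de_dx = 2.0 * (y - t) * dy_dx            # gradient of the error w.r.t. the input
    return err + lam * np.sum(de_dx ** 2)    # combined double-backprop energy

def train_step(w, b, x, t, lam, lr=0.1, eps=1e-5):
    # central finite differences stand in for the analytic update here
    gw = np.zeros_like(w)
    for i in range(len(w)):
        wp, wm = w.copy(), w.copy()
        wp[i] += eps
        wm[i] -= eps
        gw[i] = (energy(wp, b, x, t, lam) - energy(wm, b, x, t, lam)) / (2 * eps)
    gb = (energy(w, b + eps, x, t, lam) - energy(w, b - eps, x, t, lam)) / (2 * eps)
    return w - lr * gw, b - lr * gb
```

Minimizing this energy penalizes input sensitivity alongside output error, which is the mechanism the abstract credits for the generalization gain.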

Cited by 202 publications (134 citation statements) · References 4 publications
“…This may be partially attributed to the objective function that induces overemphasized high-flow events, higher nonlinearity inherent in extreme flow events, and stringent requirements, such as normality and independence about the errors. To improve training and/or generalization performance, a number of alternative accuracy measures have been investigated including the mean squared error (MSE), the mean absolute percentage error (MAPE), the median absolute percentage error (MdAPE), the geometric mean relative absolute error (GMRAE), and the AIC and BIC (Drucker & Cun, 1992;Liano, 1996;Ooyen & Nienhuis, 1992). In the context of RR modelling, Dawson & Wilby (1998) used the mean squared relative error (MSRE) and the root mean squared error (RMSE) as performance criteria in the application of ANN to flow forecasting in two flood-prone UK catchments.…”
Section: Modification Of The Training Objective Function
confidence: 99%
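The alternative accuracy measures listed in the snippet above can be sketched in a few lines. This is a minimal illustration under the usual definitions (GMRAE is taken relative to a benchmark forecast such as naive persistence, and the observations are assumed strictly nonzero); the helper name and sample values are assumptions, not from the cited works:

```python
import numpy as np

def error_measures(obs, pred, bench_pred):
    obs, pred, bench_pred = map(np.asarray, (obs, pred, bench_pred))
    err = obs - pred
    mse = np.mean(err ** 2)                      # mean squared error
    rmse = np.sqrt(mse)                          # root mean squared error
    pct = np.abs(err / obs) * 100.0              # absolute percentage errors
    mape = np.mean(pct)                          # mean absolute percentage error
    mdape = np.median(pct)                       # median absolute percentage error
    rae = np.abs(err) / np.abs(obs - bench_pred) # errors relative to the benchmark
    gmrae = np.exp(np.mean(np.log(rae)))         # geometric mean relative abs. error
    msre = np.mean((err / obs) ** 2)             # mean squared relative error
    return dict(MSE=mse, RMSE=rmse, MAPE=mape, MdAPE=mdape, GMRAE=gmrae, MSRE=msre)
```

Each measure weights errors differently (relative measures down-weight the high-flow events that MSE overemphasizes), which is why the cited studies compare several of them.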
“…This database was then carved into two subsets of 600 samples each -the first five samples of each numeral from every writer were used for training and the rest for testing. Guyon et al [3], and Drucker and Le Cun [2], both have reported a generalization performance of 97% with a 256:20:10 network trained using conventional BP, and a 256:40:10 network trained using 'double backpropagation', respectively. For the simulations presented here, the 256 element matrices were transformed into 32 element vectors by summing the rows and columns.…”
Section: B Handwritten Numeral Recognition
confidence: 99%
“…For the simulations presented here, the 256 element matrices were transformed into 32 element vectors by summing the rows and columns. Although this 8-fold reduction in input dimension did cause a 4% reduction in generalization performance (compared with [2], [3]), it made the running of many more simulations possible due to the reduced memory and CPU requirements. Each element of these 32-element vectors was then transformed as…”
Section: B Handwritten Numeral Recognition
confidence: 99%
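The dimensionality reduction described in the snippets above (256-element matrices to 32-element vectors by summing rows and columns) can be sketched directly. The function name and the 16×16 grid interpretation of the 256 elements are assumptions consistent with the text:

```python
import numpy as np

def reduce_256_to_32(pattern):
    # interpret the flat 256-element pattern as a 16x16 pixel grid
    m = np.asarray(pattern, dtype=float).reshape(16, 16)
    # concatenate the 16 row sums and the 16 column sums -> 32 features
    return np.concatenate([m.sum(axis=1), m.sum(axis=0)])
```

The 8-fold reduction trades some class information (hence the reported 4% drop in generalization) for much smaller memory and CPU requirements per simulation.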