2008
DOI: 10.1007/978-1-60327-101-1_3

Bayesian Regularization of Neural Networks

Abstract: Bayesian regularized artificial neural networks (BRANNs) are more robust than standard back-propagation nets and can reduce or eliminate the need for lengthy cross-validation. Bayesian regularization is a mathematical process that converts a nonlinear regression into a "well-posed" statistical problem in the manner of a ridge regression. The advantage of BRANNs is that the models are robust and the validation process, which scales as O(N²) in normal regression methods, such as back-propagation, is unnecessary.…
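To make the ridge-regression analogy in the abstract concrete, here is a minimal sketch (plain NumPy; all names are illustrative and not from the chapter) of the kind of penalized least-squares objective Bayesian regularization optimizes: a data-misfit term plus a weighted sum-of-squares penalty on the weights. Note that α and β are fixed here, whereas in true Bayesian regularization they are re-estimated from the data during training, which this sketch does not implement.

```python
# Illustrative sketch only: a ridge-style penalized objective of the kind
# Bayesian regularization optimizes. alpha/beta are fixed here; in actual
# Bayesian regularization they are re-estimated from the data (e.g. via
# the evidence procedure).
import numpy as np

def penalized_loss(w, X, t, alpha=0.01, beta=1.0):
    """F(w) = beta * E_D + alpha * E_W for a linear model y = X @ w."""
    residuals = t - X @ w
    E_D = 0.5 * np.sum(residuals ** 2)  # data misfit (sum of squared errors)
    E_W = 0.5 * np.sum(w ** 2)          # weight-decay penalty (ridge term)
    return beta * E_D + alpha * E_W

# Tiny usage example with synthetic data
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
t = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=20)
print(penalized_loss(np.zeros(3), X, t))
```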

Cited by 443 publications (340 citation statements) · References 19 publications
“…In many studies [8,31-33], the BR training algorithm has given either moderate or the best performance compared with other training algorithms. BRANNs have some important advantages, such as choice and robustness of the model, choice of the validation set, size of the validation effort, and optimization of the network architecture [13]. Bayesian methods can solve the overfitting problem effectively, and complex models are penalized in the Bayesian approach.…”
Section: Discussion
confidence: 99%
“…LM was developed specifically for faster convergence in backpropagation training. Essentially, BR has an objective function that includes the residual sum of squares and the sum of squared weights, to minimize estimation errors and achieve a well-generalized model [3,12-15].…”
Section: Introduction
confidence: 99%
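Written out, the objective described above takes a standard form in the Bayesian-regularization literature (this equation follows the usual formulation, e.g. MacKay's evidence framework, and is not quoted from the citing paper), with β weighting the data misfit and α weighting the penalty:

```latex
F(\mathbf{w}) = \beta E_D + \alpha E_W
              = \beta \sum_{i=1}^{N} \bigl( t_i - y_i(\mathbf{w}) \bigr)^2
              + \alpha \sum_{j=1}^{M} w_j^2
```

In Bayesian regularization, α and β are not tuned by cross-validation but re-estimated from the data during training, which is why the lengthy validation step can be dropped.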
“…The mathematics of Bayesian regularization is challenging and is not repeated here, as it is described in numerous publications [7-9,11-15,44-47].…”
Section: Sparse Learning Methods
confidence: 99%
“…The process of building the model includes training, validation, and testing. In this study, the Levenberg-Marquardt algorithm is employed for training, and the Bayesian regularization method is used to avoid overtraining [15-17]. During the training process, weights (w_ji, w_kj) and biases (b_j, b_k) are updated at each iteration to minimize the error between the target (observation) and the output.…”
Section: Artificial Neural Network
confidence: 99%
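As a concrete illustration of this update loop, the sketch below (plain NumPy; all names are illustrative) trains a one-hidden-layer network on synthetic data. Plain gradient descent stands in for the Levenberg-Marquardt optimizer, and fixed α and β stand in for the Bayesian re-estimation of the hyperparameters, so this is a sketch of the objective being minimized, not of the cited study's exact method.

```python
# Illustrative sketch: one-hidden-layer network trained to minimize
# F = beta * E_D + alpha * E_W. Plain gradient descent replaces the
# Levenberg-Marquardt optimizer, and alpha/beta are fixed rather than
# re-estimated as in true Bayesian regularization.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))                            # inputs
t = np.sin(X[:, :1]) + 0.1 * rng.normal(size=(50, 1))   # targets

H = 5                                     # hidden units
W1 = rng.normal(scale=0.5, size=(2, H))   # w_ji: input -> hidden
b1 = np.zeros(H)                          # b_j
W2 = rng.normal(scale=0.5, size=(H, 1))   # w_kj: hidden -> output
b2 = np.zeros(1)                          # b_k

alpha, beta, lr = 0.01, 1.0, 0.001
for step in range(5000):
    # forward pass
    h = np.tanh(X @ W1 + b1)              # hidden activations
    y = h @ W2 + b2                       # linear output
    err = y - t
    # backward pass: gradients of F = beta*E_D + alpha*E_W
    g_y = beta * err                      # dF/dy
    g_W2 = h.T @ g_y + alpha * W2
    g_b2 = g_y.sum(axis=0)
    g_h = g_y @ W2.T * (1 - h ** 2)       # tanh derivative
    g_W1 = X.T @ g_h + alpha * W1
    g_b1 = g_h.sum(axis=0)
    # weight and bias updates at each iteration
    W1 -= lr * g_W1; b1 -= lr * g_b1
    W2 -= lr * g_W2; b2 -= lr * g_b2

print(f"final data error: {0.5 * np.sum(err ** 2):.4f}")
```

The `alpha * W` terms in the gradients are what distinguish the regularized update from plain error minimization; the biases are left unpenalized, as is conventional.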