2004
DOI: 10.1007/s00521-004-0451-y
Bayesian selective combination of multiple neural networks for improving long-range predictions in nonlinear process modelling

Abstract: A Bayesian selective combination method is proposed for combining multiple neural networks in nonlinear dynamic process modelling. Instead of using fixed combination weights, the probability of a particular network being the true model is used as the combination weight for combining that network. The prior probability is calculated using the sum of squared errors of individual networks on a sliding window covering the most recent sampling times. A nearest neighbour method is used for estimating the network err…
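The abstract describes turning each network's sum of squared errors over a recent sliding window into a probability-style combination weight. A minimal sketch of that idea, assuming Gaussian prediction errors (so the likelihood of network i is proportional to exp(-SSE_i / (2σ²))); the function names and the fixed noise variance are illustrative, not taken from the paper:

```python
import numpy as np

def posterior_weights(sq_errors, sigma2=1.0):
    """Convert per-network sums of squared errors (over a sliding
    window of recent samples) into normalised combination weights.

    Assumes Gaussian errors: likelihood of network i is proportional
    to exp(-SSE_i / (2 * sigma2)); normalising makes the weights
    sum to 1, so they behave like posterior model probabilities.
    """
    sq_errors = np.asarray(sq_errors, dtype=float)
    log_lik = -sq_errors / (2.0 * sigma2)
    log_lik -= log_lik.max()          # subtract max for numerical stability
    w = np.exp(log_lik)
    return w / w.sum()

def combine_predictions(preds, weights):
    """Aggregate the individual network predictions with the weights."""
    return float(np.dot(weights, np.asarray(preds, dtype=float)))

# Example: three networks with window SSEs of 0.5, 2.0 and 4.0.
# The network with the smallest recent error gets the largest weight.
w = posterior_weights([0.5, 2.0, 4.0])
y_hat = combine_predictions([1.0, 1.2, 0.9], w)
```

The weights adapt as the sliding window moves, so a network that tracks the current operating region well dominates the combination, which is the mechanism the paper exploits for long-range predictions.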

Year Published: 2006–2023

Cited by 31 publications (20 citation statements); references 21 publications.
“…Nonlinear combination methods include the Dempster-Shafer belief-based method, rank-based information, majority voting, order statistics, and the method of Tumer and Ghosh. [35] Among these nonlinear methods, Dempster-Shafer seems to be the most renowned. The Dempster-Shafer belief-based method is somewhat complex, and it has to deal with the uncertainty and ignorance of the classifiers.…”
Section: Linear and Nonlinear Combination of Multiple Neural Networks
confidence: 99%
“…Unlike PCR or MLR, the aggregating weight for a particular network is selected as the posterior probability that the network is the true model, with the posterior probabilities of all the networks summing to 1. The individual network models and the aggregated neural network model are represented respectively as follows [35]:…”
Section: Linear and Nonlinear Combination of Multiple Neural Networks
confidence: 99%
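The statement above describes posterior-probability weights that sum to 1; a standard form of the individual and aggregated models it refers to (illustrative notation only — the exact equations of [35] are not reproduced in the excerpt) is:

```latex
\hat{y}_i = f_i(\mathbf{x}), \qquad
\hat{y} = \sum_{i=1}^{n} P(M_i \mid D)\, f_i(\mathbf{x}), \qquad
\sum_{i=1}^{n} P(M_i \mid D) = 1,
```

where $f_i$ is the $i$-th network, $P(M_i \mid D)$ is the posterior probability that network $i$ is the true model given the data $D$, and $\hat{y}$ is the aggregated prediction.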
“…The hierarchical RBF network has proven effective in the reconstruction of smooth surfaces from sparse noisy data points [5]. In order to improve model generalization performance, a selective combination of multiple neural networks using a Bayesian method was proposed in [6].…”
Section: Introduction
confidence: 99%
“…HiRBF has proven effective in the reconstruction of smooth surfaces from sparse noisy data points [18]. In order to improve model generalization performance, a selective combination of multiple neural networks using a Bayesian method was proposed in [19].…”
Section: Introduction
confidence: 99%