2016 · DOI: 10.12928/telkomnika.v14i3a.4395

Improvement of RBF Neural Network by AdaBoost Algorithm Combined with PSO

Abstract: The traditional RBF neural network suffers from slow training speed and low efficiency. This paper proposes an algorithm that improves the RBF neural network with AdaBoost combined with PSO, to … Introduction: The RBF neural network is a locally optimal approximation network with good generalization ability, classification ability, and nonlinear mapping ability, a simple structure, and fast convergence speed [1]; it is often used in areas such as nonlinear system modeling, …
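The truncated abstract only names the ingredients of the proposed method: an RBF network as the base learner, AdaBoost to combine several of them, and PSO for parameter tuning. As a minimal sketch under those assumptions, the class below is an illustrative Gaussian RBF network weak learner that accepts AdaBoost-style sample weights; the name `RBFNet`, the K-means center selection, and the single shared `width` parameter are assumptions not taken from the paper, and PSO (not shown) would presumably search over such parameters.

```python
import numpy as np
from sklearn.cluster import KMeans

class RBFNet:
    """Gaussian RBF network: K-means centers, weighted least-squares output layer.

    Illustrative only -- the truncated abstract does not give the paper's exact
    construction; sample_weight support is what an AdaBoost wrapper needs.
    """
    def __init__(self, n_centers=10, width=1.0):
        self.n_centers = n_centers   # number of hidden RBF units (assumed)
        self.width = width           # shared Gaussian spread (a natural PSO candidate)

    def _phi(self, X):
        # hidden-layer activations: exp(-||x - c||^2 / (2 * width^2))
        d = np.linalg.norm(X[:, None, :] - self.centers[None, :, :], axis=2)
        return np.exp(-(d ** 2) / (2.0 * self.width ** 2))

    def fit(self, X, y, sample_weight=None):
        # pick centers by K-means, then solve the linear output weights
        self.centers = KMeans(n_clusters=self.n_centers, n_init=10).fit(X).cluster_centers_
        Phi = self._phi(X)
        W = np.diag(sample_weight) if sample_weight is not None else np.eye(len(y))
        self.w = np.linalg.pinv(Phi.T @ W @ Phi) @ Phi.T @ W @ y
        return self

    def predict(self, X):
        # sign output for labels in {-1, +1}
        return np.sign(self._phi(X) @ self.w)
```

An AdaBoost loop like the one sketched after the citation statement below could then train several such networks on re-weighted samples and combine them by weighted vote.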

Cited by 1 publication (1 citation statement)
References 8 publications
“…The steps of the AdaBoost algorithm [23] culminate in making predictions with the final model. The core of the iterative AdaBoost process is to repeatedly update the sample weight distribution, find the best weak classifier under the current distribution, calculate the error rate of each weak classifier, and finally combine the weak classifiers obtained over many rounds into a strong classifier [24].…”
Section: AdaBoost Ensemble
confidence: 99%