Proceedings of the 2002 American Control Conference (IEEE Cat. No.CH37301), 2002
DOI: 10.1109/acc.2002.1023141

Comparison of different growing radial basis functions algorithms for control systems applications

Cited by 11 publications (6 citation statements)
References 13 publications
“…Growing RBF-NN algorithm is one of the approaches to address this problem. The main advantage of this architecture is that their dimensionality is not predefined but grows incrementally along with the complexity of the model (Fravolini et al, 2002). This algorithm was used in the current application.…”
Section: Neural Network (NN) Classification
Mentioning confidence: 99%
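The growing-dimensionality idea this excerpt refers to can be illustrated with a minimal sketch (illustrative Python, not the algorithm of the cited paper): the hidden layer starts empty and a Gaussian neuron is allocated whenever the current prediction error exceeds a threshold, so the network size tracks the complexity of the mapping it is learning. The class name, shared width, and threshold values are assumptions made for the example.

```python
# Minimal sketch of a growing RBF network (illustrative only).
import numpy as np

class GrowingRBF:
    def __init__(self, width=1.0, error_threshold=0.1):
        self.width = width                  # shared Gaussian width (assumed)
        self.error_threshold = error_threshold
        self.centers = []                   # centroid vectors, added on demand
        self.weights = []                   # one output weight per neuron

    def _phi(self, x):
        # Gaussian activations of all current neurons for input x
        return np.array([np.exp(-np.sum((x - c) ** 2) / self.width ** 2)
                         for c in self.centers])

    def predict(self, x):
        if not self.centers:
            return 0.0
        return float(np.dot(self.weights, self._phi(x)))

    def observe(self, x, y, lr=0.1):
        err = y - self.predict(x)
        if abs(err) > self.error_threshold:
            # grow: allocate a new neuron centred at the poorly modelled input;
            # its weight cancels the current error exactly at that point
            self.centers.append(np.asarray(x, dtype=float))
            self.weights.append(err)
        else:
            # otherwise adapt the existing output weights by gradient descent
            phi = self._phi(x)
            self.weights = list(np.asarray(self.weights) + lr * err * phi)

# usage: approximate sin(x) online and let the network size itself
net = GrowingRBF(width=0.5, error_threshold=0.2)
for x in np.linspace(0, 2 * np.pi, 200):
    net.observe(np.array([x]), np.sin(x))
print("neurons allocated:", len(net.centers))
```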
“…This feature is realized by the training process of the network, which iteratively updates the centroid vector by the error information (Fravolini, Campa, Napolitano, & La Cava, 2002). After training, the RBF-NN is able to perform the classification.…”
Section: Background of Neural Network (NN)
Mentioning confidence: 99%
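The error-driven centroid update summarized in this excerpt can be sketched as a gradient step on the squared output error. The form below is the standard gradient for Gaussian basis functions and is only an assumption about the training rule being described; the centers, widths, weights, and learning rate are illustrative.

```python
# Sketch of an error-driven centroid update for a Gaussian RBF layer.
import numpy as np

def rbf_forward(x, centers, widths, weights):
    # activations phi_i(x) = exp(-||x - c_i||^2 / sigma_i^2) and scalar output
    phi = np.exp(-np.sum((x - centers) ** 2, axis=1) / widths ** 2)
    return phi @ weights, phi

def centroid_update_step(x, y, centers, widths, weights, lr=0.05):
    y_hat, phi = rbf_forward(x, centers, widths, weights)
    err = y - y_hat
    # gradient descent on 0.5*err^2 w.r.t. each centroid c_i:
    # c_i <- c_i + lr * err * w_i * phi_i * 2*(x - c_i)/sigma_i^2
    grad = (err * weights * phi * 2.0 / widths ** 2)[:, None] * (x - centers)
    return centers + lr * grad

# usage with random illustrative data
rng = np.random.default_rng(0)
centers = rng.normal(size=(5, 2))
widths = np.ones(5)
weights = rng.normal(size=5)
x, y = rng.normal(size=2), 1.0
centers = centroid_update_step(x, y, centers, widths, weights)
```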
“…Essentially, the EMRAN algorithm allocates neurons in order to decrease the estimation error in regions of the state space where the mapping accuracy is poor. This strategy results in a significant reduction of the number of parameters to be updated on-line, thus reducing the computational burden, and therefore making this architecture particularly suitable for on-line applications [24][25][26][27] . For Gaussian basis functions, the output of each neuron is computed with the expression: …”
Section: B. Extended Minimal Resource Allocating Network (EMRAN)
Mentioning confidence: 99%
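The excerpt truncates the expression it introduces, so the snippet below only shows one common form of the Gaussian neuron output, phi_i(x) = exp(-||x - c_i||^2 / sigma_i^2); it is not quoted from the cited paper, and the center and width values are illustrative.

```python
# One common form of the Gaussian basis function output (illustrative sketch).
import numpy as np

def gaussian_neuron_output(x, center, width):
    """Output of a single Gaussian RBF neuron for input vector x."""
    x, center = np.asarray(x, dtype=float), np.asarray(center, dtype=float)
    return np.exp(-np.sum((x - center) ** 2) / width ** 2)

# usage: the activation decays with distance from the neuron's center
print(gaussian_neuron_output([0.0, 0.0], [0.0, 0.0], 1.0))   # 1.0 at the center
print(gaussian_neuron_output([1.0, 1.0], [0.0, 0.0], 1.0))   # ~0.135 farther away
```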
“…Therefore, the concept of growing neurons is introduced. In this idea, a small number of neurons are assigned to the controller, then, according to the criteria of growing neurons, 3,18 the number of neurons is increased automatically to meet the suitable performance. Criteria for adding one neuron: One new neuron will be added into the current network as long as all of these following criteria are satisfied: (The sub-index i = 1, 2 indicates for different RBF neural networks W_i^T ϕ_i) (1) Current estimation error criteria:…”
Section: IV.B Discretization and Growing Neuron Mechanism
Mentioning confidence: 99%
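The growth test this excerpt begins to list is typically a joint condition in MRAN/EMRAN-style networks: a neuron is added only if the current estimation error is large, the input lies far from every existing center, and the windowed RMS error is also large. The sketch below is a hedged illustration of that pattern with made-up thresholds, not the exact criteria of the cited controller.

```python
# Sketch of a joint neuron-addition test in the style of MRAN/EMRAN growth criteria.
import numpy as np

def should_add_neuron(error, x, centers, error_window,
                      e_min=0.05, d_min=0.3, e_rms_min=0.05):
    # (1) current estimation error is large
    large_error = abs(error) > e_min
    # (2) input is far from the nearest existing center
    far_from_centers = (len(centers) == 0 or
                        min(np.linalg.norm(x - c) for c in centers) > d_min)
    # (3) RMS error over a sliding window is large
    window_rms_high = (np.sqrt(np.mean(np.square(error_window))) > e_rms_min
                       if len(error_window) else True)
    return large_error and far_from_centers and window_rms_high

# usage: one new neuron is added only when all three criteria hold
centers = [np.array([0.0, 0.0])]
print(should_add_neuron(0.2, np.array([1.0, 1.0]), centers, [0.1, 0.2, 0.15]))
```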
“…In Growing Radial Basis Function Neural Networks (GRBFNN), the number of nodes is optimized as well. Mario and his colleagues 3 published a paper that compared and summarized different growing radial basis function (GRBF) algorithms for control system applications. In the textbook, 18 they present a systematic approach for GRBF neural networks control of nonlinear systems utilizing an extended Kalman Filter.…”
Section: Introduction
Mentioning confidence: 99%