2013
DOI: 10.1109/tnnls.2012.2227794
Radial Basis Function Network Training Using a Nonsymmetric Partition of the Input Space and Particle Swarm Optimization

Abstract: This paper presents a novel algorithm for training radial basis function (RBF) networks, in order to produce models with increased accuracy and parsimony. The proposed methodology is based on a nonsymmetric variant of the fuzzy means (FM) algorithm, which has the ability to determine the number and locations of the hidden-node RBF centers, whereas the synaptic weights are calculated using linear regression. Taking advantage of the short computational times required by the FM algorithm, we wrap a particle swarm…
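The training scheme described in the abstract separates center selection from weight estimation: once the hidden-node centers are fixed, the output-layer weights follow from ordinary least squares. A minimal sketch of that second stage, with hand-picked Gaussian centers standing in for the paper's FM-selected ones (the FM partition itself is not reproduced here; all names and constants are illustrative):

```python
import numpy as np

def rbf_design_matrix(X, centers, width):
    # Gaussian activations: one column per hidden-node center
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def train_rbf(X, y, centers, width):
    # Synaptic weights via linear regression (least squares),
    # as described in the abstract
    Phi = rbf_design_matrix(X, centers, width)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def predict(X, centers, width, w):
    return rbf_design_matrix(X, centers, width) @ w

# Toy 1-D example: fit y = sin(x) with five hand-picked centers
X = np.linspace(0, 2 * np.pi, 50)[:, None]
centers = np.linspace(0, 2 * np.pi, 5)[:, None]
w = train_rbf(X, np.sin(X[:, 0]), centers, width=1.0)
y_hat = predict(X, centers, width=1.0, w=w)
```

Because the weights are linear in the activations, each candidate center configuration can be scored cheaply, which is what makes wrapping an outer optimizer (such as PSO) around the center-selection step practical.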

Cited by 112 publications (61 citation statements)
References 52 publications
“…However, there is an abundance of proposals, each with its own merits and demerits, such as constructive decay [29], resource allocating networks [55], and the minimum description length principle [56]. Recently, Alexandridis et al. [57] introduced a nonsymmetric approach for partitioning the input space. Their experimental results showed that the nonsymmetric partition can lead to more accurate RBF models with a smaller number of hidden-layer nodes.…”
Section: Orthogonal Least Squares (OLS)
confidence: 99%
“…Radial basis function (RBF) neural networks, owing to their simple architecture and computational efficiency, have attracted the attention of many researchers [19][20][21][22], while particle swarm optimization (PSO) is a global optimization strategy for solving continuous and discrete optimization problems [23][24][25][26][27]. Hybrid learning algorithms combining PSO with RBF neural networks have also been used in many applications [28][29][30][31]. In this paper, the globally optimal parameters of the RBF are determined by means of PSO, thus eliminating the chattering or steady-state error that occurs in RNFCSMC.…”
Section: Introduction
confidence: 99%
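The PSO wrapper mentioned in these excerpts treats a model parameter as a particle position and iterates velocity/position updates toward the swarm's best-known point. A minimal global-best PSO sketch on a one-dimensional toy objective (standing in for an RBF validation error; the function, bounds, and coefficients here are illustrative, not taken from the cited papers):

```python
import numpy as np

rng = np.random.default_rng(0)

def pso_minimize(f, lo, hi, n_particles=20, iters=50, inertia=0.7, c1=1.5, c2=1.5):
    # Standard global-best PSO on a 1-D box [lo, hi]
    x = rng.uniform(lo, hi, n_particles)
    v = np.zeros(n_particles)
    pbest = x.copy()
    pbest_f = np.array([f(xi) for xi in x])
    g = pbest[pbest_f.argmin()]
    for _ in range(iters):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        # Velocity blends inertia, pull toward personal best, pull toward global best
        v = inertia * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        fx = np.array([f(xi) for xi in x])
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        g = pbest[pbest_f.argmin()]
    return g, f(g)

# Toy objective standing in for an RBF cross-validation error surface
best, val = pso_minimize(lambda s: (s - 1.3) ** 2, 0.1, 5.0)
```

In the PSO-RBF hybrids the excerpts describe, `f` would instead train an RBF network at the candidate parameter setting and return its validation error, so each fitness evaluation carries the cost of one model fit.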
“…The search rules implemented by the PSO algorithm are simpler than those of the GA and the AIA, and the PSO algorithm is easy to implement and converges quickly. In light of these advantages, the PSO algorithm has recently attracted considerable attention as an effective optimization tool [11,21,23]. (3) Given the kernel-matrix problem present in SVR, some researchers employ data clustering and data pruning methods to reduce computing time, such as K-Means clustering [12], Fuzzy C-Means [13], and clustering of kernel row vectors [24].…”
Section: Introduction
confidence: 99%
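The data-clustering route mentioned in item (3) replaces the full training set with a small number of representatives before fitting, which shrinks the kernel matrix. A self-contained Lloyd's k-means sketch on synthetic two-cluster data (a generic stand-in, not the specific clustering schemes of [12,13,24]):

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    # Lloyd's algorithm: alternate nearest-center assignment and centroid update
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        labels = ((X[:, None] - centers[None]) ** 2).sum(-1).argmin(1)
        for j in range(k):
            pts = X[labels == j]
            if len(pts):
                centers[j] = pts.mean(0)
    return centers, labels

# Two well-separated synthetic clusters; the k centroids could then
# serve as a reduced training set in place of all 200 points
X = np.vstack([np.random.default_rng(1).normal(m, 0.1, (100, 2))
               for m in (0.0, 5.0)])
centers, labels = kmeans(X, 2)
```

Training a kernel model on the `k` centroids instead of the full dataset reduces the kernel matrix from n-by-n to k-by-k, at the cost of whatever within-cluster detail the representatives discard.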
“…Many researchers have pointed out that three crucial problems in SVR urgently need to be addressed: (1) how to choose or construct an appropriate kernel for forecasting problems [8,9]; (2) how to optimize the parameters of SVR to improve prediction quality [10,11]; (3) how to construct a fast algorithm that can operate on large datasets [12,13]. With unsuitable kernel functions or hyperparameter settings, SVR may produce poor prediction results.…”
Section: Introduction
confidence: 99%