2012
DOI: 10.1016/j.apm.2011.09.066
A hybrid algorithm to optimize RBF network architecture and parameters for nonlinear time series prediction


Cited by 50 publications (20 citation statements)
References 27 publications
“…The parameters of the model can be optimized simultaneously by the LMM and the LSM. The SNPOM is one such method; it divides the search space into two subspaces [33,34]. The SNPOM can accelerate the convergence of parameter optimization in the RBF network.…”
Section: The Proposed Algorithm
confidence: 99%
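The two-subspace idea described in this excerpt can be sketched as follows: the linear output weights are re-solved in closed form by least squares (LSM) inside every evaluation, while a Levenberg-Marquardt-type routine (LMM) refines only the nonlinear parameters (centers and widths). This is a minimal illustration under assumed choices (Gaussian basis, single output, SciPy's `least_squares` as the LM solver), not the authors' SNPOM implementation.

```python
import numpy as np
from scipy.optimize import least_squares

def design_matrix(X, centers, widths):
    """Gaussian basis activations Phi[i, j] = exp(-||x_i - c_j||^2 / (2 s_j^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * widths[None, :] ** 2))

def residuals(theta, X, y, n_hidden, n_in):
    centers = theta[: n_hidden * n_in].reshape(n_hidden, n_in)
    widths = np.abs(theta[n_hidden * n_in:]) + 1e-8
    Phi = design_matrix(X, centers, widths)
    # Linear subspace: output weights follow from least squares
    # for the current centers and widths.
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return Phi @ w - y

def fit_rbf_two_stage(X, y, n_hidden, rng=np.random.default_rng(0)):
    n_in = X.shape[1]
    centers0 = X[rng.choice(len(X), n_hidden, replace=False)]
    widths0 = np.full(n_hidden, X.std())
    theta0 = np.concatenate([centers0.ravel(), widths0])
    # Nonlinear subspace: Levenberg-Marquardt-type refinement of centers/widths.
    sol = least_squares(residuals, theta0, method="lm",
                        args=(X, y, n_hidden, n_in))
    centers = sol.x[: n_hidden * n_in].reshape(n_hidden, n_in)
    widths = np.abs(sol.x[n_hidden * n_in:]) + 1e-8
    w, *_ = np.linalg.lstsq(design_matrix(X, centers, widths), y, rcond=None)
    return centers, widths, w
```

Because the weights never enter the nonlinear search vector, the LM step works in a much smaller space, which is the source of the faster convergence the excerpt attributes to this family of methods.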
“…An optimization algorithm attempts to find an optimal choice that satisfies the defined constraints and maximizes or minimizes an optimization criterion (performance or cost index) [38,40]. Hence, to improve the prediction accuracy and robustness of the RBF network, the network parameters (centers, widths and weights) should be tuned simultaneously [32]. Some existing algorithms for this are given in [32,36,41,42,43].…”
Section: Introduction
confidence: 99%
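As a stand-in for the simultaneous tuning this excerpt recommends, the sketch below packs centers, widths and output weights into one parameter vector and lets a generic gradient-based optimizer adjust them together against a mean-squared prediction error. The Gaussian basis, the L-BFGS-B optimizer and all names here are assumptions for illustration; the cited works use their own (e.g. hybrid) algorithms.

```python
import numpy as np
from scipy.optimize import minimize

def unpack(theta, n_hidden, n_in):
    # theta = [centers (n_hidden*n_in), widths (n_hidden), weights (n_hidden)]
    c = theta[: n_hidden * n_in].reshape(n_hidden, n_in)
    s = np.abs(theta[n_hidden * n_in: n_hidden * n_in + n_hidden]) + 1e-8
    w = theta[n_hidden * n_in + n_hidden:]
    return c, s, w

def mse(theta, X, y, n_hidden, n_in):
    c, s, w = unpack(theta, n_hidden, n_in)
    d2 = ((X[:, None, :] - c[None, :, :]) ** 2).sum(axis=2)
    pred = np.exp(-d2 / (2.0 * s[None, :] ** 2)) @ w
    return np.mean((pred - y) ** 2)

def tune_all(X, y, n_hidden, rng=np.random.default_rng(0)):
    n_in = X.shape[1]
    theta0 = np.concatenate([
        X[rng.choice(len(X), n_hidden, replace=False)].ravel(),  # centers
        np.full(n_hidden, X.std()),                               # widths
        rng.normal(scale=0.1, size=n_hidden),                     # weights
    ])
    res = minimize(mse, theta0, args=(X, y, n_hidden, n_in), method="L-BFGS-B")
    return unpack(res.x, n_hidden, n_in)
```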
“…RBF networks have many remarkable characteristics, such as a simple network structure, strong learning capacity, good approximation capability and fast learning speed. The difficulty in applying RBF networks lies in network training, which requires properly selecting and estimating the parameters, including the centers and widths of the basis functions and the neuron connection weights [32,36,37]. To find the most appropriate parameters, an optimization algorithm can be used [38,39].…”
Section: Introduction
confidence: 99%
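For reference, here is a minimal sketch of the quantities this excerpt lists (basis-function centers, widths and output connection weights) in a one-step-ahead time-series setting. The Gaussian basis, the lag-vector inputs and the synthetic sine series are illustrative assumptions, not material from the cited paper.

```python
import numpy as np

def rbf_predict(X, centers, widths, weights):
    """y_hat(x) = sum_j w_j * exp(-||x - c_j||^2 / (2 s_j^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * widths[None, :] ** 2)) @ weights

# Example: build lag vectors from a noisy sine series and fit the output
# weights by least squares for fixed, data-sampled centers and a shared width.
rng = np.random.default_rng(0)
series = np.sin(0.3 * np.arange(120)) + 0.05 * rng.normal(size=120)
lags = 4
X = np.stack([series[i:i + lags] for i in range(len(series) - lags)])
y = series[lags:]
centers = X[rng.choice(len(X), 8, replace=False)]   # assumed: 8 hidden units
widths = np.full(8, X.std())
Phi = np.exp(-((X[:, None, :] - centers[None, :, :]) ** 2).sum(2) / (2 * widths ** 2))
weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)
# Compare the fitted value for the last lag vector with the actual value.
print(rbf_predict(X[-1:], centers, widths, weights), y[-1])
```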