2009
DOI: 10.1016/j.asoc.2009.05.006
A hybrid learning algorithm with a similarity-based pruning strategy for self-adaptive neuro-fuzzy systems

Cited by 26 publications
(13 citation statements)
References 27 publications
“…Table 1 lists results obtained by five methods: the traditional ANFIS (trained by a gradient-descent back-propagation algorithm), ANFIS-EM-EKS, an adaptive neural network (ANN) trained by a scaled conjugate gradient algorithm, support vector machines (SVM) using Gaussian kernels, and the self-adaptive neuro-fuzzy system (SANFS) [36]. The RMSE and APE values obtained by these methods are given in Table 1, where ANFIS-EM-EKS produces the lowest RMSE and APE values and allows rapid convergence to local optimal values.…”
[Fig. 3 Deviations in rule regions: (a) the grid partition of a two-dimensional input space, (b) the higher deviations of r_jk, (c) the lower deviations of r_jk, (d) the lower deviations of l_jk, (e) the higher deviations of l_jk]
Section: Two-input Nonlinear Sinc Function Approximation
confidence: 99%
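The comparison above ranks the five methods by RMSE and APE. As a minimal sketch, the two metrics can be computed as follows; the exact APE formula used in the cited paper is not given here, so the percentage-of-target definition below is an assumption (one common convention):

```python
import math

def rmse(y_true, y_pred):
    """Root-mean-square error between targets and model outputs."""
    n = len(y_true)
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)

def ape(y_true, y_pred):
    """Average percentage error: mean of |error| / |target|, as a percent.
    (Assumed definition; the cited paper may use a different variant.)"""
    return 100.0 * sum(abs(t - p) / abs(t)
                       for t, p in zip(y_true, y_pred)) / len(y_true)

# Toy targets and predictions, purely for illustration.
y_true = [1.0, 2.0, 4.0]
y_pred = [1.1, 1.9, 4.2]
print(round(rmse(y_true, y_pred), 4))  # → 0.1414
print(round(ape(y_true, y_pred), 4))   # → 6.6667
```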
“…through local element tuning. Leng et al proposed an algorithm for the generation of Takagi-Sugeno-type (TS) neuro-fuzzy systems 20 . The algorithm consists of two stages: in the first stage, an initial structure is adapted from an empty neuron or fuzzy rule set, based on the geometric growth criterion and the ε-completeness of fuzzy rules; then, in the second stage, the initial structure is refined through a hybrid learning algorithm.…”
Section: The State Of The Art Of Constructive Methods
confidence: 99%
“…The training samples of the first synthetic experiment 32 consist of 216 uniformly sampled three-input data from input ranges. In the third synthetic experiment 20 , we tested our approach with the well-known Mackey-Glass time series, with the same experimental configuration. The training data are 1000 input-target data generated between t = 124 and 1123, and another 1000 input-target data between t = 1124 and 2123.…”
Section: Datasets
confidence: 99%
“…more recent approach is the inference model based on a combination of adaptive neural networks and fuzzy logic (Leng et al, 2009).…”
Section: Accepted Manuscript
confidence: 99%