2004
DOI: 10.1016/s0165-0114(03)00112-x
A fast genetic method for inducting descriptive fuzzy models

Cited by 13 publications (10 citation statements)
References 16 publications
“…The method is accurate and has been shown to be as fast as some ad-hoc learning methods (Sánchez and Otero 2004). On the other hand, when the datasets are imprecise the backfitting results may not be appropriate, as we explain in the next section.…”
Section: Backfitting in Crisp Datasets
confidence: 99%
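
For orientation, the backfitting scheme referred to above fits one additive component at a time against the partial residual left by all the others; in the cited 2004 paper the per-rule fit is carried out by a genetic search, whereas the sketch below (plain Python, hypothetical names, a simple least-squares inner step) is only an assumption-laden outline of the generic loop, not the published algorithm:

import numpy as np

def backfit_fuzzy_rules(X, y, memberships, n_sweeps=10):
    # memberships: list of callables mu_m(X) -> firing strengths in [0, 1]
    # (rule antecedents are assumed fixed; only the constant consequents
    # b_m of the additive model sum_m mu_m(x) * b_m are refitted here)
    M = len(memberships)
    mu = np.column_stack([m(X) for m in memberships])   # shape (n_samples, M)
    b = np.zeros(M)                                     # rule consequents
    for _ in range(n_sweeps):
        for m in range(M):
            # partial residual: what every rule except m leaves unexplained
            partial = y - mu @ b + mu[:, m] * b[m]
            # least-squares fit of the single consequent b_m to that residual
            denom = np.sum(mu[:, m] ** 2)
            if denom > 0:
                b[m] = np.sum(mu[:, m] * partial) / denom
    return b

The fitted model then predicts with mu @ b, i.e. each rule contributes its firing strength times its consequent.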
“…The fuzzy rule learning algorithms in Sánchez et al. (2006) are: Wang and Mendel (1992) with importance degrees 'maximum', 'mean' and 'product of maximum and mean' (WM1, WM2 and WM3, respectively); the same three versions of Cordón and Herrera's method (Cordón and Herrera 2000) (CH1, CH2, CH3); Nozaki, Ishibuchi and Tanaka's fuzzy rule learning (Nozaki et al. 1997) (NIT); TSK rules (Takagi and Sugeno 1985) optimized with Weighted Least Squares and Genetic Backfitting (Sánchez and Otero 2004); and MOSA-based backfitting (Sánchez and Villar 2008; Sánchez et al. 2006) (BMO). The same reference includes Linear Regression (LIN), Quadratic Regression (QUA) and a Conjugate-Gradient trained Multilayer Perceptron (NEU).…”
Section: Influence of the Stochastic Noise
confidence: 99%
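
To make one item of that comparison concrete, the sketch below shows the usual weighted least-squares fit of first-order TSK consequents, with each rule's normalized firing strength acting as the weight; the Gaussian antecedents, the ridge term and the function names are assumptions for illustration, not the implementation evaluated in the cited study:

import numpy as np

def fit_tsk_consequents(X, y, centers, widths, ridge=1e-8):
    # centers, widths: (M, d) arrays defining Gaussian antecedents
    # (product over the d input dimensions); returns the normalized firing
    # strengths and an (M, d+1) array of consequents [a_m, b_m]
    n, d = X.shape
    M = centers.shape[0]
    fire = np.empty((n, M))
    for m in range(M):
        fire[:, m] = np.exp(-0.5 * np.sum(((X - centers[m]) / widths[m]) ** 2, axis=1))
    weights = fire / np.maximum(fire.sum(axis=1, keepdims=True), 1e-12)
    Xb = np.hstack([X, np.ones((n, 1))])          # inputs plus a bias column
    coeffs = np.zeros((M, d + 1))
    for m in range(M):
        W = weights[:, m]
        # weighted normal equations with a small ridge term for stability
        A = Xb.T @ (W[:, None] * Xb) + ridge * np.eye(d + 1)
        coeffs[m] = np.linalg.solve(A, Xb.T @ (W * y))
    return weights, coeffs

Prediction is then the firing-strength-weighted sum of the rule outputs, y_hat(x) = sum_m w_m(x) (a_m^T x + b_m).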
“…In other publications [4,5], following the work of Friedman et al. [6], Adaboost is regarded as a forward stepwise estimation of the statistical parameters defining a logit transform of a Generalized Additive Model, and this property is used to extend this last estimation to learn fuzzy models in regression problems. A similar statistical interpretation was used later to improve the fuzzy Adaboost algorithm, again in classification problems.…”
Section: Introduction
confidence: 99%
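
The statistical reading mentioned in this excerpt is the one established by Friedman, Hastie and Tibshirani: Adaboost performs a forward stagewise fit of an additive model whose population minimizer is half the log-odds, so the boosted score is a logit transform of a Generalized Additive Model. In outline (classification case only, restated here as a sketch rather than the regression extension the citing paper develops):

F_M(x) = \sum_{m=1}^{M} \alpha_m h_m(x), \qquad
(\alpha_m, h_m) = \arg\min_{\alpha,\,h} \sum_i \exp\!\bigl(-y_i\,[\,F_{m-1}(x_i) + \alpha\, h(x_i)\,]\bigr),

F^{*}(x) = \tfrac{1}{2}\,\log\frac{P(y=1 \mid x)}{P(y=-1 \mid x)}
\quad\Longleftrightarrow\quad
P(y=1 \mid x) = \frac{1}{1 + e^{-2F^{*}(x)}}.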
“…Neuro-fuzzy systems are one of the most successful and visible directions of the efforts [2]. A different approach to hybridization leads to evolutionary fuzzy systems (EFSs) [1,3-6]. EFSs are basically fuzzy systems augmented by learning processes based on evolutionary algorithms, such as the genetic algorithm (GA), evolutionary programming (EP), evolutionary strategies (ES), particle swarm optimization (PSO) [7], and any hybridization form of the above evolutionary algorithms.…”
Section: Introduction
confidence: 99%
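
As a minimal illustration of the fuzzy-system-plus-evolutionary-learning combination described above (not any particular EFS from the cited surveys), the sketch below tunes the centers and constant consequents of a one-input fuzzy model with a mutation-only evolutionary loop, truncation selection and squared-error fitness; every name and parameter in it is an assumption chosen for brevity:

import numpy as np

rng = np.random.default_rng(0)

def tri(x, a, b, c):
    # triangular membership function with break points a <= b <= c
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                 (c - x) / (c - b + 1e-12)), 0.0)

def predict(x, centers, consequents, width=0.4):
    # zero-order fuzzy model: firing-strength-weighted average of consequents
    mu = np.stack([tri(x, c - width, c, c + width) for c in centers])
    return (mu * consequents[:, None]).sum(axis=0) / np.maximum(mu.sum(axis=0), 1e-12)

def evolve_fuzzy_model(x, y, n_rules=5, pop=40, gens=100, sigma=0.05):
    # each chromosome holds n_rules centers followed by n_rules consequents;
    # the best half survives each generation, children are Gaussian mutations
    def mse(ch):
        return np.mean((predict(x, ch[:n_rules], ch[n_rules:]) - y) ** 2)
    P = rng.uniform(0.0, 1.0, (pop, 2 * n_rules))
    for _ in range(gens):
        P = P[np.argsort([mse(ch) for ch in P])][:pop // 2]      # truncation selection
        P = np.vstack([P, P + rng.normal(0.0, sigma, P.shape)])  # mutation
    best = min(P, key=mse)
    return best[:n_rules], best[n_rules:]

predict(x_new, centers, consequents) evaluates the evolved model; a full EFS would usually add crossover and evolve the rule base structure as well.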