2007
DOI: 10.1016/j.neunet.2007.05.006

GFAM: Evolving Fuzzy ARTMAP neural networks

Abstract: This paper focuses on the evolution of Fuzzy ARTMAP neural network classifiers, using genetic algorithms, with the objective of improving generalization performance (classification accuracy of the ART network on unseen test data) and alleviating the ART category proliferation problem (the problem of creating more ART network categories than necessary to solve a classification problem). We refer to the resulting architecture as GFAM. We demonstrate through extensive experimentation that GFAM exhibits good gener…

Cited by 13 publications (9 citation statements) | References 35 publications
“…It is also worth mentioning that the categories in FAM, EAM and GAM are allowed to expand only up to a limit controlled by a network parameter denoted as the baseline vigilance parameter (ρ_a). This parameter assumes values in the interval [0,1]. Small values of this parameter allow the creation of large categories, while large values allow only the creation of small categories.…”
Section: The ARTMAP Architectures
mentioning
confidence: 99%
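The role of the baseline vigilance parameter described in this excerpt can be illustrated with a short sketch of the standard Fuzzy ART vigilance (match) test. This is a generic illustration, not code from the cited papers; the function name, the complement-coded toy vectors, and the threshold values are assumptions chosen for the example.

import numpy as np

# Minimal sketch of the standard Fuzzy ART vigilance (match) test, assuming
# complement-coded inputs I = [a, 1 - a] of dimension M (so |I| = M).
# Not taken from the cited papers; names and values are illustrative.

def passes_vigilance(I, w_j, rho_a):
    # Category j may expand to cover input I only if |I ^ w_j| / |I| >= rho_a,
    # where ^ denotes the component-wise minimum (fuzzy AND).
    match = np.minimum(I, w_j).sum() / I.sum()
    return match >= rho_a

a = np.array([0.3, 0.7])
I = np.concatenate([a, 1.0 - a])              # complement coding, |I| = 2
w_j = np.array([0.2, 0.6, 0.5, 0.1])          # an existing category template
# match = 1.4 / 2.0 = 0.7
print(passes_vigilance(I, w_j, rho_a=0.5))    # True: small rho_a permits large categories
print(passes_vigilance(I, w_j, rho_a=0.95))   # False: large rho_a permits only small ones

The test is easier to satisfy for small rho_a, which is why small values of the baseline vigilance lead to fewer, larger categories and large values lead to many small ones.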
“…GENETIC ARTMAP (GART) architectures were introduced in [1] and [2]. It was shown that these architectures were able to achieve very competitive classification accuracy and exceptionally small-sized classifiers when compared to other classifiers in the literature (see [3], [4]).…”
mentioning
confidence: 99%
“…To alleviate these problems, we introduced genetic fuzzy ARTMAP (GFAM) in [5] and [6]. GFAM uses a genetic algorithm (GA) (see [7]) to simultaneously evolve both the weights and the topology of ART neural networks.…”
mentioning
confidence: 99%
“…GFAM uses a genetic algorithm (GA) (see [7]) to simultaneously evolve both the weights and the topology of ART neural networks. In [6], we extended the ideas of genetically engineering FAM (GFAM, in [5]) to EAM and GAM, and introduced several improvements, which resulted in significant gains in efficiency.…”
mentioning
confidence: 99%
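Since the excerpts above describe GFAM only at a high level (a GA that evolves both the weights and the topology of ART networks), the following is a rough sketch of what such an evolutionary loop could look like; it is not the published GFAM procedure. Each chromosome is assumed to be a list of (template, label) categories, so perturbing templates changes the weights while pruning or exchanging categories changes the topology. The nearest-template classifier, the fitness function, the operators, and all constants are placeholder assumptions.

import random
import numpy as np

# Rough sketch only (assumed design, not the published GFAM operators):
# a chromosome is a list of categories (template vector, class label).
# Assumes a population of at least four non-empty chromosomes.

def classify(chrom, x):
    # Nearest-template prediction stands in for the full ART choice/match cycle.
    templates = np.array([w for w, _ in chrom])
    return chrom[int(np.argmin(np.abs(templates - x).sum(axis=1)))][1]

def fitness(chrom, X, y):
    acc = np.mean([classify(chrom, x) == t for x, t in zip(X, y)])
    return acc - 0.01 * len(chrom)            # reward accuracy, penalize category count

def evolve(pop, X, y, gens=50, p_mut=0.1, p_del=0.05):
    for _ in range(gens):
        pop = sorted(pop, key=lambda c: fitness(c, X, y), reverse=True)
        parents, children = pop[: len(pop) // 2], []
        while len(parents) + len(children) < len(pop):
            a, b = random.sample(parents, 2)
            # Crossover: exchange whole categories between two parents.
            child = a[: random.randrange(1, len(a) + 1)] + b[random.randrange(len(b)):]
            # Mutation: perturb template weights, keeping them in [0, 1].
            child = [(np.clip(w + np.random.normal(0.0, 0.01, w.shape), 0.0, 1.0), lab)
                     if random.random() < p_mut else (w, lab)
                     for w, lab in child]
            # Deletion: occasionally prune a category, changing the topology.
            if len(child) > 1 and random.random() < p_del:
                child.pop(random.randrange(len(child)))
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda c: fitness(c, X, y))

In such a scheme the population would presumably be seeded with categories taken from several trained FAM, EAM, or GAM networks (for example, networks trained with different baseline vigilance values); any list of such chromosomes, together with a validation set X, y, could then be passed to evolve.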