2007
DOI: 10.1109/fuzzy.2007.4295660
Multi-Objective Evolutionary Fuzzy Clustering for High-Dimensional Problems

Cited by 9 publications (12 citation statements)
References 36 publications (36 reference statements)
“…{feature 6} ⊂ {feature 6, feature 9, feature 13} ⊂ {feature 6, feature 9, feature 10, feature 13}. (6) Here, the corresponding features are as follows: 6: average number of rooms per dwelling, 9: index of accessibility to radial highways, 13: percentage of lower status population, and 10: full-value property-tax rate per $10,000. This combination is quite convincing.…”
Section: Results of the Experiments
confidence: 99%
“…In addition, the main drawback is the fuzzy models' relative inefficiency as the size of the data increases, regarding both the number of data points in the data set and the number of features. Moreover, one of the most widely used approaches in fuzzy modeling is the fuzzy C-means (FCM) algorithm for constructing the antecedents of the rules associated with the curse of dimensionality [5,6].…”
Section: Introduction
confidence: 99%
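The fuzzy C-means (FCM) algorithm mentioned in the statement above can be sketched as follows. This is a minimal NumPy implementation of the standard Bezdek update rules (alternating center and membership updates), not the cited authors' code; the function name `fuzzy_c_means` and its parameters are illustrative.

```python
import numpy as np

def fuzzy_c_means(X, n_clusters, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Standard fuzzy C-means: returns cluster centers and the
    membership matrix U, where U[k, i] is the degree to which
    sample k belongs to cluster i (each row sums to 1)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, n_clusters))
    U /= U.sum(axis=1, keepdims=True)          # valid fuzzy partition
    for _ in range(max_iter):
        Um = U ** m                            # fuzzified memberships
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # distance of every sample to every center, floored to avoid /0
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.fmax(d, 1e-12)
        # membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
        ratio = (d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0))
        U_new = 1.0 / ratio.sum(axis=2)
        if np.abs(U_new - U).max() < tol:
            return centers, U_new
        U = U_new
    return centers, U
```

Every iteration touches all samples in all features, which illustrates the scalability drawback the citing authors point out: cost grows with both the number of data points and the number of features, so high-dimensional data aggravates both runtime and the curse of dimensionality in the distance computations.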
“…In the earlier approach proposed in [9], Genetic Algorithm (GA) was used to simultaneously select the suitable data points and features for a classification problem. Recently, in [14] the researchers employed a simulated annealing approach to alternate instance selection and feature selection during each step of the search. Finally, more complex approaches combining the data and feature selection were developed in [10,11].…”
Section: A Feature and Instance Selection
confidence: 99%
“…Similarly, many MOPs have a huge number of variables (more than 100 variables [30]); some examples are classification [31], clustering [32], recommendation systems [33], and so on. However, the goal of traditional MOEAs is to solve multi-objective small-scale optimization problems (MOSSOPs); consequently, the traditional algorithms may be incapable of tackling multiobjective large-scale optimization problems (MOLSOPs) because of the "curse of dimensionality".…”
Section: Introduction
confidence: 99%