2011 5th International Symposium on Computational Intelligence and Intelligent Informatics (ISCIII)
DOI: 10.1109/isciii.2011.6069747
Distributed MOPSO with a new population subdivision technique for the feature selection

Cited by 16 publications (9 citation statements) | References 12 publications
“…Some literature contributions focus on selection techniques embedded in the optimization procedure to identify a single optimal solution in multi-objective contexts. To this end, Fdhila et al () use subswarms, dynamically subdividing the population of a MOPSO based on the Pareto front. In addition, Mattson and Messac () compare conceptual design alternatives by means of Pareto frontiers for each set of concepts.…”
Section: Discussion
confidence: 99%
“…The proposed algorithm further reduced the number of features and improved the classification performance over the previous method [140] and standard PSO. PSO with multiple swarms sharing experience has also been applied to feature selection [11], [180], but may incur high computational cost.…”
Section: Table III: Categorisation of PSO Approaches, Single Objective
confidence: 99%
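The multi-swarm idea referenced above can be sketched as follows. This is a minimal, illustrative binary PSO with several sub-swarms that periodically share experience by adopting the best solution found across all swarms; the function name, parameters, and sigmoid transfer function are assumptions for illustration, not the method of the cited papers.

```python
import math
import random

def multiswarm_bpso(fitness, n_bits, n_swarms=3, swarm_size=10,
                    iters=60, share_every=10, w=0.7, c1=1.5, c2=1.5,
                    seed=0):
    """Minimal multi-swarm binary PSO sketch (illustrative only).
    Each sub-swarm evolves around its own local best; every
    `share_every` iterations the swarms share experience by adopting
    the best solution found so far across all swarms."""
    rng = random.Random(seed)
    swarms = []  # each swarm: [particles, local-best bits, local-best fitness]
    for _ in range(n_swarms):
        particles = []  # particle: [position, velocity, pbest, pbest fitness]
        for _ in range(swarm_size):
            pos = [rng.randint(0, 1) for _ in range(n_bits)]
            vel = [rng.uniform(-1.0, 1.0) for _ in range(n_bits)]
            particles.append([pos, vel, pos[:], fitness(pos)])
        best = min(particles, key=lambda p: p[3])
        swarms.append([particles, best[2][:], best[3]])
    gbest, gfit = min(((s[1][:], s[2]) for s in swarms), key=lambda t: t[1])
    for t in range(iters):
        for s in swarms:
            for p in s[0]:
                pos, vel = p[0], p[1]
                for i in range(n_bits):
                    vel[i] = (w * vel[i]
                              + c1 * rng.random() * (p[2][i] - pos[i])
                              + c2 * rng.random() * (s[1][i] - pos[i]))
                    # sigmoid transfer function turns velocity into a bit
                    prob = 1.0 / (1.0 + math.exp(-vel[i]))
                    pos[i] = 1 if rng.random() < prob else 0
                f = fitness(pos)
                if f < p[3]:
                    p[2], p[3] = pos[:], f
                    if f < s[2]:
                        s[1], s[2] = pos[:], f
                        if f < gfit:
                            gbest, gfit = pos[:], f
        if (t + 1) % share_every == 0:
            # experience sharing: every sub-swarm adopts the global best
            for s in swarms:
                if gfit < s[2]:
                    s[1], s[2] = gbest[:], gfit
    return gbest, gfit
```

As a toy usage, minimizing the Hamming distance to a fixed target bitmask stands in for a real feature-selection fitness; the periodic sharing step is also where the computational-cost concern raised in [11], [180] originates, since every sub-swarm evaluates its own population each iteration.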
“…For wrapper approaches, many existing works used only the classification performance as the fitness function [11], [134], [135], [137], [138], [139], [140], [142], [160], which led to relatively large feature subsets. In contrast, most of the other fitness functions combine both the classification performance and the number of features into a single fitness function [70], [136], [141], [180], [148], [181]. However, it is difficult to determine in advance the optimal balance between them without a priori knowledge.…”
Section: Table III: Categorisation of PSO Approaches, Single Objective
confidence: 99%
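The single-objective combination described above is commonly a weighted sum. The sketch below shows one such form; the function name, the linear weighting, and the default `alpha` are assumptions for illustration, not taken from the cited papers.

```python
def combined_fitness(error_rate, n_selected, n_total, alpha=0.9):
    """Hypothetical single-objective wrapper fitness (lower is better).
    `alpha` (assumed here, not from any cited paper) trades off
    classification error against the fraction of features kept."""
    return alpha * error_rate + (1 - alpha) * (n_selected / n_total)
```

The difficulty noted in the quote is visible directly: the value of `alpha` fixes the error-versus-subset-size trade-off before the search starts, which is exactly what cannot be chosen optimally without a priori knowledge.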
“…However, it has not been compared with any wrapper algorithm, which can usually obtain higher classification performance than a filter algorithm. Fdhila et al [21] applied multi-swarm PSO to solve feature selection problems. However, the computational cost of the proposed algorithm is high because it involves parallel evolutionary processes and multiple sub-swarms with a relatively large number of particles.…”
Section: B. Entropy and Mutual Information
confidence: 99%