Seventh International Conference on Intelligent Systems Design and Applications (ISDA 2007) 2007
DOI: 10.1109/isda.2007.4389688
Two-Step Particle Swarm Optimization to Solve the Feature Selection Problem

Abstract: In this paper we propose a new model of Particle Swarm Optimization called Two-Step PSO. The basic idea is to split the heuristic search performed by the particles into two stages. We study the performance of this new algorithm on the Feature Selection problem using the reduct concept of Rough Set Theory. Experimental results show that the Two-Step approach improves over the standard PSO model in calculating reducts, at the same computational cost.
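The two-stage search described in the abstract can be illustrated with a minimal binary-PSO sketch. The paper does not specify how the two stages differ, so the split below (an exploratory stage with high inertia followed by a refinement stage with low inertia), the toy per-feature relevance scores, and the fitness weights are all assumptions for illustration; a real implementation would score candidate subsets with the rough-set dependency degree computed from a decision table.

```python
import math
import random

random.seed(42)

# Toy per-feature "relevance" scores standing in for the rough-set
# dependency degree; a real implementation would evaluate subsets
# against a decision table (assumption for illustration only).
RELEVANCE = [0.9, 0.1, 0.8, 0.2, 0.7, 0.05, 0.6, 0.15]
N = len(RELEVANCE)
VMAX = 6.0  # velocity clamp, a common binary-PSO safeguard

def fitness(mask):
    """Reward the mean relevance of the selected features and
    penalise the subset size (weights 0.85/0.15 are assumptions)."""
    k = sum(mask)
    if k == 0:
        return 0.0
    quality = sum(r for m, r in zip(mask, RELEVANCE) if m) / k
    return 0.85 * quality + 0.15 * (N - k) / N

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def pso_stage(swarm, vel, pbest, gbest, iters, w, c1=2.0, c2=2.0):
    """One stage of binary PSO; the inertia weight w controls how
    exploratory the stage is."""
    for _ in range(iters):
        for i, x in enumerate(swarm):
            for d in range(N):
                r1, r2 = random.random(), random.random()
                v = (w * vel[i][d]
                     + c1 * r1 * (pbest[i][d] - x[d])
                     + c2 * r2 * (gbest[d] - x[d]))
                vel[i][d] = max(-VMAX, min(VMAX, v))
                x[d] = 1 if random.random() < sigmoid(vel[i][d]) else 0
            if fitness(x) > fitness(pbest[i]):
                pbest[i] = x[:]
            if fitness(x) > fitness(gbest):
                gbest = x[:]
    return gbest

swarm = [[random.randint(0, 1) for _ in range(N)] for _ in range(10)]
vel = [[0.0] * N for _ in range(10)]
pbest = [x[:] for x in swarm]
gbest = max(pbest, key=fitness)[:]

# Stage 1: broad exploration (high inertia); Stage 2: refinement
# (low inertia), continuing from the stage-1 swarm and bests.
gbest = pso_stage(swarm, vel, pbest, gbest, iters=30, w=0.9)
gbest = pso_stage(swarm, vel, pbest, gbest, iters=30, w=0.4)
print("selected mask:", gbest, "fitness:", round(fitness(gbest), 3))
```

Reusing the stage-1 swarm, velocities, and personal bests as the starting point for stage 2 keeps the total number of fitness evaluations the same as a single-stage run of equal length, consistent with the "same computational cost" claim.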

Cited by 30 publications (45 citation statements). References 23 publications.
“…In the second stage, only the top-ranked features are used as the candidate features for the wrapper algorithm. In [28] a PSO-based feature subset selection algorithm is proposed for the classification of high-dimensional cancer microarray data. In the first stage, the dataset is clustered by the k-means algorithm, then a filter algorithm is applied to rank each gene in every cluster.…”
Section: Literature Review
confidence: 99%
“…The evaluation function of the mesh nodes is displayed in expression (2) and is the same as that used in [15] and [4]. It considers the number of attributes included in the feature subset R represented by node n and the quality of the classification associated with R. The goal is to maximize the function below.…”
Section: A Reduct Is a Minimal Subset of Features B ⊆ A Such That In…
confidence: 99%
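The node evaluation described above balances the size of the feature subset R against its classification quality. A common form in the rough-set PSO literature weights the dependency degree γ_R against the fraction of attributes left out; the function below is a hypothetical sketch of that trade-off (the name `node_fitness`, the weight α = 0.9, and the linear form are assumptions, since expression (2) itself is not reproduced in the excerpt).

```python
def node_fitness(gamma_r, n_selected, n_total, alpha=0.9):
    """Hypothetical trade-off in the spirit of expression (2):
    reward the classification quality gamma_r of subset R while
    penalising its size relative to the full attribute set."""
    return alpha * gamma_r + (1.0 - alpha) * (n_total - n_selected) / n_total

# A subset of 3 of 10 attributes with perfect quality scores
# 0.9 * 1.0 + 0.1 * 0.7 = 0.97; a larger subset of equal quality
# scores lower, matching the stated maximization goal.
print(node_fitness(1.0, 3, 10))
```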
“…The dynamic nature of our proposal manifests in the generation of (i) the initial mesh; during the mesh expansion in each cycle, a weight w is defined using expression (3), as in [14], [15] and [4].…”
Section: Generation of Nodes in DMO
confidence: 99%
“…Another feature selection approach, based on Scatter Search (SSAR), is proposed by Jue et al. in [18]. The Particle Swarm Optimization (PSO) algorithm, proposed by Kennedy and Eberhart [19], has been used in feature selection approaches such as [20][21][22][23]. Based on the biological behaviour of bees, Karaboga [24] proposed an optimisation approach called the Artificial Bee Colony (ABC) algorithm.…”
Section: Introduction
confidence: 99%