2017
DOI: 10.1016/j.swevo.2017.04.002
A hybrid algorithm using ant and bee colony optimization for feature selection and classification (AC-ABC Hybrid)

Cited by 190 publications (74 citation statements)
References 26 publications
“…Fluctuations may occur when a huge number of fireflies is attracted to a single light-emission source, and the search process becomes time-consuming. To overcome these issues, the neighborhood attraction FA (NaFA) was introduced, in which fireflies are attracted only to a few brighter fireflies outlined by their previous neighbors [62].…”
Section: Firefly Algorithm (FA)
confidence: 99%
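The neighborhood-attraction mechanism described in the excerpt above can be sketched in a few lines. This is a minimal illustration of the idea, not the NaFA implementation from [62]; `brightness`, `beta`, and `alpha` are assumed names for the objective values and the standard FA attraction and randomization coefficients, and the neighborhood is taken as a ring of `k` fireflies on each side:

```python
import random

def nafa_step(positions, brightness, k=2, beta=0.5, alpha=0.1):
    """One neighborhood-attraction step (sketch): each firefly moves only
    toward brighter fireflies among its k ring neighbors on each side,
    instead of toward every brighter firefly in the swarm."""
    n = len(positions)
    new_positions = []
    for i in range(n):
        x = positions[i]
        for off in range(-k, k + 1):
            if off == 0:
                continue
            j = (i + off) % n  # ring neighborhood index
            if brightness[j] > brightness[i]:
                # attraction toward the brighter neighbor plus a random kick
                x = x + beta * (positions[j] - x) + alpha * (random.random() - 0.5)
        new_positions.append(x)
    return new_positions
```

Restricting attraction to the ring neighborhood is what reduces the oscillation: a firefly is pulled by at most 2k neighbors per step rather than by every brighter member of the swarm.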
“…Ghosh et al [32] proposed an adaptive Differential Evolution (DE) algorithm for feature subset selection in hyperspectral image data, in which the parameters of DE are adjusted by the algorithm itself depending on the type of problem at hand. Shunmugapriya and Kanmani [33] proposed a hybrid algorithm that combines ACO and the Artificial Bee Colony (ABC) algorithm for feature subset selection in classification, in which the bees exploit the feature subsets constructed by the ants to find the best ant of the colony, and each bee adopts an ant's subset as its food source. Zorarpacı and Özel [34] proposed a hybrid algorithm that combines ABC and DE for feature subset selection in classification.…”
Section: Literature Review
confidence: 99%
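The ant-and-bee interplay described in the excerpt can be caricatured as a two-phase loop: ants construct candidate feature subsets, and bees exploit those subsets as food sources. The sketch below is only a schematic reading of the excerpt, not the AC-ABC algorithm of [33]: `toy_fitness` is a hypothetical stand-in for wrapper classification accuracy, and the bee phase is reduced to a single one-feature flip per food source:

```python
import random

def toy_fitness(subset):
    # hypothetical stand-in for classifier accuracy:
    # reward features 0 and 2, penalize subset size
    return sum(1 for f in (0, 2) if f in subset) - 0.1 * len(subset)

def ac_abc_sketch(n_features=5, n_ants=8, n_iters=20, seed=1):
    """Schematic ant/bee hybrid loop (not the authors' method)."""
    rng = random.Random(seed)
    best, best_fit = frozenset(), float("-inf")
    for _ in range(n_iters):
        # ant phase: each ant constructs a candidate feature subset
        ants = [frozenset(f for f in range(n_features) if rng.random() < 0.5)
                for _ in range(n_ants)]
        # bee phase: bees exploit each ant's subset as a food source,
        # probing a one-feature-flip neighbour and keeping the better one
        for subset in ants:
            f = rng.randrange(n_features)
            neighbour = subset ^ {f}
            cand = max((subset, neighbour), key=toy_fitness)
            fit = toy_fitness(cand)
            if fit > best_fit:
                best, best_fit = cand, fit
    return set(best), best_fit
```

Under this toy fitness the loop reliably recovers the two rewarded features; in the wrapper setting the fitness would instead be cross-validated classifier accuracy on the selected subset.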
“…In the comparison, 17 algorithms are implemented and tested in Matlab 2015b, all based on the well-known k-NN classifier with k = 5. To show the performance of the proposed algorithm, we compare it with the standard GA, the original CSO algorithm [25], the original PSO, DE [32], ABC-DE [34], ACO-FS [31], ACO-ABC [33], GSA [56], BQIGSA [57], four PSO variants proposed by Xue et al. for bi-objective feature subset selection [36] (Xue1-PSO, Xue2-PSO, Xue3-PSO, and Xue4-PSO), and three two-stage algorithms: 2S-GA [40], 2S-HGA [41], and 2S-PSO [39]. According to Xue et al [36], the major difference between these variants is the number of features selected in the initial swarm: Xue1-PSO uses the normal initialization method, in which approximately half of the features are chosen in each particle; Xue2-PSO applies a small initialization method, in which only about 10% of the features are chosen in each particle; Xue3-PSO applies a heavy initialization method, in which more than half (about 2/3) of the features are chosen in each particle; and Xue4-PSO applies a combined initialization, in which the majority (about 2/3) of the particles are initialized with the small initialization method, while the remaining particles of the swarm are initialized with the heavy initialization method.…”
Section: Dataset Properties and Experimental Settings
confidence: 99%
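The four initialization schemes attributed to Xue et al. [36] in the excerpt are straightforward to express directly. The sketch below is an assumption-laden paraphrase: particles are binary feature masks, and the selection probabilities are the rough fractions quoted above rather than the paper's exact procedure:

```python
import random

def init_swarm(n_particles, n_features, method, rng=None):
    """Swarm initialization schemes for PSO-based feature selection
    (sketch of the normal/small/heavy/mixed variants described above)."""
    rng = rng or random.Random(0)

    def particle(p_select):
        # binary mask: 1 means the feature is selected in this particle
        return [1 if rng.random() < p_select else 0 for _ in range(n_features)]

    if method == "normal":   # ~half of the features selected (Xue1-PSO)
        return [particle(0.5) for _ in range(n_particles)]
    if method == "small":    # ~10% of the features selected (Xue2-PSO)
        return [particle(0.1) for _ in range(n_particles)]
    if method == "heavy":    # ~2/3 of the features selected (Xue3-PSO)
        return [particle(2 / 3) for _ in range(n_particles)]
    if method == "mixed":    # ~2/3 of particles small, the rest heavy (Xue4-PSO)
        cut = round(n_particles * 2 / 3)
        return ([particle(0.1) for _ in range(cut)] +
                [particle(2 / 3) for _ in range(n_particles - cut)])
    raise ValueError(f"unknown method: {method}")
```

The point of the mixed scheme is to bias most of the swarm toward small subsets while keeping a few heavily initialized particles to preserve coverage of large subsets.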
“…Feature selection methods can be categorized into filter and wrapper approaches. The filter approach uses measures such as dependency, mutual information, distance, and other information-theoretic criteria to score features [14]. Unlike the filter approach, the wrapper approach employs a classifier as the learning algorithm and optimizes classification performance by selecting the relevant features.…”
Section: Introduction
confidence: 99%
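A minimal filter-style selector along the lines of the excerpt can be built from empirical mutual information alone. The sketch below assumes discrete-valued features and is an illustration of the filter idea, not the method of [14]:

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Empirical mutual information between two discrete sequences, in nats."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    return sum((c / n) * math.log((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def filter_select(features, labels, k):
    """Filter approach (sketch): rank each feature column by its MI with
    the class label and keep the indices of the top k features."""
    scores = [(mutual_information(col, labels), i)
              for i, col in enumerate(features)]
    return [i for _, i in sorted(scores, reverse=True)[:k]]
```

No classifier is trained at any point, which is exactly what distinguishes the filter approach from a wrapper: the score depends only on the data, so it is cheap but blind to interactions the downstream classifier could exploit.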