2019
DOI: 10.1016/j.eswa.2019.06.044

Cost-sensitive feature selection using two-archive multi-objective artificial bee colony algorithm

Cited by 163 publications (55 citation statements)
References 57 publications

“…A return-cost binary firefly algorithm with a Pareto dominance technique was proposed in [47], and a set of public datasets was used for the experiments. More recently, in 2019, a multi-objective artificial bee colony algorithm with two archives (a leader archive and an external archive) was proposed to minimize feature cost and maximize classification accuracy [48]. The proposed method was evaluated on eight datasets and compared with three multi-objective approaches and two traditional methods.…”
Section: Related Work (mentioning)
confidence: 99%
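
The statement above summarizes the paper's bi-objective formulation: each candidate feature subset is judged by the total cost of the selected features (to be minimized) and the classification accuracy it achieves (to be maximized), with Pareto dominance deciding which candidates survive. The Python sketch below illustrates that dominance test only; the encoding, the Candidate container, and the function names are hypothetical and not taken from the paper.

```python
# A minimal sketch (not the authors' implementation) of the bi-objective
# comparison described above. All names here are hypothetical.

from dataclasses import dataclass
from typing import List


@dataclass
class Candidate:
    features: List[int]   # indices of the selected features (illustrative encoding)
    cost: float           # total acquisition cost of the selected features (minimize)
    accuracy: float       # classification accuracy obtained with them (maximize)


def dominates(a: Candidate, b: Candidate) -> bool:
    """True if `a` Pareto-dominates `b`: no worse on both objectives and
    strictly better on at least one (cost minimized, accuracy maximized)."""
    no_worse = a.cost <= b.cost and a.accuracy >= b.accuracy
    strictly_better = a.cost < b.cost or a.accuracy > b.accuracy
    return no_worse and strictly_better


def pareto_front(pop: List[Candidate]) -> List[Candidate]:
    """Return the non-dominated candidates of a population."""
    return [p for p in pop if not any(dominates(q, p) for q in pop if q is not p)]
```

For example, a subset costing 3.0 with 0.90 accuracy and a subset costing 5.0 with 0.95 accuracy are mutually non-dominated and both stay on the front, whereas a subset costing 6.0 with 0.90 accuracy is dominated by the first and would be discarded.
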
“…A disadvantage of ABC is that it suffers from premature convergence; thus, CS assists ABC by replacing the weaker solutions with better ones. In addition, [21] used a two-archive-guided multi-objective artificial bee colony algorithm for cost-sensitive feature selection. Two archives, a leader archive and an external archive, are utilized to enhance the search capability of the algorithm.…”
Section: Related Work (mentioning)
confidence: 99%
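
The two-archive mechanism summarized above maintains an external archive of non-dominated solutions alongside a leader archive that guides the search. As a rough, hedged illustration of the external-archive bookkeeping only (the leader-selection step and the actual update rules of [21]/[48] are not reproduced), the sketch below inserts a new solution only if nothing in the archive dominates it and evicts any members it dominates; the max_size truncation policy is a hypothetical placeholder.

```python
# A hedged sketch of external-archive bookkeeping only; the leader-selection
# step and the actual update rules of the cited algorithm are not reproduced.
# `Solution`, `update_archive`, and `max_size` are illustrative placeholders.

from typing import List, Tuple

Solution = Tuple[float, float]  # (feature_cost, classification_accuracy)


def dominates(a: Solution, b: Solution) -> bool:
    """Cost (index 0) is minimized, accuracy (index 1) is maximized."""
    return (a[0] <= b[0] and a[1] >= b[1]) and (a[0] < b[0] or a[1] > b[1])


def update_archive(archive: List[Solution], new: Solution,
                   max_size: int = 50) -> List[Solution]:
    """Insert `new` unless an archive member dominates it; evict members it dominates."""
    if any(dominates(a, new) for a in archive):
        return archive
    archive = [a for a in archive if not dominates(new, a)]
    archive.append(new)
    # Hypothetical truncation policy: keep the cheapest solutions when over capacity.
    if len(archive) > max_size:
        archive = sorted(archive, key=lambda s: s[0])[:max_size]
    return archive
```
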
“…Various methods have been developed for the task of feature selection in the unsupervised setting. Most existing works divide these algorithms into three groups according to the selection strategy: filter [2], [4], [10], wrapper, and embedded approaches [11]-[13]. Moreover, in the absence of supervised information, one of the key problems in unsupervised feature selection is to design an appropriate criterion to guide the search for relevant and informative features.…”
Section: Related Work, A. Unsupervised Feature Selection (mentioning)
confidence: 99%
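
The excerpt above distinguishes filter, wrapper, and embedded strategies and notes that, without labels, unsupervised feature selection needs an intrinsic ranking criterion. As a minimal, hedged illustration of the filter idea only, the snippet below ranks features by variance; this criterion and the function name are placeholders chosen for brevity, not the criterion proposed by the cited works.

```python
# A minimal illustration of the filter strategy only: with no labels available,
# features are ranked by an intrinsic criterion. Variance is used here purely as
# a simple stand-in score; it is not the criterion proposed by the cited works.

import numpy as np


def variance_filter(X: np.ndarray, k: int) -> np.ndarray:
    """Return indices of the k features with the largest variance.

    X : (n_samples, n_features) data matrix, no labels required.
    """
    scores = X.var(axis=0)               # one intrinsic score per feature
    return np.argsort(scores)[::-1][:k]  # top-k features by score


# Example: features 2 and 4 are given the largest spread, so they get selected.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5)) * np.array([0.1, 1.0, 3.0, 0.5, 2.0])
print(variance_filter(X, k=2))           # expected: [2 4]
```
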