“…(b) Error rate: Defined in detail in Section 4.2.1 and mathematically in Equation (26). This objective function has been pursued in [42–51].…”
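Equation (26) itself is not reproduced in this snippet; as conventionally defined in the feature selection literature, the error rate is the fraction of misclassified samples. A minimal sketch under that assumption (the function name is illustrative, not taken from the cited work):

```python
def error_rate(y_true, y_pred):
    """Fraction of misclassified samples: errors / total predictions."""
    if len(y_true) != len(y_pred):
        raise ValueError("label sequences must have equal length")
    errors = sum(t != p for t, p in zip(y_true, y_pred))
    return errors / len(y_true)

# Example: 2 of 5 predictions are wrong -> error rate 0.4
print(error_rate([0, 1, 1, 0, 1], [0, 1, 0, 0, 0]))  # 0.4
```

As an objective function this is simply 1 minus accuracy, which is why minimizing it and maximizing accuracy select the same feature subsets.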
Section: Bibliometric Analysis
mentioning
confidence: 99%
“…The authors report the average number of features selected in the performed runs and their standard deviation. This metric has been used in [21,27,28,32,34,36–38,40,42,44–51,53–55,57,58,63–72,75,77,78,81–95…
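The metric described in this snippet is straightforward to compute; a minimal sketch (the function name and run representation are illustrative assumptions), where each run is reported as the set of features its best solution selected:

```python
import statistics

def feature_count_summary(runs):
    """Mean and sample standard deviation of the number of features
    selected across independent runs (each run is a collection of
    selected feature identifiers)."""
    counts = [len(r) for r in runs]
    return statistics.mean(counts), statistics.stdev(counts)

# Example: three runs selecting 5, 7, and 6 features respectively
runs = [{"f1", "f2", "f3", "f4", "f5"},
        {"f1", "f2", "f3", "f4", "f5", "f6", "f7"},
        {"f2", "f3", "f4", "f5", "f6", "f8"}]
mean, sd = feature_count_summary(runs)
print(mean, sd)  # mean 6, sample standard deviation 1.0
```

Note that `statistics.stdev` is the sample standard deviation (n − 1 denominator); papers that report the population standard deviation would use `statistics.pstdev` instead.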
Section: Feature Metrics
mentioning
confidence: 99%
“…Discriminant analysis (DA), employed to classify observations into predefined classes based on their features, is discussed in one study [32]. Fuzzy classifiers (FCs), which apply fuzzy logic to handle ambiguous class memberships, are utilized in another work [47]. Additionally, latent Dirichlet allocation (LDA) is used to model latent topics within text corpora, as demonstrated in two articles [73,174].…”
Section: Classifier Categories
mentioning
confidence: 99%
“…It is more exhaustive and aims to find the optimal binarization method for specific scenarios. This approach is exemplified by articles [47,71,78,89,91,92,96,106,107,110,114,124,129,131,132,138,142,144,146,154,165,172,176,179].…”
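The binarization methods this snippet refers to map a continuous metaheuristic position onto a binary feature mask. One common family, though not necessarily the one used in every cited article, applies an S-shaped transfer function followed by stochastic thresholding; a minimal sketch under that assumption:

```python
import math
import random

def s_shaped_transfer(x):
    """S1 transfer function: maps a real value to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def binarize(position, rng=random.random):
    """Turn a continuous position vector into a binary feature mask:
    bit i is 1 (feature selected) with probability T(x_i)."""
    return [1 if rng() < s_shaped_transfer(x) else 0 for x in position]

# Strongly negative components are almost never selected,
# strongly positive ones almost always.
print(binarize([-6.0, 0.0, 6.0]))
```

Surveys of this design space typically compare several transfer-function shapes (S-shaped, V-shaped) crossed with different discretization rules, which is what "finding the optimal binarization method for specific scenarios" amounts to in practice.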
mentioning
confidence: 99%
“…• Fuzzy learning: While this has been used less frequently, with only four instances over the years [36,44,47,53], it offers a unique approach to handling uncertainties and improving adaptability in metaheuristics.…”
Feature selection has become a relevant problem within the field of machine learning. It focuses on selecting a small, necessary, and sufficient subset of features that represents the full feature set, eliminating redundant and irrelevant information. Given the importance of the topic, research on the problem has boomed in recent years, generating a large number of related investigations. Accordingly, this work analyzes 161 articles published between 2019 and 2023 (up to 20 April 2023), emphasizing the formulation of the problem and its performance measures, and proposing classifications for the objective functions and evaluation metrics. Furthermore, an in-depth description and analysis of metaheuristics, benchmark datasets, and practical real-world applications are presented. Finally, in light of recent advances, this review identifies future research opportunities.
The application of optimization theory, and of the algorithms derived from it, has grown alongside the continued advancement of science and technology. Many problems in daily life can be cast as combinatorial optimization problems. Swarm intelligence optimization algorithms have been successful in machine learning, process control, and engineering prediction over the years, and have been shown to handle combinatorial optimization problems efficiently. The chicken swarm optimization algorithm (CSO) is an intelligent optimization method that mimics the organic behavior of chicken flocks. When benchmark problems are used as the objective function, it outperforms several popular intelligent optimization methods such as PSO. To further enhance the algorithm's search performance and accelerate its research and application, this review surveys the concept and development of CSO, its comparison with other metaheuristic algorithms, and its development trends. The basic algorithm model is first described; then, improved chicken swarm optimization algorithms based on algorithm parameters, chaos and quantum optimization, learning strategies, and population diversity are categorized and summarized from both domestic and international literature, along with applications of swarm optimization algorithms in feature extraction, image processing, robotic engineering, wireless sensor networks, and power systems. Second, CSO is evaluated in terms of benefits, drawbacks, and applications relative to other metaheuristic algorithms. Finally, directions for CSO research and development are anticipated.
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations: citations that display the context of the citation and describe whether the article provides supporting or contrasting evidence. scite is used by students and researchers from around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.