2022
DOI: 10.1142/s0219622022500432
BFRA: A New Binary Hyper-Heuristics Feature Ranks Algorithm for Feature Selection in High-Dimensional Classification Data

Abstract: Feature selection is one of the main issues in machine learning algorithms. In this paper, a new binary hyper-heuristic feature-ranks algorithm, called BFRA, is designed to solve the feature selection problem for high-dimensional classification data. A strong initial population is generated by ranking the features with the initial Laplacian Score (ILR) method. A new operator called AHWF removes zero-importance or redundant features from the population-based solutions. Another new op…

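For context on the ILR initialisation mentioned in the abstract, the sketch below ranks features with the standard Laplacian Score (lower scores indicate features that better preserve local structure). This is an illustrative approximation, not the authors' BFRA/ILR implementation; the neighbourhood size k and the heat-kernel width t are assumed values chosen for the example.

```python
# Illustrative sketch: Laplacian Score feature ranking, the criterion the
# abstract says seeds BFRA's initial population (ILR). Not the authors' code.
import numpy as np

def laplacian_score(X, k=5, t=1.0):
    """Return the Laplacian Score of each column of X (lower = more relevant)."""
    n, d = X.shape
    # Pairwise squared Euclidean distances between samples.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
    # k-nearest-neighbour similarity graph with a heat (RBF) kernel.
    S = np.zeros((n, n))
    for i in range(n):
        nn = np.argsort(sq[i])[1:k + 1]        # skip the point itself
        S[i, nn] = np.exp(-sq[i, nn] / t)
    S = np.maximum(S, S.T)                     # symmetrise the graph
    D = np.diag(S.sum(axis=1))                 # degree matrix
    L = D - S                                  # graph Laplacian
    ones = np.ones(n)
    scores = np.empty(d)
    for r in range(d):
        f = X[:, r]
        # Centre the feature around its degree-weighted mean.
        f_tilde = f - (f @ D @ ones) / (ones @ D @ ones) * ones
        num = f_tilde @ L @ f_tilde
        den = f_tilde @ D @ f_tilde
        scores[r] = num / den if den > 0 else np.inf
    return scores

# Usage: features with the smallest scores would seed an ILR-style
# initial population.
# X = np.random.rand(100, 500); ranking = np.argsort(laplacian_score(X))
```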

Cited by 6 publications (10 citation statements)
References 52 publications
“…The outcomes of functions with fixed dimensions are presented in Table (7). The AVOA-SSA algorithm performed excellently in all functions except the F20 function, according to the results displayed.…”
Section: Results and Discussion for Global Optimization
confidence: 89%
“…The AVOA-SSA algorithm had an excellent performance in the NIKKEI index. Figure (7) shows that the AVOA-SSA algorithm worked well in the early epochs and correctly estimated almost all the areas. It can be seen even in the range of 380 to 410.…”
Section: Results and Discussion for Stock Market Prediction
confidence: 90%
“…It combines precision and recall into a single value and is particularly useful for balancing these metrics. This metric has been used in [22,33,35,39,45,48,51,55,59,60,83,92,98,101,117,132,141,149,155,162,164,167,171,176-178,182,184,185] and mathematically is defined as follows:…”
Section: Classifier Metrics
confidence: 99%
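The quoted statement refers to the F1-score. For context, its standard textbook definition is shown below; this formula is supplied here as background and is not text recovered from the truncated citing paper:

$$F_1 = \frac{2 \cdot \mathrm{Precision} \cdot \mathrm{Recall}}{\mathrm{Precision} + \mathrm{Recall}}, \qquad \mathrm{Precision} = \frac{TP}{TP + FP}, \quad \mathrm{Recall} = \frac{TP}{TP + FN}$$

where TP, FP, and FN denote the counts of true positives, false positives, and false negatives, respectively.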