2021
DOI: 10.1109/access.2021.3052149
Boosted Whale Optimization Algorithm With Natural Selection Operators for Software Fault Prediction

Abstract: Software fault prediction (SFP) is a challenging process that any successful software project should go through to make sure that all software components are free of faults. In general, soft computing and machine learning methods are useful in tackling this problem. The size of fault data is usually huge, since it is obtained from mining software historical repositories, and the data consists of a large number of features (metrics). Determining the most valuable features (i.e., Feature Selection (FS)) is an excellent so…

Cited by 56 publications (25 citation statements)
References 90 publications
“…The model reaches an AUC value of 1 with an average computational cost of 107 seconds. In [19], improved versions of WOA are proposed by merging it with a single-point crossover technique and five different FS selection schemes, i.e., random, tournament, roulette wheel, stochastic universal sampling, and linear rank. The computational cost of this model is high.…”
Section: Machine Learning Techniques for Defect Prediction
mentioning
confidence: 99%
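The snippet above names five selection schemes combined with single-point crossover. As a minimal sketch (not the paper's implementation; function names and the binary-mask representation are assumptions), two of these operators can be written for binary feature masks as:

```python
import random

def roulette_wheel(population, fitnesses):
    """Pick one individual with probability proportional to its fitness."""
    total = sum(fitnesses)
    pick = random.uniform(0, total)
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit
        if running >= pick:
            return individual
    return population[-1]  # guard against floating-point rounding

def single_point_crossover(parent_a, parent_b):
    """Swap the tails of two equal-length feature masks at a random cut point."""
    cut = random.randrange(1, len(parent_a))
    return parent_a[:cut] + parent_b[cut:], parent_b[:cut] + parent_a[cut:]
```

Tournament, stochastic universal sampling, and linear-rank selection follow the same pattern: each returns a parent mask, which crossover then recombines.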
“…The most promising methods are ML algorithms such as the K-Nearest Neighbor (KNN), Support Vector Machine (SVM), Naïve Bayes (NB), and Logistic Regression (LR) [18][19][20][21], as well as ensemble classifiers [22][23][24][25]; various feature selection techniques [16,18,20,21] are also used in research [14]. Machine learning techniques are at the heart of data mining and have been used successfully to solve complicated problems in both industry and research [16].…”
Section: Introduction
mentioning
confidence: 99%
“…WOA has shown high exploration ability. Unlike other meta-heuristic algorithms, WOA updates the position vector of a whale (solution) in the exploration stage with respect to the position vector of a randomly chosen search agent rather than the best search agent discovered so far [17,[34][35][36]. Like other meta-heuristic algorithms, however, WOA has drawbacks such as premature convergence and a tendency to fall into local optima.…”
Section: Introduction
mentioning
confidence: 99%
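The exploration move described above (moving relative to a random agent, not the best-so-far) corresponds to the standard WOA update X(t+1) = X_rand − A·|C·X_rand − X(t)|, with A = 2a·r − a and C = 2r for random r in [0, 1]. A minimal per-dimension sketch, assuming a continuous solution encoding:

```python
import random

def woa_exploration_step(x, x_rand, a):
    """One WOA exploration move for solution `x` relative to a randomly
    chosen agent `x_rand`. Parameter `a` decreases linearly from 2 to 0
    over the iterations; |A| >= 1 triggers this exploration branch."""
    new_x = []
    for xi, xr in zip(x, x_rand):
        r = random.random()
        A = 2 * a * r - a        # A = 2a*r - a
        C = 2 * random.random()  # C = 2r
        new_x.append(xr - A * abs(C * xr - xi))
    return new_x
```

Because the reference point is a random agent rather than the incumbent best, consecutive steps scatter the population more widely, which is the exploration behavior the snippet credits WOA with.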
“…Hence, scholars have made several improvements to the basic version of WOA to overcome its limitations and employed it to solve various optimization problems. For instance, [35] proposed an improved version of WOA based on natural selection operators and applied it as a wrapper feature selection method for software fault prediction. Mafarja and Mirjalili [17] combined WOA with the simulated annealing (SA) algorithm to enhance its exploitation ability and applied their enhanced WOA-based approach to feature selection.…”
Section: Introduction
mentioning
confidence: 99%
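In the wrapper feature selection setting mentioned above, each candidate feature subset is scored by training a classifier on it. A fitness function commonly used in this literature (the weighting constant 0.99 is an assumed convention, not taken from the cited papers) balances classification error against subset size:

```python
def wrapper_fitness(error_rate, n_selected, n_total, alpha=0.99):
    """Typical wrapper-FS fitness (lower is better): a weighted sum of the
    classifier's error rate and the fraction of features kept, so the search
    prefers accurate models that use few features."""
    return alpha * error_rate + (1 - alpha) * (n_selected / n_total)
```

The optimizer (here, the enhanced WOA) minimizes this value over binary feature masks; alpha close to 1 makes accuracy dominate, with feature count as a tie-breaker.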
“…Dataset (VDS) and Invulnerable Dataset (IDS). Performance enhancement of the presented framework is estimated against a state-of-the-art classification model; two baseline classifier models are also considered for this comparative analysis. Specifically, the suggested LBN model is compared with the Bayesian Belief Network (BBN), Decision Tree (DT), and K-Nearest Neighbor (KNN) models [47]. After experimental simulation, the outcomes of the presented model under different techniques are tabulated in Table 6.…”
mentioning
confidence: 99%