2019
DOI: 10.23940/ijpe.19.08.p4.20492061

Proposed Hybrid Approach to Predict Software Fault Detection

Abstract: The major challenge is to validate a software failure dataset by finding the unknown model parameters used. Previously, many attempts at software assurance were made using classical classifiers such as Decision Tree, Naïve Bayes, and k-NN for software fault prediction. However, the accuracy of fault prediction is very low, since defect-prone modules are very few compared with defect-free modules. So, to solve the module fault classification problem and enhance reliability accuracy, a hybrid algorithm based on Particle S…

Cited by 4 publications (4 citation statements)
References 17 publications
“…In order to produce the optimal feature subset, PSO ends when the requirements are satisfied. PSO position and velocity variations are derived from basic formulas (2) and (3) [31].…”
Section: Feature Selection 1 PSO Feature Selection
confidence: 99%
“…On the other hand, the second formula explains how the velocity of the particle at each time step is updated by considering the contributions from the personal best position (Pbest) and the global best position (Gbest) that the particle itself and the entire population have achieved, respectively [31]. The PSO algorithm's performance is tuned for optimal problem solving by adjusting the acceleration coefficients (c1 and c2) and the randomization terms (r1 and r2) [32].…”
Section: Feature Selection 1 PSO Feature Selection
confidence: 99%
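The update described in this citation statement can be sketched in a few lines. This is a minimal illustration of the standard PSO step, not the cited paper's implementation; the inertia weight w and all numeric values below are assumptions of the example.

```python
import numpy as np

# Sketch of the PSO update described above: the velocity combines inertia,
# a cognitive pull toward the particle's personal best (Pbest), and a social
# pull toward the swarm's global best (Gbest), scaled by the acceleration
# coefficients c1, c2 and the random factors r1, r2.
def pso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=None):
    rng = np.random.default_rng(0) if rng is None else rng
    r1 = rng.random(x.shape)
    r2 = rng.random(x.shape)
    # Velocity update: inertia term + cognitive pull + social pull
    v_new = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    # Position update: move the particle along its new velocity
    x_new = x + v_new
    return x_new, v_new

# One update step for a single 3-dimensional particle
x, v = pso_step(x=np.zeros(3), v=np.zeros(3),
                pbest=np.ones(3), gbest=np.full(3, 2.0))
```

Tuning c1 relative to c2 shifts the balance between exploiting each particle's own history and following the swarm, which is the adjustment the statement refers to.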
“…PSO feature selection is employed to find the optimal feature subset, particularly effective for datasets with noisy attributes [30], [31]. In PSO, a population of particles represents feature subsets in a binary manner (1 for inclusion, 0 for exclusion).…”
Section: Classification With Particle Swarm Optimization Feature Sele…
confidence: 99%
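The binary encoding described above (1 for inclusion, 0 for exclusion) can be illustrated directly. The swarm size, feature count, and seed here are arbitrary choices for the example, not values from the cited work.

```python
import numpy as np

# Each particle is a 0/1 mask over the feature columns: 1 includes the
# feature in the candidate subset, 0 excludes it.
rng = np.random.default_rng(42)
n_particles, n_features = 5, 8

# Each row of the swarm is one particle, i.e. one candidate feature subset
swarm = rng.integers(0, 2, size=(n_particles, n_features))

def selected_features(particle):
    # Column indices the particle's mask marks for inclusion
    return np.flatnonzero(particle)

subsets = [selected_features(p).tolist() for p in swarm]
```

A fitness function (e.g. classifier accuracy on the masked columns) would then score each subset, driving the Pbest/Gbest updates in later iterations.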
“…Particles adapt to personal best results (Pbest) and the overall best in the population (Gbest) during iterations, with PSO terminating when criteria are met to yield the best feature subset. PSO's position and velocity changes derive from basic formulas (2) and (3) [10], [31]:…”
Section: Classification With Particle Swarm Optimization Feature Sele…
confidence: 99%
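The statement above ends where formulas (2) and (3) were cut off. Its description of Pbest, Gbest, c1/c2, and r1/r2 matches the textbook PSO updates, which in standard notation read as follows (this reconstruction is an assumption, not necessarily the cited paper's exact notation):

```latex
% Velocity update (formula 2, standard form)
v_i^{t+1} = w\, v_i^{t} + c_1 r_1 \left( Pbest_i - x_i^{t} \right) + c_2 r_2 \left( Gbest - x_i^{t} \right)

% Position update (formula 3, standard form)
x_i^{t+1} = x_i^{t} + v_i^{t+1}
```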