2020
DOI: 10.1109/access.2020.2968362

Hyper-Parameter Optimization of Classifiers, Using an Artificial Immune Network and Its Application to Software Bug Prediction

Abstract: Software testing is an important activity in software development, and it consumes most of the project resources, namely time, cost, and effort. To reduce this burden, software bug prediction (SBP) models are applied to improve software quality assurance (SQA) processes by predicting buggy components. Bug prediction models use machine learning classifiers to predict bugs in software components from software metrics. These classifiers are characterized by some configurable parameters,…
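As a rough illustration of what "configurable parameters" means here, the sketch below tunes a decision-tree classifier with a plain grid search. It is only a minimal stand-in, not the artificial-immune-network optimizer proposed in the paper; the synthetic dataset, parameter grid, and F1 scoring are assumptions.

```python
# Minimal sketch: hyper-parameter tuning of a bug-prediction classifier.
# Grid search stands in for the paper's artificial-immune-network optimizer;
# the synthetic data and parameter grid below are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for a software-metrics dataset (buggy vs. clean modules).
X, y = make_classification(n_samples=500, n_features=20, weights=[0.8, 0.2],
                           random_state=0)

# The classifier's configurable (hyper-)parameters to be optimized.
param_grid = {
    "max_depth": [3, 5, 10, None],
    "min_samples_leaf": [1, 5, 10],
    "criterion": ["gini", "entropy"],
}

search = GridSearchCV(DecisionTreeClassifier(random_state=0),
                      param_grid, scoring="f1", cv=5)
search.fit(X, y)
print("best parameters:", search.best_params_)
print("best CV F1 score:", round(search.best_score_, 3))
```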

Cited by 42 publications (23 citation statements)
References 28 publications
“…Also, numerous benchmark functions, namely Sphere, Himmelblau, Vincent, Griewank, Discus, Six-hump camel's back and Rastrigin (F1(H) to F7(H)), consisting of variable values, global optimal range, dimensional ranges, as well as local optimal value, are listed in Table 3. The proposed GM‑NAINO algorithm is evaluated against various other optimization algorithms such as quasi-opposition-based modified Levy flight distribution (QMLFD), the artificial immune network optimization (AINO) algorithm, 48 the black widow optimization (BWO) algorithm, 49 and the cuckoo search (CS) optimization algorithm, 50 as represented in Table 4. Here we utilized three different metrics, namely the SD, the best value and the average value, to determine the best result.…”
Section: Results
confidence: 99%
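For context, the sketch below defines two of the named benchmark functions (Sphere and Rastrigin) and reports the best, average, and standard-deviation statistics over repeated runs of a plain random search. It only illustrates the evaluation protocol, not the GM-NAINO, QMLFD, AINO, BWO, or CS algorithms themselves; the dimension, bounds, and evaluation budget are assumed values.

```python
# Illustrative only: Sphere and Rastrigin benchmarks evaluated with a plain
# random search; the best/average/SD reporting mirrors the comparison metrics
# mentioned above. Dimension, bounds, and budget are assumed values.
import numpy as np

def sphere(x):
    return float(np.sum(x ** 2))

def rastrigin(x):
    return float(10 * x.size + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x)))

def random_search(f, dim=10, bounds=(-5.12, 5.12), evals=5000, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    samples = rng.uniform(bounds[0], bounds[1], size=(evals, dim))
    return min(f(x) for x in samples)

for name, f in [("Sphere", sphere), ("Rastrigin", rastrigin)]:
    results = [random_search(f, rng=np.random.default_rng(seed))
               for seed in range(20)]
    print(f"{name}: best={min(results):.4g} "
          f"avg={np.mean(results):.4g} SD={np.std(results):.4g}")
```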
“…The proposed model has an overall 19% improvement in accuracy in comparison with state-of-the-art techniques. Figure 9 shows the performance comparison of different machine learning approaches with the proposed approach on the edge layer of the ToN-IoT dataset [57, 60-64]. The proposed model integrates modified Tomek-link under-sampling and hyper-parameter tuning using the aiNet approach for the supervised machine learning classifiers, which enhances the overall performance on both datasets.…”
Section: Discussion
confidence: 99%
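The Tomek-link under-sampling step referenced above can be illustrated with the standard resampler from imbalanced-learn. Note this is the plain TomekLinks implementation, not the modified variant described in the citing work, and the synthetic imbalanced data is an assumption.

```python
# Standard Tomek-link under-sampling (imbalanced-learn); the citing work uses
# a *modified* variant, so this only illustrates the basic idea: drop majority
# samples that form Tomek links with minority samples to clean the boundary.
from collections import Counter

from imblearn.under_sampling import TomekLinks
from sklearn.datasets import make_classification

# Imbalanced synthetic data standing in for the ToN-IoT features.
X, y = make_classification(n_samples=2000, n_features=20, weights=[0.9, 0.1],
                           random_state=0)

tl = TomekLinks()                     # removes majority members of Tomek links
X_res, y_res = tl.fit_resample(X, y)

print("before:", Counter(y))
print("after :", Counter(y_res))
```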
“…An optimization variant of aiNet, Opt-aiNet, is available for use in optimization problems [60]. Opt-aiNet grows a population consisting of a network of antibodies, and the population size can be adjusted dynamically.…”
Section: Artificial Immune Network (aiNet) for Hyperparameter Optimization
confidence: 99%
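A compact sketch of the Opt-aiNet loop described above is given below: each antibody is cloned, clones undergo fitness-proportional (affinity-proportional) mutation, similar network cells are suppressed, and random newcomers keep the population size dynamic. The toy objective, mutation scale, suppression threshold, and generation count are assumed values, not those of the cited implementations.

```python
# Minimal Opt-aiNet-style sketch (assumed parameters, toy objective):
# clone each antibody, mutate clones with fitness-proportional intensity,
# keep the best clone per parent, suppress similar cells, then inject random
# newcomers so the network/population size adjusts dynamically.
import numpy as np

rng = np.random.default_rng(0)

def fitness(x):                      # toy objective: maximize -Sphere
    return -float(np.sum(x ** 2))

def opt_ainet(dim=2, pop=10, clones=5, beta=1.0, sigma_s=0.2,
              newcomers=2, generations=50, bounds=(-5.0, 5.0)):
    cells = rng.uniform(*bounds, size=(pop, dim))
    for _ in range(generations):
        fit = np.array([fitness(c) for c in cells])
        # Normalize fitness to [0, 1] to drive affinity-proportional mutation.
        norm = (fit - fit.min()) / (fit.max() - fit.min() + 1e-12)
        survivors = []
        for cell, f_norm in zip(cells, norm):
            # Higher-fitness cells mutate less (smaller alpha).
            alpha = (1.0 / beta) * np.exp(-f_norm)
            clone_set = cell + alpha * rng.normal(size=(clones, dim))
            clone_set = np.vstack([cell, clone_set])
            survivors.append(max(clone_set, key=fitness))
        cells = np.array(survivors)
        # Network suppression: drop cells too close to a strictly better cell.
        keep = []
        for i, c in enumerate(cells):
            dominated = any(np.linalg.norm(c - cells[j]) < sigma_s and
                            fitness(cells[j]) > fitness(c)
                            for j in range(len(cells)) if j != i)
            if not dominated:
                keep.append(c)
        cells = np.array(keep)
        # Inject random newcomers, so the population size varies dynamically.
        cells = np.vstack([cells, rng.uniform(*bounds, size=(newcomers, dim))])
    return max(cells, key=fitness)

best = opt_ainet()
print("best cell:", best, "fitness:", fitness(best))
```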
“…In addition to this, numerous test functions, namely Sphere, Himmelblau, Vincent, Griewank, Discus, Six-hump camel's back and Rastrigin (F1(H) to F7(H)), consisting of variable values, global optimal value, dimensional ranges as well as local optimal value, are listed in Table 3. The proposed GM-NAINO algorithm is evaluated against various other optimization algorithms such as quasi-opposition-based Modified Levy Flight Distribution (QMLFD), the Artificial Immune Network Optimization (AINO) algorithm [28], the Black Widow Optimization (BWO) algorithm [29] and the Cuckoo Search (CS) optimization algorithm [30], as represented in Table 4. Here we utilized three different metrics, namely the standard deviation, the best value and the average value, to determine the best result.…”
Section: Optimization Performances
confidence: 99%