2022
DOI: 10.1016/j.matpr.2021.08.097

African buffalo optimized multinomial softmax regression based convolutional deep neural network for software fault prediction

Cited by 5 publications (2 citation statements); references 15 publications.
“…Using different AFs and optimizers, four AFs were considered for the hidden layer: Swish [78, 79, 80], Tanh [81, 82, 83], Elu [84, 85, 86], and Sigmoid [87, 88, 89]. Two AFs were considered for the output layer: Sigmoid [79, 81, 88, 89] and Softmax [90, 91]. Lastly, three optimizers were used, namely Adam [92, 93, 94], RMSProp [90, 95, 96], and SGD [97, 98].…”
Section: Methods (mentioning)
confidence: 99%
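The citation statement above describes a configuration space of hidden-layer activations, output-layer activations, and optimizers. A minimal sketch of that grid, assuming the combinations are evaluated exhaustively (the function forms are the standard mathematical definitions; the grid structure itself is an illustrative assumption, not the cited study's exact protocol):

```python
import math
from itertools import product

# Standard definitions of the activations named in the statement.
def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def swish(x):
    # Swish(x) = x * sigmoid(x)
    return x * sigmoid(x)

def elu(x, alpha=1.0):
    return x if x >= 0 else alpha * (math.exp(x) - 1.0)

# Configuration space from the citation statement.
hidden_afs = ["swish", "tanh", "elu", "sigmoid"]
output_afs = ["sigmoid", "softmax"]
optimizers = ["adam", "rmsprop", "sgd"]

# Enumerating all combinations gives the full experiment grid.
grid = list(product(hidden_afs, output_afs, optimizers))
print(len(grid))  # 4 * 2 * 3 = 24 configurations
```

Enumerating the grid this way makes the size of the comparison explicit: four hidden activations, two output activations, and three optimizers yield 24 candidate configurations.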
“…Studies related to the software fault prediction area are summarized in this section. Saravanan et al. (2021) proposed an African buffalo optimizer based convolutional neural network for fast training in software fault prediction. Kassaymeh et al. (2021) used a salp swarm optimizer for neural network training instead of backpropagation.…”
Section: Related Work (mentioning)
confidence: 99%