Data Science and Knowledge Engineering for Sensing Decision Support 2018
DOI: 10.1142/9789813273238_0170
Software fault prediction using data reduction approaches

Cited by 3 publications (2019–2023), with 1 citation statement · References: 0 publications
“…Naive Bayes was the main classifier in [13], logistic regression was used in [7], a J48 decision tree in [14], random forest in [21], Bayesian networks in [15,16], multiple kernel ensemble learning in [22], deep learning in [23], and semi-supervised deep fuzzy clustering in [17]. More recent works applied and proposed further methods for software fault prediction, such as Bayesian networks [15,16], deep learning [24], semi-supervised deep fuzzy clustering [17], fault prediction after removing irrelevant and redundant features and retaining reliable ones [25,26], back-propagation neural networks [27], genetic algorithms combined with a deep neural network (DNN) in [23] and with a back-propagation learning algorithm in [28], multiple kernel ensemble learning [22], and non-negative sparse graph-based label propagation [29]. In our recent work [20], we explored the effect of g-lasso regression on software fault prediction.…”
Section: Related Work
confidence: 99%
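The approaches surveyed in the citation statement share a common shape: a data reduction step (dropping irrelevant or redundant features) followed by a standard classifier. The sketch below is a minimal illustration of that shape only, not the pipeline of the cited paper or of any referenced work; it assumes scikit-learn, uses a synthetic stand-in for a software-metrics dataset, and pairs L1-regularized feature selection with Naive Bayes, one of the classifiers mentioned above.

```python
# Hypothetical sketch: feature reduction followed by a Naive Bayes fault classifier.
# Illustrative only; the dataset and the specific reduction method are assumptions.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

# Synthetic stand-in for a software-metrics dataset (rows = modules, columns = metrics,
# label = faulty / not faulty), imbalanced as fault data typically is.
X, y = make_classification(n_samples=500, n_features=40, n_informative=8,
                           weights=[0.8, 0.2], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

# Data reduction step: keep only features with non-zero L1-regularized coefficients.
selector = SelectFromModel(
    LogisticRegression(penalty="l1", solver="liblinear", C=0.1))
X_train_red = selector.fit_transform(X_train, y_train)
X_test_red = selector.transform(X_test)

# Classification step on the reduced feature set.
clf = GaussianNB().fit(X_train_red, y_train)
print("Selected features:", X_train_red.shape[1])
print("F1 on reduced features:", f1_score(y_test, clf.predict(X_test_red)))
```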