2021
DOI: 10.1007/s11334-020-00379-y

A pragmatic ensemble learning approach for effective software effort estimation

Cited by 25 publications (15 citation statements)
References 31 publications
“…After that, the proposed MoWE model is compared with other heterogeneous ensemble methods found in the literature, including gradient boosting [75], stacking [76], majority voting, and a weighted ensemble (with non-optimized weights) [30]. Further, a performance comparison of our proposed MoMdbWE technique is made with previous EEE studies (Table 11).…”
Section: Methods
confidence: 99%
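The comparison described above pits a weighted ensemble against gradient boosting, stacking, and plain voting baselines. Below is a minimal sketch of that kind of heterogeneous-ensemble comparison, assuming scikit-learn estimators and synthetic placeholder data; it is not the MoWE/MoMdbWE models or the datasets of the cited study.

```python
# Hypothetical sketch: comparing heterogeneous ensemble regressors for effort
# estimation, in the spirit of the gradient boosting / stacking / weighted
# ensemble baselines mentioned above. Data and feature names are placeholders.
import numpy as np
from sklearn.ensemble import (GradientBoostingRegressor, RandomForestRegressor,
                              StackingRegressor, VotingRegressor)
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Placeholder effort-estimation data: X holds project attributes, y the effort.
rng = np.random.default_rng(0)
X = rng.random((200, 5))
y = X @ np.array([3.0, 1.5, 0.5, 2.0, 0.8]) + rng.normal(0, 0.1, 200)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

base_learners = [
    ("gbr", GradientBoostingRegressor(random_state=0)),
    ("rf", RandomForestRegressor(random_state=0)),
    ("lr", LinearRegression()),
]

models = {
    "gradient boosting": GradientBoostingRegressor(random_state=0),
    "stacking": StackingRegressor(estimators=base_learners,
                                  final_estimator=LinearRegression()),
    # VotingRegressor averages predictions; `weights` gives a (non-optimized)
    # weighted ensemble. Majority voting has no direct regression analogue,
    # so unweighted averaging would stand in for it here.
    "weighted ensemble": VotingRegressor(estimators=base_learners,
                                         weights=[0.5, 0.3, 0.2]),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: R^2 = {r2_score(y_test, model.predict(X_test)):.3f}")
```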
“…However, it is highly subjective and may be affected by biases or limitations in the knowledge and experience of the experts involved. Machine learning algorithms have also been used for cost estimation in software development, with promising results (Kumar et al., 2021; Zhao & Zhang, 2020; Panda & Majhi, 2020; Promise Software Engineering Repository, n.d.). These algorithms can analyze large amounts of data and identify patterns and relationships that are not easily discernible through other approaches.…”
Section: Related Work
confidence: 99%
“…Hidmi and Sakar [34] used ensemble learning by combining SVM and kNN models to estimate effort on the Desharnais and Maxwell data; the authors validated the results and found that kNN outperformed the other model and that ensemble learning improved the results. Suresh Kumar et al. [35] used gradient boosting, an ensemble learning algorithm, for effort estimation on the COCOMO81 and China datasets, and obtained R² values of 98% and 93%, respectively.…”
Section: Related Work
confidence: 99%
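The studies summarised above combine SVM and kNN learners into a single ensemble and report R² as the evaluation metric. A minimal sketch of such an SVM + kNN ensemble with cross-validated R² follows, assuming scikit-learn and synthetic placeholder data; the kernel, neighbour count, and features are illustrative assumptions, not the settings of Hidmi and Sakar or Suresh Kumar et al.

```python
# Hypothetical sketch of the kind of SVM + kNN ensemble described above:
# both learners are fit on the same project features and their predictions
# are averaged. Data and hyperparameters are placeholder choices.
import numpy as np
from sklearn.ensemble import VotingRegressor
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR

rng = np.random.default_rng(1)
X = rng.random((150, 4))                                       # placeholder project attributes
y = 2.0 * X[:, 0] + X[:, 1] ** 2 + rng.normal(0, 0.05, 150)    # placeholder effort values

ensemble = VotingRegressor([
    ("svm", SVR(kernel="rbf", C=10.0)),
    ("knn", KNeighborsRegressor(n_neighbors=5)),
])

# R^2 via cross-validation, mirroring the R^2-based comparison in the text.
scores = cross_val_score(ensemble, X, y, cv=5, scoring="r2")
print(f"mean R^2 across folds: {scores.mean():.3f}")
```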