2019
DOI: 10.11591/ijai.v8.i4.pp399-410

Improving Software Development effort estimating using Support Vector Regression and Feature Selection

Abstract: Accurate and reliable software development effort estimation (SDEE) is one of the main concerns for project managers. Planning and scheduling a software project using an inaccurate estimate may expose the project under development to severe risks such as delayed delivery, poor-quality software, and missing features. Therefore, accurate prediction of software effort plays an important role in minimizing these risks, which can otherwise lead to project failure. Nowadays, application of artificial intelligence te…
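As a rough illustration of the approach the abstract describes, the sketch below combines a feature selection step with support vector regression using scikit-learn. It is a minimal sketch under assumptions, not the authors' implementation: the synthetic project data, the choice of k = 5 retained features, and the SVR hyperparameters are all placeholders.

```python
# Hypothetical sketch of SDEE with feature selection + SVR.
# The data, k, and hyperparameters are assumptions, not the paper's setup.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Placeholder project data: rows are past projects, columns are cost drivers,
# y is the actual effort (e.g., person-hours).
rng = np.random.default_rng(0)
X = rng.random((60, 15))
y = 100 + 500 * X[:, 0] + 200 * X[:, 1] + rng.normal(0, 20, 60)

# Feature selection feeds only the k most relevant drivers to the SVR model.
pipeline = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectKBest(score_func=f_regression, k=5)),
    ("svr", SVR(kernel="rbf", C=10.0, epsilon=0.1)),
])

# Mean absolute error across folds gives a rough view of estimation accuracy.
scores = cross_val_score(pipeline, X, y, cv=5, scoring="neg_mean_absolute_error")
print(f"Mean absolute error: {-scores.mean():.1f}")
```

In practice the cut-off k and the SVR parameters would be tuned, for example with a grid search over the same pipeline.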


Cited by 16 publications (22 citation statements). References 46 publications.
“…As we can see from our previous results, support vector machine (SVM) and KNeighbors classifiers achieved good accuracy when we reduced the number of features from 173 to 20. Since SVM and KNeighbors classifiers are computationally expensive due to the use of quadratic programming and require more time to execute classification [34,35], reducing features helps to reduce computation and improve accuracy as well [37]. However, when we used probabilistic classifiers such as Naïve Bayes, accuracy also improved, reaching 90.4% with 10 features.…”
Section: Discussion (mentioning)
confidence: 99%
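The statement above describes cutting a 173-attribute feature set down to 20 before training SVM and KNeighbors classifiers. The following minimal sketch shows that pattern with scikit-learn; the synthetic dataset, the mutual-information selector, and the classifier settings are assumptions, not the citing study's actual setup.

```python
# Illustrative sketch (not the citing paper's code): shrink a large feature
# set before fitting SVM and KNeighbors classifiers to cut computation.
# The feature counts (173 -> 20) mirror the numbers quoted above.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=173, n_informative=20,
                           random_state=0)

for name, clf in [("SVM", SVC(kernel="rbf")), ("KNeighbors", KNeighborsClassifier())]:
    # Keep only the 20 highest-scoring features before the expensive classifier.
    model = make_pipeline(StandardScaler(),
                          SelectKBest(mutual_info_classif, k=20),
                          clf)
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name}: mean accuracy {acc:.3f}")
```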
“…These changes will affect the actual plan and schedule of the project estimate, leading to an inaccurate plan that causes a delay in the actual delivery date. Such a negative impact can cause critical risks [24].…”
Section: No Changing Accepted In Working Sprints (mentioning)
confidence: 99%
“…After this study's experiment, we used three methods to discover the most influential attributes, namely, correlation-based, information-gain, and relief. While the correlation-based feature selection is a greedy search method, the others are rank-based search methods [22]. By using these methods, we have identified ten attributes as the most influential features.…”
Section: Feature Selection Algorithms (mentioning)
confidence: 99%
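The passage above names three feature selection methods: correlation-based, information gain, and relief. The sketch below ranks attributes with two scikit-learn stand-ins, absolute Pearson correlation and mutual information (an information-gain style score); Relief-style ranking is not in scikit-learn and would need a third-party package such as skrebate. The dataset and the top-10 cut-off are illustrative assumptions.

```python
# Minimal sketch of two of the three attribute-ranking ideas mentioned above.
# Correlation and mutual information are used as stand-ins; the dataset and
# the "top 10" cut-off are assumptions, not the citing study's configuration.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif

X, y = make_classification(n_samples=300, n_features=30, n_informative=8,
                           random_state=0)

# Rank by absolute Pearson correlation between each attribute and the class.
corr_scores = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
top_by_corr = np.argsort(corr_scores)[::-1][:10]

# Rank by mutual information (an information-gain style criterion).
mi_scores = mutual_info_classif(X, y, random_state=0)
top_by_mi = np.argsort(mi_scores)[::-1][:10]

print("Top 10 by correlation:      ", sorted(top_by_corr))
print("Top 10 by information gain: ", sorted(top_by_mi))
```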