Proceedings of the 16th ACM International Conference on Predictive Models and Data Analytics in Software Engineering 2020
DOI: 10.1145/3416508.3417121
Evaluating hyper-parameter tuning using random search in support vector machines for software effort estimation

Cited by 24 publications (14 citation statements) | References 36 publications
“…1) Support Vector Regression (SVR) SVR has been widely studied by researchers in the area of SDEE [24], [50], [51]. It is a machine learning technique that maps non-linearly separable patterns in the data into a higher-dimensional feature space, with the aim of minimizing the loss function while maximizing the support vector bounds.…”
Section: Effort Estimation Methods (mentioning)
confidence: 99%
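As a rough illustration of that description, the sketch below fits an RBF-kernel SVR to synthetic project data using scikit-learn; the dataset, features, and hyper-parameter values are hypothetical assumptions and are not taken from the cited papers.

```python
# Minimal SVR sketch for effort estimation (illustrative only; synthetic data).
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical toy data: rows are projects, columns are size/complexity features.
rng = np.random.default_rng(0)
X = rng.random((60, 4))
y = 10 * X[:, 0] + 5 * X[:, 1] ** 2 + rng.normal(0, 0.5, 60)  # effort in person-months

# The RBF kernel implicitly maps the data into a higher-dimensional feature space;
# C and epsilon control the trade-off between model flatness and training error.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X, y)
predicted_effort = model.predict(X[:5])
print(predicted_effort)
```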
“…COCOMO is still in use; researchers and practitioners extend and modify it with different concepts to introduce new effort estimation techniques, as described in [14][15][16][17]. All of these papers were published by IEEE.…”
Section: COCOMO I (mentioning)
confidence: 99%
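For reference, the basic COCOMO I effort equation has the form shown below. This is a restatement of the standard model, not a formula taken from the cited papers; the coefficients a and b depend on the project mode, and KLOC is the estimated size in thousands of lines of code.

```latex
% Basic COCOMO I effort equation; a and b are mode-dependent coefficients
% (e.g. a = 2.4, b = 1.05 for the organic mode in the standard model).
\mathrm{Effort} = a \cdot (\mathrm{KLOC})^{\,b} \quad \text{person-months}
```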
“…Most of the previous work on hyperparameter tuning tends to focus only on Grid Search and Random Search, or on a comparison between them [9][10][11][12][13]. A comprehensive study of tunability is given in [14]; the authors framed the tuning problem in statistical terms and proposed measures for quantifying the tunability of an algorithm's hyperparameters.…”
Section: Hyperparameter Tuning (mentioning)
confidence: 99%
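The contrast between the two strategies can be sketched with scikit-learn's GridSearchCV and RandomizedSearchCV, as below; the SVR estimator, parameter ranges, and search budget are illustrative assumptions, not the settings used in the paper under discussion.

```python
# Sketch contrasting Grid Search and Random Search for tuning SVR hyper-parameters.
import numpy as np
from scipy.stats import loguniform
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV

rng = np.random.default_rng(1)
X = rng.random((60, 4))
y = 10 * X[:, 0] + rng.normal(0, 0.5, 60)

# Grid Search: exhaustively evaluates every combination on a fixed grid.
grid = GridSearchCV(
    SVR(kernel="rbf"),
    param_grid={"C": [1, 10, 100], "epsilon": [0.01, 0.1, 1.0]},
    cv=3, scoring="neg_mean_absolute_error",
)
grid.fit(X, y)

# Random Search: samples a fixed budget of configurations from distributions,
# which typically covers continuous ranges more efficiently than a coarse grid.
rand = RandomizedSearchCV(
    SVR(kernel="rbf"),
    param_distributions={"C": loguniform(1e-1, 1e3), "epsilon": loguniform(1e-3, 1e0)},
    n_iter=20, cv=3, scoring="neg_mean_absolute_error", random_state=0,
)
rand.fit(X, y)
print(grid.best_params_, rand.best_params_)
```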