2014
DOI: 10.1179/1939787914y.0000000061
Prediction of compaction parameters of coarse grained soil using multivariate adaptive regression splines (MARS)

Cited by 45 publications (17 citation statements)
References 13 publications
“…In the case of SVM, Han and Jiang [142] performed a thorough analysis with different kernels and found that Gaussian kernels can run into overfitting because of the C parameter, which controls misclassification (i.e., a large C favors a smaller-margin hyperplane that minimizes misclassifications); the accuracy forced in this way also raises the chance of overfitting. MARS was studied by Khuntia et al. [143], who found that its two-step prediction process, and especially the second step of backward variable selection (pruning), which removes the variables with the least contribution, helps to control overfitting. To sum up, besides the risk of overfitting, which grows as more input variables are applied, all the studies cited above agreed on the relevance of repeated k-fold cross-validation (RKCV), or simple KCV, as an important element both for fine-tuning ML algorithms to find the best hyperparameters and for minimizing overfitting.…”
Section: Discussion
confidence: 99%
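The RKCV scheme invoked above can be sketched in plain Python. This is a minimal illustration only; `repeated_kfold` and its parameters are illustrative names, not code from any of the cited studies:

```python
import random

def repeated_kfold(n_samples, k=5, repeats=3, seed=0):
    """Yield (train_idx, test_idx) pairs for repeated k-fold CV.

    Each repeat reshuffles the sample indices, then partitions them
    into k disjoint test folds; every sample appears in exactly one
    test fold per repeat.
    """
    rng = random.Random(seed)
    indices = list(range(n_samples))
    for _ in range(repeats):
        rng.shuffle(indices)
        # Distribute the remainder so fold sizes differ by at most 1.
        fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0)
                      for i in range(k)]
        start = 0
        for size in fold_sizes:
            test = indices[start:start + size]
            train = indices[:start] + indices[start + size:]
            yield train, test
            start += size

# A hyperparameter search would score each candidate (e.g., an SVM C
# value or a MARS pruning level) on every split and keep the best mean.
splits = list(repeated_kfold(20, k=5, repeats=3))
print(len(splits))  # 15 splits: 3 repeats x 5 folds
```

Averaging over repeated reshuffles is what reduces the variance of the fold assignment, which is why the cited studies prefer RKCV over a single KCV run.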
“…The number of samples per group was calculated using the “clinical c” software. The sample size for each group was calculated using the mean and standard deviation values obtained from the study by Khuntia et al. (2015). The significance threshold, power, and confidence interval were 0.05, 80%, and 95%, respectively, and the required sample size was estimated to be 17.…”
Section: Methods
confidence: 99%
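The per-group sample-size calculation described above follows the standard two-sample formula. A minimal sketch using the normal approximation; the effect size `d = 1.0` below is hypothetical (the study derived Cohen's d from the mean and SD values of Khuntia et al. 2015, which are not given here):

```python
import math
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Per-group sample size for a two-sample comparison of means.

    Standard normal approximation:
        n = 2 * ((z_{1-alpha/2} + z_{power}) / d)^2
    where d is Cohen's d (difference in means over pooled SD).
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided alpha
    z_beta = NormalDist().inv_cdf(power)
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

print(n_per_group(1.0))  # 16 per group at alpha=0.05, power=80%
```

The estimate of 17 in the quoted study implies an effect size close to 1 under these conventional settings; smaller effects drive the requirement up quickly (e.g., `d = 0.5` roughly quadruples it).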
“…This strategy makes the MARS method more advantageous and flexible than other statistical methods in multivariate modeling studies [53]. More details about MARS and its implementation can be found in [54–56].…”
Section: Multivariate Adaptive Regression Splines (MARS) Methods
confidence: 99%
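For reference, MARS builds its models from hinge basis functions: the forward pass adds mirrored hinge pairs at candidate knots, and the backward pruning pass (mentioned in the Discussion excerpt above) removes the least-contributing terms. A minimal sketch of the basis functions; the coefficients and knots in `mars_predict` are purely illustrative:

```python
def hinge(x, knot):
    """Right hinge basis function: max(0, x - knot)."""
    return max(0.0, x - knot)

def mirror_hinge(x, knot):
    """Left hinge basis function: max(0, knot - x)."""
    return max(0.0, knot - x)

def mars_predict(x, intercept=1.0, terms=((0.5, 2.0), (-0.3, 5.0))):
    """A fitted MARS model is a weighted sum of hinge terms
    (and products of hinges, for interaction effects).

    terms: (coefficient, knot) pairs for right hinges -- hypothetical
    values, not a model fitted to any real data.
    """
    return intercept + sum(c * hinge(x, knot) for c, knot in terms)

print(mars_predict(1.0))  # 1.0: below both knots, all hinges are zero
print(mars_predict(4.0))  # 2.0: 1.0 + 0.5 * (4.0 - 2.0)
```

Because each hinge is zero on one side of its knot, the model is piecewise linear, which is what lets pruning remove a term without disturbing the fit elsewhere.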