1994
DOI: 10.1016/b978-0-444-81892-8.50040-7

Comparative study of techniques for large-scale feature selection*

*This work was supported by SERC grant GR/E 97549. The first author was also supported by an FPI grant from the Spanish MEC, PF92 73546684.

Cited by 175 publications (54 citation statements), 1995–2023
References 1 publication
“…We used a two-way cross-validation on the validation set for hyper-parameter tuning to compute these imaging features. A sequential feature selection (Ferri et al., 1994) was then conducted on the hybrid feature pool. Specifically, starting from an empty set, we picked one feature at a time from the remaining feature pool that minimized a validation loss.…”
Section: Methods · Citation type: mentioning · Confidence: 99%
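The loop this excerpt describes is plain greedy forward selection. A minimal sketch in Python, assuming a user-supplied `validation_loss(X_train, y_train, X_val, y_val)` callable and pre-split arrays (none of these names come from the cited papers):

```python
import numpy as np

def sequential_forward_selection(X_train, y_train, X_val, y_val,
                                 validation_loss, max_features=None):
    """Grow the feature set one feature at a time, always adding the
    candidate that minimizes the validation loss (greedy forward SFS)."""
    n_features = X_train.shape[1]
    budget = max_features or n_features
    selected, remaining = [], list(range(n_features))
    best_loss = np.inf
    while remaining and len(selected) < budget:
        # Score every remaining candidate when added to the current set.
        losses = {
            f: validation_loss(X_train[:, selected + [f]], y_train,
                               X_val[:, selected + [f]], y_val)
            for f in remaining
        }
        f_best = min(losses, key=losses.get)
        if losses[f_best] >= best_loss:
            break  # no candidate improves the loss; stop early
        best_loss = losses[f_best]
        selected.append(f_best)
        remaining.remove(f_best)
    return selected, best_loss
```

Stopping when no candidate improves the loss is one common criterion; a fixed feature budget, as the `max_features` argument allows, is another.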
“…The additional features are evaluated by recursively adding the quantity which produces the best variation of the R²-score with respect to the previous value. This technique, commonly known as Sequential Forward Selection (SFS; Ferri et al., 1994), represents the most powerful tool to investigate medium […] R is the score obtained using only the cross-phase as input features. Methods with lower performance (Kernel Ridge Regression and Support Vector Regression) have an improvement with additional features while, as the model accuracy increases, they become less relevant.…”
Section: Foldes et al. · Citation type: mentioning · Confidence: 99%
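scikit-learn packages the same forward strategy as `SequentialFeatureSelector`, scored here by cross-validated R² to mirror the excerpt. A sketch under assumed inputs (the synthetic data, the Kernel Ridge settings, and the five-feature budget are all placeholders, not values from the cited study):

```python
from sklearn.datasets import make_regression
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.kernel_ridge import KernelRidge

# Synthetic stand-in for the real inputs.
X, y = make_regression(n_samples=200, n_features=20, noise=0.1, random_state=0)

# Forward SFS, each step keeping the feature that most improves CV R².
sfs = SequentialFeatureSelector(
    KernelRidge(alpha=1.0),   # one of the model families the excerpt mentions
    n_features_to_select=5,   # assumed budget
    direction="forward",
    scoring="r2",
    cv=5,
)
sfs.fit(X, y)
print(sfs.get_support(indices=True))  # indices of the selected features
```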
“…First, a nonlinear regression tree model, which was made with the XGBoost framework (extreme gradient boosting) [27], was used to fit weekly new cases and COVID-19 growth rates. Next, sequential backward floating selection was iterated to train the XGBoost [28,29] model to obtain the final model by minimizing the mean squared error. Sequential backward floating selection is a sequential feature selection method based on a greedy search algorithm.…”
Section: Feature Selection and Feature Importance · Citation type: mentioning · Confidence: 99%
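Backward floating selection differs from plain backward elimination in that, after each removal, previously discarded features may be re-added if doing so improves the score. mlxtend implements this directly; a minimal sketch assuming an `XGBRegressor` with placeholder hyper-parameters and synthetic data (neither is from the cited paper):

```python
from mlxtend.feature_selection import SequentialFeatureSelector as SFS
from sklearn.datasets import make_regression
from xgboost import XGBRegressor

X, y = make_regression(n_samples=300, n_features=15, noise=0.5, random_state=0)

# forward=False, floating=True gives sequential backward floating selection,
# scored by cross-validated (negative) mean squared error as in the excerpt.
sbfs = SFS(
    XGBRegressor(n_estimators=100, max_depth=3),  # placeholder settings
    k_features=5,            # assumed target subset size
    forward=False,
    floating=True,
    scoring="neg_mean_squared_error",
    cv=5,
)
sbfs = sbfs.fit(X, y)
print(sbfs.k_feature_idx_)   # indices of the retained features
```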
“…As a result, a nonlinear method was needed to assess the contribution of correlated factors on the spread of COVID-19 in China. Machine learning methods are good for solving nonlinear problems, and we used XGBoost [28,30] to create a nonlinear regression tree model for this study. Important factors were selected based on the cross-validation procedure in the XGBoost framework.…”
Section: Multifactor Analysis Based on the Machine Learning Algorithm · Citation type: mentioning · Confidence: 99%
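The excerpt does not spell out its cross-validated importance procedure; one plausible reading is to average XGBoost's gain-based feature importances over the folds. A sketch under that assumption, with synthetic data and placeholder hyper-parameters:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import KFold
from xgboost import XGBRegressor

X, y = make_regression(n_samples=300, n_features=10, noise=0.5, random_state=0)

# Fit one model per fold and collect its gain-based importances.
importances = []
for train_idx, _ in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    model = XGBRegressor(n_estimators=200, max_depth=4, importance_type="gain")
    model.fit(X[train_idx], y[train_idx])
    importances.append(model.feature_importances_)

# Rank factors by their mean importance across folds.
ranking = np.argsort(np.mean(importances, axis=0))[::-1]
print(ranking)  # features ordered from most to least important
```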