2020
DOI: 10.3390/math8010110

Combination of Ensembles of Regularized Regression Models with Resampling-Based Lasso Feature Selection in High Dimensional Data

Abstract: In high-dimensional data, the performance of a classifier depends largely on the selection of important features. Most individual classifiers combined with existing feature selection (FS) methods do not perform well on highly correlated data. Obtaining important features with an FS method and selecting the best-performing classifier is a challenging task in high-throughput data. In this article, we propose a combination of resampling-based least absolute shrinkage and selection operator (LASSO)…
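The abstract's core idea, resampling-based LASSO feature selection, can be illustrated with a minimal sketch. This is a stability-selection-style interpretation, not the paper's exact procedure: the data are bootstrapped repeatedly, an L1-penalized logistic regression is fitted on each resample, and features selected in a sufficiently large fraction of resamples are retained. The function name resampled_lasso_selection, the threshold value, and the use of scikit-learn are illustrative assumptions.

# Minimal sketch (not the paper's exact algorithm) of resampling-based LASSO
# feature selection: bootstrap the data, fit an L1-penalized logistic
# regression on each resample, and keep features selected in at least a
# chosen fraction of resamples.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.utils import resample

def resampled_lasso_selection(X, y, n_resamples=100, C=0.1, threshold=0.6):
    """Return indices of features with non-zero LASSO coefficients in at
    least `threshold` of the bootstrap resamples, plus the frequencies."""
    n_samples, n_features = X.shape
    counts = np.zeros(n_features)
    for b in range(n_resamples):
        Xb, yb = resample(X, y, random_state=b)          # bootstrap resample
        lasso = LogisticRegression(penalty="l1", solver="liblinear",
                                   C=C, max_iter=2000).fit(Xb, yb)
        counts += np.abs(lasso.coef_.ravel()) > 1e-8     # selected this round?
    freq = counts / n_resamples
    return np.where(freq >= threshold)[0], freq

X, y = make_classification(n_samples=120, n_features=300, n_informative=10,
                           random_state=42)
kept, freq = resampled_lasso_selection(X, y)
print("features kept:", kept)

With highly correlated features, the selection frequency tends to spread across correlated groups; the paper's ensemble of regularized regression models addresses the downstream classification step, which this sketch does not cover.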

Cited by 18 publications (14 citation statements) · References 44 publications
“…The feature performance FP of all features across the m models is obtained to create a feature performance matrix FP_mat. Different metrics are used as FP in the literature, such as the coefficient estimate of a feature [15,16], goodness of fit [18], feature importance [28], and feature significance [21]. For illustration purposes, feature importance α is used as FP.…”
Section: Methods (mentioning)
confidence: 99%
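As a hedged illustration of the feature-performance matrix described in this excerpt, the sketch below fits m models on bootstrap resamples and stacks each model's feature importances into an m × p matrix FP_mat. The choice of random forests as the importance source, and all names other than FP_mat, are assumptions for illustration only.

# Illustrative only: build a feature-performance matrix FP_mat whose rows are
# m models and whose columns are p features, using feature importance as FP.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.utils import resample

X, y = make_classification(n_samples=150, n_features=50, n_informative=8,
                           random_state=1)
m, p = 10, X.shape[1]
FP_mat = np.zeros((m, p))                       # feature-performance matrix

for i in range(m):
    Xb, yb = resample(X, y, random_state=i)     # bootstrap resample for model i
    model = RandomForestClassifier(n_estimators=200, random_state=i).fit(Xb, yb)
    FP_mat[i, :] = model.feature_importances_   # importance used as FP for row i

print(FP_mat.shape)                             # (m, p)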
“…Different metrics are used as FP in the literature, such as the coefficient estimate of a feature [15,16], goodness of fit [18], feature importance [28], and feature significance [21]. For illustration purposes, feature importance α is used as FP. Since any single model uses only l ≤ p of the features, many cells in FP_mat will not have an α value.…”
Section: FP_mat = [FP_ij | i ∈ {1 … k}, j ∈ {1 … p}]  (2) (mentioning)
confidence: 99%
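The excerpt notes that each model uses only l ≤ p features, so many cells of FP_mat (Eq. 2) stay empty. A short sketch of that situation, assuming L1-penalized logistic regression as each model and NaN for unused cells (both assumptions, not necessarily the cited paper's choices):

# Each of k models keeps only a subset of the p features, so FP_mat is filled
# with NaN wherever feature j does not appear in model i (the empty cells the
# excerpt refers to). Illustrative sketch only.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.utils import resample

X, y = make_classification(n_samples=150, n_features=40, n_informative=6,
                           random_state=2)
k, p = 8, X.shape[1]
FP_mat = np.full((k, p), np.nan)                # FP_ij undefined until feature j is used

for i in range(k):
    Xb, yb = resample(X, y, random_state=i)
    model = LogisticRegression(penalty="l1", solver="liblinear",
                               C=0.2, max_iter=2000).fit(Xb, yb)
    coefs = np.abs(model.coef_.ravel())
    used = coefs > 1e-8                         # features model i actually kept
    FP_mat[i, used] = coefs[used]               # α value only where defined

print("empty cells:", np.isnan(FP_mat).mean()) # large when l << p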
“…These two algorithms are suitable for the resampling and feature selection of high-dimensional data with many features [30,31]. Moreover, both algorithms have been applied successfully in disease diagnosis and image processing [32,33]. One of the goals of this study was to determine whether ML techniques can improve the selection of spectral bands.…”
Section: Introduction (mentioning)
confidence: 99%