2021
DOI: 10.1088/1742-6596/1879/3/032014
Regression shrinkage and selection variables via an adaptive elastic net model

Abstract: In this paper, a new variable selection method is presented to select essential variables from large datasets. The new model is a modified version of the Elastic Net model. The modified Elastic Net variable selection model is summarized in an algorithm and applied to the Leukemia dataset, which has 3051 variables (genes) and 72 samples. In practice, a dataset of this size is difficult to work with directly. The modified model is compared to several standard variable selection methods. …
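As a rough illustration of the workflow described in the abstract (a minimal sketch using scikit-learn's standard cross-validated Elastic Net, not the modified/adaptive Elastic Net proposed in the paper; the data below are simulated, with dimensions chosen only to mirror the 3051-variable, 72-sample Leukemia setting):

# Sketch: standard Elastic Net variable selection on a p >> n problem (simulated data).
# This is NOT the paper's modified/adaptive Elastic Net; it only illustrates the task.
import numpy as np
from sklearn.linear_model import ElasticNetCV

rng = np.random.default_rng(0)
n_samples, n_variables = 72, 3051          # sizes mirror the Leukemia dataset
X = rng.standard_normal((n_samples, n_variables))

# Assume only a handful of variables truly drive the response.
beta = np.zeros(n_variables)
beta[[5, 40, 200, 1500]] = [2.0, -1.5, 1.0, 2.5]
y = X @ beta + 0.5 * rng.standard_normal(n_samples)

# l1_ratio balances the L1 (Lasso) and L2 (Ridge) penalties; the overall
# penalty strength alpha is chosen by cross-validation.
model = ElasticNetCV(l1_ratio=[0.2, 0.5, 0.8, 0.95], cv=5, max_iter=10000)
model.fit(X, y)

selected = np.flatnonzero(model.coef_)
print(f"alpha={model.alpha_:.4f}, l1_ratio={model.l1_ratio_}")
print(f"variables kept (non-zero coefficients): {selected.size} of {n_variables}")

A run of this kind reduces the thousands of candidate variables to a much smaller set with non-zero coefficients, which is the variable-selection behaviour the paper's modified model addresses.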

Cited by 4 publications (6 citation statements) | References 10 publications
“…This is particularly useful for models that benefit from variable reduction/selection. Similar to Ridge, the strength of the regularization is controlled by a hyperparameter, λ. Lasso regression satisfies the next optimization problem [34].…”
Section: Methods
confidence: 99%
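The optimization problem referred to above is not reproduced in the excerpt; for context, the textbook Lasso formulation (standard notation, which may differ from that of reference [34] in the citing paper) is

    \hat{\beta}^{\text{lasso}} = \arg\min_{\beta} \sum_{i=1}^{n} \Bigl( y_i - \beta_0 - \sum_{j=1}^{p} x_{ij}\beta_j \Bigr)^{2} + \lambda \sum_{j=1}^{p} \lvert \beta_j \rvert ,

where \lambda \ge 0 is the hyperparameter controlling the strength of the shrinkage; larger values of \lambda drive more coefficients exactly to zero.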
“…The General Linear Model (GLM) is one of the most widely used models in various fields of statistical analysis, and the Ordinary Least Squares (OLS) method is one of the most common methods for estimating the parameters of the general linear model, as follows: [7,12,18]…”
Section: Methods, 2.1 General Linear Model
confidence: 99%
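In standard notation, the general linear model and OLS estimator referred to in this excerpt are (a generic statement of the well-known result, not necessarily the exact form given in references [7, 12, 18]):

    y = X\beta + \varepsilon, \qquad \hat{\beta}_{\text{OLS}} = (X^{\top}X)^{-1} X^{\top} y ,

where X is the n \times p design matrix. The inverse exists only when X^{\top}X has full rank, which fails under severe multicollinearity or when p > n; this is precisely the situation that motivates shrinkage estimators such as Ridge, Lasso, and the Elastic Net.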
“…Therefore, based on this constraint, Lasso regression shrinks the regression parameters, setting some of them exactly to zero, while variables whose coefficients remain non-zero after shrinkage are retained as part of the model. This contributes to minimizing the prediction error and thus preserves the good features of both the stepwise selection methodology and the ridge regression (RR) method. The method is of great importance for dealing with the problem of multicollinearity between explanatory variables [7,8,11].…”
Section: Least Absolute Shrinkage and Selection Operator Estimator (Lasso)
confidence: 99%
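A small, self-contained illustration of the shrinkage-to-zero behaviour described above (simulated data and scikit-learn's standard Lasso, used here only as a sketch; it is not taken from the cited papers):

# Sketch: Lasso under multicollinearity (simulated, illustrative only).
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(1)
n = 200
x1 = rng.standard_normal(n)
x2 = x1 + 0.01 * rng.standard_normal(n)   # x2 is almost a copy of x1
x3 = rng.standard_normal(n)               # an independent predictor
X = np.column_stack([x1, x2, x3])
y = 3.0 * x1 + 2.0 * x3 + 0.1 * rng.standard_normal(n)

ols = LinearRegression().fit(X, y)
lasso = Lasso(alpha=0.1).fit(X, y)

# OLS splits the shared signal unstably between the collinear pair (x1, x2);
# Lasso typically keeps one of the two and sets the other exactly to zero.
print("OLS coefficients:  ", np.round(ols.coef_, 3))
print("Lasso coefficients:", np.round(lasso.coef_, 3))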