2020
DOI: 10.1146/annurev-statistics-030718-105038

A Survey of Tuning Parameter Selection for High-Dimensional Regression

Abstract: Penalized (or regularized) regression, as represented by the Lasso and its variants, has become a standard technique for analyzing high-dimensional data when the number of variables substantially exceeds the sample size. The performance of penalized regression relies crucially on the choice of the tuning parameter, which determines the amount of regularization and hence the sparsity level of the fitted model. The optimal choice of tuning parameter depends on both the structure of the design matrix and the unknown r…
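
The tuning-parameter problem the abstract describes can be made concrete with a minimal sketch (illustrative, not from the paper): fit the Lasso over a grid of λ values on simulated sparse data and pick one by cross-validation. The sketch below uses scikit-learn's LassoCV; all dimensions and settings are assumptions chosen for illustration.

```python
# A minimal sketch of tuning-parameter selection for the Lasso by
# cross-validation. Data are simulated; n << p is the high-dimensional regime.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n, p, s = 100, 500, 5                 # sample size, dimension, sparsity
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 2.0                        # sparse true coefficient vector
y = X @ beta + rng.standard_normal(n)

# Cross-validation over a grid of tuning parameters (sklearn calls them alphas)
cv_fit = LassoCV(cv=5).fit(X, y)
print("lambda chosen by 5-fold CV:", cv_fit.alpha_)
print("nonzero coefficients:", int(np.sum(cv_fit.coef_ != 0)))
```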


Cited by 31 publications (10 citation statements) · References 70 publications
“…There still exists an important gap between the theory and practice of the Lasso. See also Wu and Wang (2020) for a recent survey on tuning parameter selection for high-dimensional regression.…”
Section: Introduction
mentioning
confidence: 99%
“…from data splitting) into the portfolio construction. One can also find a similar discussion in a survey on tuning parameter selection for regression models in [61]. In the context of regression, [55] shows that CV is asymptotically equivalent to the Akaike information criterion (AIC).…”
mentioning
confidence: 76%
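
The CV-versus-AIC comparison in this statement can be illustrated with a short sketch (not from [55] or [61]): select λ on the same simulated data by cross-validation and by AIC, and compare. scikit-learn's LassoLarsIC performs AIC/BIC selection along the LARS path; this sketch keeps n > p because recent scikit-learn versions need a supplied noise variance when p ≥ n.

```python
# Sketch: compare lambda chosen by cross-validation vs. by AIC.
# Illustrative only; data and dimensions are assumptions.
import numpy as np
from sklearn.linear_model import LassoCV, LassoLarsIC

rng = np.random.default_rng(1)
n, p = 200, 50                        # n > p so the noise variance is estimable
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 1.5
y = X @ beta + rng.standard_normal(n)

lam_cv = LassoCV(cv=5).fit(X, y).alpha_                   # cross-validation
lam_aic = LassoLarsIC(criterion="aic").fit(X, y).alpha_   # AIC on the LARS path
print(f"CV lambda:  {lam_cv:.4f}")
print(f"AIC lambda: {lam_aic:.4f}")
```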
“…One of the core challenges in high-dimensional penalized regression is determining a suitable regularization parameter λ that trades off sparsity (i.e., interpretability) of the model coefficients and out-of-sample predictive performance of the model [66,67]. Standard procedures for (hierarchical) interaction models include cross-validation [53] and information criteria, including the Akaike (AIC) and the extended Bayesian information criterion (BIC) [55].…”
Section: Stability-Based Model Selection for Hierarchical Interactions
mentioning
confidence: 99%
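
The extended BIC mentioned here penalizes model size more heavily than BIC to account for the large model space when p is big. A minimal sketch, assuming Gaussian errors and the common form EBIC(λ) = n·log(RSS/n) + df·log n + 2γ·df·log p (Chen and Chen 2008) with an illustrative γ = 0.5, evaluated along a scikit-learn Lasso path; this is not the procedure of [55].

```python
# Sketch: pick lambda by the extended BIC (EBIC) along a Lasso path.
# Assumed formula: EBIC = n*log(RSS/n) + df*log(n) + 2*gamma*df*log(p).
import numpy as np
from sklearn.linear_model import lasso_path

rng = np.random.default_rng(2)
n, p, gamma = 100, 300, 0.5
X = rng.standard_normal((n, p))           # zero-mean data; no intercept needed
beta = np.zeros(p)
beta[:4] = 2.0
y = X @ beta + rng.standard_normal(n)

alphas, coefs, _ = lasso_path(X, y)       # coefs has shape (p, n_alphas)
rss = ((y[:, None] - X @ coefs) ** 2).sum(axis=0)
df = (coefs != 0).sum(axis=0)             # number of selected variables
ebic = n * np.log(rss / n) + df * np.log(n) + 2 * gamma * df * np.log(p)
best = int(np.argmin(ebic))
print("EBIC-selected lambda:", alphas[best], "with", df[best], "variables")
```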