2008
DOI: 10.1111/j.1467-9868.2008.00693.x
Shrinkage Tuning Parameter Selection with a Diverging Number of Parameters

Abstract: Contemporary statistical research frequently deals with problems involving a diverging number of parameters. For those problems, various shrinkage methods (e.g. the lasso and smoothly clipped absolute deviation) are found to be particularly useful for variable selection. Nevertheless, the desirable performance of those shrinkage methods hinges heavily on an appropriate selection of the tuning parameters. With a fixed predictor dimension, Wang and co-workers have demonstrated that the tuning parameters selected …
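The abstract's central point, that a shrinkage estimator such as the lasso is only as good as its tuning parameter, can be illustrated with a small sketch. The code below is not the paper's procedure; it is a minimal, self-contained example in which a hand-rolled lasso coordinate descent is fit over a grid of penalty levels λ and the level is chosen by a BIC-type criterion. All function names, the candidate grid, and the simulated data are illustrative.

```python
import numpy as np

def soft_threshold(z, t):
    # Soft-thresholding operator used by lasso coordinate descent.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_sweeps=200):
    """Plain coordinate descent for (1/2n)||y - Xb||^2 + lam*||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_sweeps):
        for j in range(p):
            # Partial residual with coordinate j removed.
            r_j = y - X @ beta + X[:, j] * beta[j]
            beta[j] = soft_threshold(X[:, j] @ r_j, n * lam) / col_sq[j]
    return beta

def bic_select(X, y, lambdas):
    """Pick lambda minimizing a BIC-type criterion:
    log(RSS/n) + df * log(n)/n, with df = number of nonzero coefficients."""
    n = X.shape[0]
    best = None
    for lam in lambdas:
        beta = lasso_cd(X, y, lam)
        rss = ((y - X @ beta) ** 2).sum()
        df = int((beta != 0).sum())
        bic = np.log(rss / n) + df * np.log(n) / n
        if best is None or bic < best[0]:
            best = (bic, lam, beta)
    return best

# Toy data: 3 true signals out of 10 predictors.
rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.standard_normal((n, p))
true_beta = np.zeros(p)
true_beta[:3] = [3.0, -2.0, 1.5]
y = X @ true_beta + 0.5 * rng.standard_normal(n)

bic, lam, beta = bic_select(X, y, [0.01, 0.05, 0.1, 0.5, 1.0])
print(lam, np.flatnonzero(beta))
```

The point of the sketch is the one the paper makes rigorous: the sparsity of the selected model is driven entirely by λ, so a criterion that chooses λ consistently is what makes the shrinkage method itself consistent for variable selection.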

Cited by 379 publications (254 citation statements). References 18 publications.
“…Let Ŝ_c and Ŝ_v be the estimated index sets of S_c and S_v, respectively. Then, repeating (10) and (11) … Another key problem in the implementation is the choice of the tuning parameters λ1, λ2 and λ3, for which many approaches have been developed; see, e.g., Wang et al. (2008) and Wang et al. (2009). We use the Bayesian information criterion (BIC) for its good finite-sample performance (see, e.g.…”
Section: Model Identification
confidence: 99%
“…A recent line of research on variable selection has also undergone rapid development; see, e.g., Fan & Li (2001), Wang et al. (2009), Ma et al. (2013). Among these studies, a working correlation model for the variance is often assumed.…”
Section: Introduction
confidence: 99%
“…For least-squares shrinkage methods, generalized cross-validation (GCV) and information criteria such as the Akaike information criterion (AIC) and the Bayesian information criterion (BIC) are often used. While the GCV and the AIC cannot identify the true model consistently, the BIC can (Wang and Leng 2007; Wang et al. 2009). Zhang et al. (2010) propose a general information criterion (GIC) that can nest the AIC and the BIC, and show its consistency in model selection.…”
Section: Introduction
confidence: 99%
“…Bogdan et al (2004) considered a criterion called modified BIC (mBIC) for QTL mapping models. Wang et al (2009) studied another modified BIC for models with a diverging number of parameters. Chen and Chen (2008) extended the original BIC to a family called extended BIC (EBIC) governed by a parameter γ.…”
Section: Introductionmentioning
confidence: 99%
“…The criterion considered by Wang et al. (2009) modifies the original BIC by multiplying the second term of the BIC by a diverging factor, and is somewhat ad hoc. To achieve selection consistency, it requires p/n^ξ < 1 for some 0 < ξ < 1, and hence is not applicable when p > n. The mBIC and EBIC considered by Bogdan et al. (2004) and Chen and Chen (2008), respectively, are developed from a Bayesian framework.…”
Section: Introduction
confidence: 99%
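The last quoted statement describes the Wang et al. (2009) modification in words. In symbols, a sketch using standard notation (SSE_λ denotes the residual sum of squares of the fit at tuning level λ, and df_λ its number of nonzero coefficients; the exact diverging sequence C_n varies across papers):

```latex
% Ordinary BIC for the model selected at tuning level \lambda:
\mathrm{BIC}_\lambda
  = \log\!\left(\frac{\mathrm{SSE}_\lambda}{n}\right)
  + \mathrm{df}_\lambda \,\frac{\log n}{n}.

% Modified BIC: the second (penalty) term is inflated by a sequence
% C_n \to \infty, which is what the citing paper calls multiplying
% the second term by a diverging factor:
\mathrm{BIC}_\lambda
  = \log\!\left(\frac{\mathrm{SSE}_\lambda}{n}\right)
  + \mathrm{df}_\lambda \,\frac{\log n}{n}\, C_n,
  \qquad C_n \to \infty.
```

The inflated penalty is what restores selection consistency when the number of parameters diverges with n; with C_n ≡ 1 the display reduces to the ordinary BIC, which is consistent only for fixed dimension.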