2011
DOI: 10.1198/jasa.2011.tm10159
Bootstrapping Lasso Estimators

Abstract: In this article, we consider bootstrapping the Lasso estimator of the regression parameter in a multiple linear regression model. It is known that the standard bootstrap method fails to be consistent. Here, we propose a modified bootstrap method, and show that it provides valid approximation to the distribution of the Lasso estimator, for all possible values of the unknown regression parameter vector, including the case where some of the components are zero. Further, we establish consistency of the modified bo…
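The abstract describes a residual-bootstrap approach to approximating the sampling distribution of the Lasso estimator. The following is a minimal illustrative sketch of a plain residual bootstrap for the Lasso on simulated data; it is not the authors' modified procedure (which additionally thresholds the Lasso estimator before generating bootstrap responses), and all data, penalty level, and replication counts here are invented for illustration.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Simulated data: n observations, p predictors, sparse true coefficients.
n, p = 100, 5
X = rng.standard_normal((n, p))
beta_true = np.array([2.0, 0.0, -1.5, 0.0, 0.0])
y = X @ beta_true + rng.standard_normal(n)

# Fit the Lasso once on the observed data.
lasso = Lasso(alpha=0.1)
lasso.fit(X, y)
beta_hat = lasso.coef_.copy()

# Residual bootstrap: resample centered residuals and refit the Lasso
# on each synthetic response vector.
resid = y - lasso.predict(X)
resid = resid - resid.mean()

B = 200
boot_betas = np.empty((B, p))
for b in range(B):
    y_star = X @ beta_hat + rng.choice(resid, size=n, replace=True)
    boot_betas[b] = Lasso(alpha=0.1).fit(X, y_star).coef_

# Bootstrap standard errors for each coefficient.
se = boot_betas.std(axis=0)
print(se)
```

As the paper shows, this naive scheme is inconsistent when some true coefficients are exactly zero; the modified method repairs this by thresholding small components of the Lasso estimate to zero before drawing bootstrap samples.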

Cited by 245 publications (235 citation statements)
References 24 publications
“…All of these procedures yield qualitatively similar conclusions. Note that we do not report results for a standard nonparametric bootstrap; the standard nonparametric bootstrap is known to be invalid for lasso regression (Chatterjee and Lahiri 2011).…”
Section: Inference
confidence: 99%
“…Methods such as adaptive Lasso and bootstrapping Lasso have been proposed for these cases [1], [3], [13], [15]. Building on work done for the linear model in Equation 1 [4], [9], in this paper, we show that Lasso is consistent when only part of the underlying function is linear as p and n grow large.…”
Section: Introduction
confidence: 93%
“…The Adaptive Lasso ensures that the bootstrap (Section 3.3) is consistent (Chatterjee and Lahiri, 2011). We take the weights of the Adaptive Lasso…”
Section: Penalized Maximum Likelihood Estimation
confidence: 99%
“…Recently, a small but growing literature on inference in penalized regression models for cross-sectional data has arisen, such as Wasserman and Roeder (2009), Meinshausen et al (2009) and Chatterjee and Lahiri (2011). We extend the residual bootstrap procedure of Chatterjee and Lahiri (2011) to high-dimensional time series data.…”
Section: Introduction
confidence: 99%