2016
DOI: 10.1111/rssb.12184

Confidence Intervals and Regions for the Lasso by Using Stochastic Variational Inequality Techniques in Optimization

Abstract: Sparse regression techniques have been popular in recent years because of their ability to handle high-dimensional data with built-in variable selection. The lasso is perhaps the best-known example. Despite intensive work in this direction, how to provide valid inference for sparse regularized methods remains a challenging statistical problem. We take a unique point of view of this problem and propose to make use of stochastic variational inequality techniques in optimization to derive confidence…
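As context for the abstract, a minimal sketch of the lasso's built-in variable selection, using scikit-learn on simulated data (the design, true coefficients, and penalty level `alpha` are illustrative assumptions, not from the paper):

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 100, 20
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]  # only 3 truly active coefficients
y = X @ beta + 0.5 * rng.standard_normal(n)

# The l1 penalty shrinks many coefficients exactly to zero,
# so variable selection happens as part of the fit itself.
model = Lasso(alpha=0.1).fit(X, y)
selected = np.flatnonzero(model.coef_)
print(selected)
```

The paper's contribution concerns inference *after* such a fit: the sparsity that makes the estimator attractive also makes its limiting distribution nonstandard, which is why naive confidence intervals on `model.coef_` are not valid.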

Cited by 22 publications (23 citation statements)
References 26 publications
“…Zhang and Zhang (2014) achieved valid post-selection inference by adjusting the bias introduced by the ℓ1 term in the lasso. Lu et al. (2017) applied stochastic variational inequality theory in optimization to lasso inference. The lasso and its inference have also been studied in a Bayesian way.…”
Section: Lasso Inference
confidence: 99%
“…Conventional inference methods often lead to invalid post-selection inference, and it remains challenging to construct confidence sets or perform hypothesis tests on lasso-type estimators because of the complexity of their limiting distributions. In recent years, many efforts have been made to study post-selection inference for lasso estimates (Lee et al., 2013; Lockhart et al., 2014; Zhang and Zhang, 2014; Minnier et al., 2011; Chatterjee and Lahiri, 2010; Camponovo, 2015; Lu et al., 2017). Since we solve the generalized lasso problem by re-parameterizing it to a regular lasso, we can take advantage of current inference methods on lasso estimators and adapt them for use in the generalized lasso problem.…”
Section: Generalized Lasso Inference
confidence: 99%
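The re-parameterization this citing paper mentions can be sketched as follows: for a generalized lasso minimizing ||y − Xβ||² + λ||Dβ||₁ with an invertible penalty matrix D, substituting θ = Dβ yields a regular lasso in θ. A minimal sketch on simulated data (the specific D, design, and penalty level are illustrative assumptions, not taken from the citing paper):

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
n, p = 80, 10
X = rng.standard_normal((n, p))
y = X @ rng.standard_normal(p) + 0.1 * rng.standard_normal(n)

# Invertible penalty matrix D (upper-triangular, first-difference-like).
D = np.eye(p) - np.diag(np.ones(p - 1), k=1)

# With theta = D beta, the penalty ||D beta||_1 becomes a plain l1 penalty:
# minimize ||y - (X D^{-1}) theta||^2 + lam * ||theta||_1.
D_inv = np.linalg.inv(D)
lasso = Lasso(alpha=0.05, fit_intercept=False).fit(X @ D_inv, y)
beta_hat = D_inv @ lasso.coef_  # map the solution back to the original beta
print(beta_hat.shape)
```

Because the transformed problem is an ordinary lasso in θ, existing lasso inference methods (such as the stochastic variational inequality approach of this paper) can be applied in the θ parameterization and then mapped back through D⁻¹.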
“…5.6]), at least when problem relaxations can be solved to near global optimality. Validation approaches based on optimality conditions are found in [19, 43, 32, 26, 25] and [42, Sect. 5.6].
Section: Introduction
confidence: 99%
“…proposed a general method to construct asymptotically uniformly valid CIs post-model-selection using the principles in Berk et al. (2013). Lee et al. (2016) developed a general approach for valid inference after model selection by characterizing the distribution of the post-selection estimator conditional on the selection event. Lu et al. (2017) investigated the CI problem from a different point of view, and they used stochastic variational inequality techniques in optimization to derive CIs for the LASSO estimator.…”
confidence: 99%