2021
DOI: 10.1080/01621459.2020.1847121

Variational Bayes for High-Dimensional Linear Regression With Sparse Priors

Abstract: We study a mean-field spike and slab variational Bayes (VB) approximation to Bayesian model selection priors in sparse high-dimensional linear regression. Under compatibility conditions on the design matrix, oracle inequalities are derived for the mean-field VB approximation, implying that it converges to the sparse truth at the optimal rate and gives optimal prediction of the response vector. The empirical performance of our algorithm is studied, showing that it works comparably well as other state-of-the-art…
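As a rough illustration of the mean-field approximation studied in the paper, the following is a minimal coordinate-ascent VB (CAVI) sketch for spike-and-slab linear regression. It is a sketch under simplifying assumptions, not the authors' implementation: it uses a Gaussian slab for tractability (the paper's prior uses a Laplace slab), fixed noise and slab variances, and a fixed prior inclusion weight `w`; all names and settings below are illustrative.

```python
import numpy as np

def cavi_spike_slab(X, y, sigma2=1.0, slab_var=1.0, w=0.1, n_iter=100):
    """Coordinate-ascent mean-field VB for spike-and-slab linear regression.

    Variational family: q(beta_j) = gamma_j * N(mu_j, s2_j) + (1 - gamma_j) * delta_0.
    Gaussian slab used here for closed-form updates (the paper uses a Laplace slab).
    """
    n, p = X.shape
    xtx = (X ** 2).sum(axis=0)                  # X_j' X_j per column
    mu = np.zeros(p)                            # slab means
    gamma = np.full(p, w)                       # posterior inclusion probabilities
    s2 = sigma2 / (xtx + sigma2 / slab_var)     # slab variances (fixed given X)
    r = X @ (gamma * mu)                        # current fitted values
    for _ in range(n_iter):
        for j in range(p):
            r -= X[:, j] * (gamma[j] * mu[j])   # remove coordinate j's contribution
            mu[j] = s2[j] / sigma2 * (X[:, j] @ (y - r))
            # Log-odds of inclusion: prior odds + Bayes-factor-style correction
            logit = (np.log(w / (1 - w))
                     + 0.5 * np.log(s2[j] / slab_var)
                     + mu[j] ** 2 / (2 * s2[j]))
            gamma[j] = 1.0 / (1.0 + np.exp(-logit))
            r += X[:, j] * (gamma[j] * mu[j])   # add back updated contribution
    return gamma, mu
```

On a simulated design with a few strong coefficients, the inclusion probabilities `gamma` concentrate near one on the true support and near zero elsewhere, which is the variable-selection behavior the excerpts below discuss.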

Cited by 48 publications (41 citation statements). References 41 publications.
“…Nevertheless, what is always convincing is a comparison with more recent methodologies for coping with high-dimensional scenarios. One such method is the proposal in [29], sparsevb, which replaces the actual posterior distribution with a variational approximation that allows much faster computation. In their implementation, the authors use a spike-and-slab prior based on the Laplace density for the regression parameters and a prior over the model space that induces sparsity (favouring simpler models).…”
Section: Implementation of the Bayesian Approach and Results
Citation type: mentioning
confidence: 99%
“…In their implementation, the authors use a spike-and-slab prior based on the Laplace density for the regression parameters and a prior over the model space that induces sparsity (favouring simpler models). Our main interest in the paper [29] is their comparison with a large number of competing Bayesian methods conceived for variable selection in high-dimensional settings. In particular, they consider varbvs, a related variational Bayes procedure whose spike-and-slab prior uses the Gaussian density.…”
Section: Implementation of the Bayesian Approach and Results
Citation type: mentioning
confidence: 99%
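A quick way to see the difference between the two slab choices contrasted in this excerpt (Laplace in sparsevb, Gaussian in varbvs) is to compare their tail mass. The densities and unit scales below are illustrative choices, not taken from either implementation:

```python
import numpy as np

# Illustrative slab densities at unit scale.
laplace = lambda b, lam=1.0: lam / 2 * np.exp(-lam * np.abs(b))       # Laplace(0, 1/lam)
gauss = lambda b, s=1.0: np.exp(-b**2 / (2 * s**2)) / np.sqrt(2 * np.pi * s**2)

# The Laplace slab has heavier tails: far from the origin it places
# substantially more mass, so large coefficients are shrunk less.
ratio = laplace(4.0) / gauss(4.0)
print(ratio)
```

The heavier Laplace tails are one reason a Laplace-slab prior penalizes large signals less aggressively than a Gaussian slab at the same scale.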
“…It is worth noting that, by direct SNP filtering using epigenetic annotations, both “elnt.annot” and “vb.annot” had many fewer genes with high-quality imputation models, implying that stringent SNP categorization might lead to a loss of power in detecting genes with considerable expression components explained by cis-eQTL. Compared to the elastic net methods (elnt and elnt.annot), the variational Bayes methods (both vb and T-GEN) imputed genes better, which is partly attributable to their improved variable selection performance [ 37 ].…”
Section: Results
Citation type: mentioning
confidence: 99%
“…Until now, it has been difficult to compute our MCMC methods when the dimension is high. Recently, the variational Bayesian method has received attention [e.g., Ray and Szabó (2020) and Wang and Blei (2019)] as an alternative to MCMC. In addition, Johndrow et al. (2020) propose speeding up MCMC by improving the conventional sampling method.…”
Section: Discussion
Citation type: mentioning
confidence: 99%