2017
DOI: 10.1534/genetics.116.192195

The Spike-and-Slab Lasso Generalized Linear Models for Prediction and Associated Genes Detection

Abstract: Large-scale "omics" data have been increasingly used as an important resource for prognostic prediction of diseases and detection of associated genes. However, there are considerable challenges in analyzing high-dimensional molecular data, including the large number of potential molecular predictors, limited number of samples, and small effect of each predictor. We propose new Bayesian hierarchical generalized linear models, called spike-and-slab lasso GLMs, for prognostic prediction and detection of associated…
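
The spike-and-slab lasso prior named in the abstract is usually formulated as a two-component mixture of double-exponential (Laplace) distributions: a sharply peaked "spike" that shrinks small coefficients to zero and a diffuse "slab" that leaves large effects essentially unpenalized. The sketch below illustrates that mixture form and the conditional inclusion probability it implies; the scale values s0 and s1 and the mixing weight theta are illustrative placeholders, not values taken from the paper.

```python
import numpy as np

def de_pdf(beta, scale):
    """Double-exponential (Laplace) density with mean 0 and the given scale."""
    return np.exp(-np.abs(beta) / scale) / (2.0 * scale)

def ss_lasso_prior(beta, s0=0.03, s1=0.5, theta=0.1):
    """Spike-and-slab lasso prior: a two-component mixture of Laplace densities.

    s0: small scale ("spike") -> strong shrinkage toward exactly zero
    s1: large scale ("slab")  -> weak shrinkage, leaves large effects intact
    theta: prior inclusion probability P(gamma_j = 1); all values illustrative
    """
    return (1.0 - theta) * de_pdf(beta, s0) + theta * de_pdf(beta, s1)

def inclusion_prob(beta, s0=0.03, s1=0.5, theta=0.1):
    """Conditional probability that a coefficient comes from the slab,
    P(gamma_j = 1 | beta_j), which governs how much shrinkage it receives."""
    slab = theta * de_pdf(beta, s1)
    spike = (1.0 - theta) * de_pdf(beta, s0)
    return slab / (slab + spike)

if __name__ == "__main__":
    for b in (0.0, 0.05, 0.3, 1.0):
        print(f"beta={b:4.2f}  prior={ss_lasso_prior(b):8.4f}  "
              f"P(slab|beta)={inclusion_prob(b):.3f}")
```

Running the sketch shows the intended behavior: near zero the spike dominates and P(slab|beta) is tiny, while for larger coefficients P(slab|beta) approaches one, so the prior shrinks them only weakly.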

Cited by 51 publications (47 citation statements) | References 44 publications (52 reference statements)
“…In contrast to the minimal-optimal problem, the all-relevant problem is usually motivated by the need to identify features that are "significant" to the target variable [17], either in order to further investigate their dependencies, e.g. to find exploratory directions in gene micro-array research [23,24], or in order to enable a more interactive model design process, e.g. to design classifiers that take into account expert knowledge and the costs of acquiring each feature.…”
Section: All-relevant (mentioning)
confidence: 99%
“…Besides, to shrink the coefficients of unimportant linear and nonlinear effects exactly to zero, we adopt spike-and-slab priors in our model. Spike-and-slab priors have recently been shown to be effective when incorporated into a Bayesian hierarchical framework for penalization methods, including the spike-and-slab LASSO [20,21], the Bayesian fused LASSO [22], and the Bayesian sparse group LASSO [23]. This yields sparsity in the sense of exact-zero posterior estimates, which are not available in Bayesian-LASSO-type shrinkage methods, including that of Li et al [18]. Motivated by the pressing need to conduct efficient Bayesian G×E interaction studies that account for nonlinear interaction effects, the proposed semiparametric model advances significantly beyond existing Bayesian variable selection methods for G×E interactions in the following aspects.…”
Section: Introduction (mentioning)
confidence: 99%
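
The exact-zero behavior mentioned in the excerpt above arises because the spike-and-slab lasso induces a coefficient-specific L1 penalty: coefficients judged likely to come from the spike are penalized at the strong rate 1/s0 and thresholded exactly to zero, while coefficients likely to come from the slab are penalized at the much weaker rate 1/s1. The sketch below, assuming a standardized single-coordinate update with illustrative values for s0, s1, theta, and the scaling factor, shows this adaptive soft-thresholding; it is an illustration of the shrinkage rule, not the paper's fitting algorithm.

```python
import numpy as np

def de_pdf(beta, scale):
    """Double-exponential (Laplace) density with mean 0 and the given scale."""
    return np.exp(-np.abs(beta) / scale) / (2.0 * scale)

def adaptive_lambda(beta, s0=0.03, s1=0.5, theta=0.1):
    """Coefficient-specific L1 penalty rate implied by the mixture prior:
    a blend of the strong spike rate 1/s0 and the weak slab rate 1/s1,
    weighted by p = P(slab | beta). All numeric defaults are illustrative."""
    slab = theta * de_pdf(beta, s1)
    spike = (1.0 - theta) * de_pdf(beta, s0)
    p = slab / (slab + spike)
    return p / s1 + (1.0 - p) / s0

def soft_threshold(z, lam):
    """Soft-thresholding operator; returns exactly 0 when |z| <= lam."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

if __name__ == "__main__":
    step = 0.01  # illustrative scaling of the penalty into the update (e.g. noise variance / n)
    for z in (0.02, 0.10, 0.50, 1.50):   # unpenalized single-coordinate estimates
        lam = adaptive_lambda(z) * step
        print(f"z={z:5.2f}  penalty={lam:6.3f}  shrunk estimate={soft_threshold(z, lam): .3f}")
```

Small estimates are driven exactly to zero (unlike a Bayesian lasso posterior mean, which never hits zero), while large estimates are left nearly untouched.
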
“…These methods include elastic nets [30], which add an additional ridge-regression-like penalty to improve performance when the number of genetic variants is much larger than the sample size. Another approach is the spike-and-slab Lasso (ssLasso) method [23], which addresses the limitations of the Lasso by imposing weak or no shrinkage on related features while imposing strong shrinkage on unrelated features. Bayesian methods have also been proposed for phenotype prediction, including hybrids of LMMs and sparse regression models such as the Bayesian Lasso [17] and the Bayesian sparse linear mixed model (BSLMM) [29].…”
Section: Introduction (mentioning)
confidence: 99%
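
The excerpt above notes that the elastic net adds a ridge-like penalty on top of the lasso penalty. In the standard coordinate-descent formulation with standardized predictors, that extra quadratic term simply rescales the soft-thresholded update, which stabilizes estimates when predictors are numerous and correlated. The minimal sketch below shows that update rule; the penalty values are illustrative.

```python
import numpy as np

def soft_threshold(z, lam):
    """Soft-thresholding operator used by the lasso part of the penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def elastic_net_update(z, lam, alpha):
    """One coordinate-descent update for the elastic net with standardized
    predictors: the lasso part (alpha * lam) soft-thresholds the partial
    residual correlation z, and the ridge part ((1 - alpha) * lam) shrinks
    the result by the extra factor 1 / (1 + lam * (1 - alpha))."""
    return soft_threshold(z, lam * alpha) / (1.0 + lam * (1.0 - alpha))

if __name__ == "__main__":
    z = 0.8            # partial residual correlation for one predictor (illustrative)
    lam = 0.3          # overall penalty strength (illustrative)
    for alpha in (1.0, 0.5, 0.0):   # 1.0 = pure lasso, 0.0 = pure ridge
        print(f"alpha={alpha:.1f}  beta_hat={elastic_net_update(z, lam, alpha):.3f}")
```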