2007
DOI: 10.1198/016214507000000121
Shotgun Stochastic Search for “Large p” Regression

Cited by 203 publications (192 citation statements). References: 1 publication.
“…Here, we extend iBMA to linear regression by rank ordering putative regulators using the coefficient of determination (R²) from single-variable models and then iteratively applying BMA to the top-ranked genes, removing variables with low posterior probabilities (see Materials and Methods). Other methods for implementing BMA for linear regression with high-dimensional data have also been proposed more recently (25)(26)(27)(28)(29).…”
Section: Results
confidence: 99%
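The excerpt above describes the iterative BMA procedure only in words. Below is a minimal Python sketch of that idea, using a BIC-weighted enumeration over a small window of top-ranked predictors as a stand-in for exact posterior model probabilities; the function names, the window size of 10, and the 0.1 inclusion threshold are illustrative assumptions rather than details taken from the cited work.

```python
# Hedged sketch of the iterative BMA idea summarized above: rank candidate
# predictors by single-variable R^2, then sweep through them, repeatedly
# applying BMA over a small window and dropping predictors with low posterior
# inclusion probability. BIC-weighted enumeration stands in for exact model
# probabilities; all names and thresholds here are illustrative.
import itertools
import numpy as np

def single_variable_r2(X, y):
    """R^2 of each one-predictor least-squares fit (with intercept)."""
    tss = np.sum((y - y.mean()) ** 2)
    r2 = np.empty(X.shape[1])
    for j in range(X.shape[1]):
        A = np.column_stack([np.ones(len(y)), X[:, j]])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        r2[j] = 1.0 - np.sum((y - A @ beta) ** 2) / tss
    return r2

def bma_inclusion_probs(X, y, cols):
    """Approximate posterior inclusion probabilities from a BIC-weighted
    enumeration of all subsets of `cols` (feasible only for small windows)."""
    n = len(y)
    bics, subsets = [], []
    for k in range(len(cols) + 1):
        for subset in itertools.combinations(cols, k):
            A = np.column_stack([np.ones(n)] + [X[:, j] for j in subset])
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            rss = np.sum((y - A @ beta) ** 2)
            bics.append(n * np.log(rss / n) + (len(subset) + 1) * np.log(n))
            subsets.append(set(subset))
    w = np.exp(-0.5 * (np.array(bics) - min(bics)))
    w /= w.sum()
    return {j: sum(wi for wi, s in zip(w, subsets) if j in s) for j in cols}

def iterative_bma(X, y, window=10, drop_below=0.1):
    """Add predictors in decreasing single-variable R^2 order, pruning the
    active set whenever it grows beyond `window`."""
    order = np.argsort(-single_variable_r2(X, y))
    active = []
    for j in order:
        active.append(int(j))
        if len(active) > window:
            probs = bma_inclusion_probs(X, y, active)
            active = [c for c in active if probs[c] >= drop_below]
            active = sorted(active, key=lambda c: -probs[c])[:window]
    return active
```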
“…Thus, they are more flexible in avoiding 'sticky patches' in model space; see the discussion in, for example, Dellaportas and Roberts (2003). Indeed, Hans et al. (2007) point out that the chain may fail to move if it starts in a region of zero posterior probability mass. This has also been noted in further simulation studies by Petralias (2010).…”
Section: ′ ′
confidence: 99%
“…Algorithms 1 and 2 have a similar philosophy to the MCMC version of the shotgun stochastic search algorithm of Hans et al. (2007) but, instead of attempting to move directly to all models in the neighbourhood, they propose more local moves. Thus, they are more flexible in avoiding 'sticky patches' in model space; see the discussion in, for example, Dellaportas and Roberts (2003).…”
Section: ′ ′
confidence: 99%
“…Jones et al. (2005) and Hans et al. (2007) highlight this question in the context of Gaussian graphical models and regression variable selection. They introduce the shotgun stochastic search (SSS) method, which is similar to MCMC but focuses on aggressively moving towards regions of high posterior probability in the model space instead of attempting to sample from the posterior distribution over the model space.…”
Section: Introduction
confidence: 99%
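The last two excerpts describe shotgun stochastic search only at a high level. The sketch below illustrates the general shotgun-style move over regression models, pooling the add/delete/swap neighbourhood into one candidate set and scoring models by BIC in place of the exact marginal likelihoods used by Hans et al. (2007); the function names, the iteration count, and the size of the retained model list are illustrative assumptions, not code from the paper.

```python
# Simplified sketch of a shotgun-style stochastic search over regression
# models: score every model in the add/delete/swap neighbourhood of the
# current model, record the best models seen, and jump to a neighbour with
# probability proportional to its exponentiated score, so the search moves
# aggressively towards high-probability regions rather than sampling exactly.
import numpy as np

def bic_score(X, y, model):
    """Negative BIC of the least-squares fit on the given variable subset
    (higher is better); a crude stand-in for a marginal likelihood."""
    n = len(y)
    A = np.column_stack([np.ones(n)] + [X[:, j] for j in sorted(model)])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    rss = np.sum((y - A @ beta) ** 2)
    return -(n * np.log(rss / n) + (len(model) + 1) * np.log(n))

def neighbourhood(model, p):
    """All models reachable by adding, deleting, or swapping one variable."""
    model = frozenset(model)
    out_vars = [j for j in range(p) if j not in model]
    nbrs = [model | {j} for j in out_vars]                        # additions
    nbrs += [model - {j} for j in model]                          # deletions
    nbrs += [model - {j} | {k} for j in model for k in out_vars]  # swaps
    return nbrs

def shotgun_search(X, y, n_iter=200, keep=20, seed=0):
    """Return the `keep` highest-scoring models encountered during the search."""
    rng = np.random.default_rng(seed)
    p = X.shape[1]
    current = frozenset()
    best = {}  # model -> score, over everything evaluated so far
    for _ in range(n_iter):
        nbrs = neighbourhood(current, p)
        scores = np.array([bic_score(X, y, m) for m in nbrs])
        for m, s in zip(nbrs, scores):
            best[m] = s
        # move proportionally to exp(score), favouring high-probability regions
        w = np.exp(scores - scores.max())
        current = nbrs[rng.choice(len(nbrs), p=w / w.sum())]
    return sorted(best.items(), key=lambda kv: -kv[1])[:keep]
```

In the actual SSS of Hans et al. (2007) the addition, deletion, and swap neighbourhoods are handled in separate sampling stages and exact model probabilities are used; the pooled, BIC-based version above is only meant to convey the "evaluate the whole neighbourhood, then jump stochastically" idea highlighted in the excerpts.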