2011
DOI: 10.1016/j.jmp.2011.06.001

A tutorial on Bayes factor estimation with the product space method

Cited by 98 publications (104 citation statements)
References 53 publications
“…If the data provide strong evidence in favor of one of the models, the MCMC sampler rarely visits the other model, and a posterior probability ratio cannot be computed reliably from the samples. Hence, we adapted a product space method (Lodewyckx et al., 2011) that iteratively adjusts the prior model probabilities until the posterior model probabilities are roughly equal. The BF can then be computed from the known prior model probabilities and the sampled posterior model probabilities.…”
Section: Computational Modeling
confidence: 99%
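For reference, the identity this prior-adjustment trick relies on can be written out as follows (a minimal sketch in notation of our own choosing, where p(M_1) and p(M_2) are the prior model probabilities the sampler is free to tune):

\[
\mathrm{BF}_{12} \;=\; \frac{p(y \mid \mathcal{M}_1)}{p(y \mid \mathcal{M}_2)}
\;=\; \frac{p(\mathcal{M}_1 \mid y)\,/\,p(\mathcal{M}_2 \mid y)}{p(\mathcal{M}_1)\,/\,p(\mathcal{M}_2)},
\]

so once the prior model probabilities have been adjusted until both models are visited about equally often, the Bayes factor is recovered by dividing the sampled posterior odds by the known prior odds.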
“…Vandekerckhove et al. (39) show how to use posterior distributions obtained from WinBUGS and JAGS to compute Bayes factors for non-nested MPTs using importance sampling. Lodewyckx et al. (14) outline a WinBUGS implementation of the product-space method, a transdimensional MCMC approach for computing Bayes factors for nested and non-nested models. Most recently, Gronau et al. (7) provide a tutorial on bridge sampling, a new, potentially very powerful method that is under active development.…”
Section: Results
confidence: 99%
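To make the counting logic behind such transdimensional approaches concrete, here is a minimal Python sketch (our own illustration, not the WinBUGS code of Lodewyckx et al.; the function name and the simulated draws are hypothetical) that turns posterior samples of a binary model indicator into a Bayes factor:

import numpy as np

def bf_from_indicator(indicator_samples, prior_prob_m1=0.5):
    # indicator_samples: 0/1 MCMC draws, 1 meaning model 1 was visited.
    # prior_prob_m1: prior probability the sampler assigned to model 1.
    z = np.asarray(indicator_samples)
    post_m1 = z.mean()          # posterior probability of model 1, estimated by counting visits
    post_m2 = 1.0 - post_m1
    if post_m1 == 0.0 or post_m2 == 0.0:
        raise ValueError("One model was never visited; rerun with adjusted prior odds.")
    prior_odds = prior_prob_m1 / (1.0 - prior_prob_m1)
    return (post_m1 / post_m2) / prior_odds   # BF_12 = posterior odds / prior odds

# Hypothetical example: 4000 draws in which model 1 is visited about 97% of the time
# under equal prior odds gives a Bayes factor of roughly 30 in favor of model 1.
rng = np.random.default_rng(2011)
draws = rng.binomial(1, 30 / 31, size=4000)
print(bf_from_indicator(draws))

When one model dominates, the visit count for the other model becomes tiny or zero, which is exactly the failure mode the adaptive prior adjustment described in the quotation above is meant to avoid.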
“…If the participant is classified as having linear-increase learning, the model's inferences are a bounded line based on the learning-rate parameter. In each panel, the Bayes factor, derived from the posterior expectation of the z_i indicator parameters, is also shown (Lodewyckx et al., 2011). Thus, for example, the Bayes factor for participant 14 is 30 in favor of step-change learning, in which their accuracy improves from about 65% to about 90% at the sixth block, while the Bayes factor for participant 15 is 26 in favor of linear-increase learning, in which their accuracy improves from 50% to about 90% over the blocks.…”
Section: Graphical Model
confidence: 99%
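For orientation, and assuming the two models were given equal prior probabilities (an assumption on our part, in which case the Bayes factor equals the posterior odds of the indicator), the reported values correspond to posterior expectations of the indicator for the favored model of roughly 30/31 ≈ 0.97 for participant 14 and 26/27 ≈ 0.96 for participant 15:

\[
\mathrm{BF} \;=\; \frac{\mathrm{E}[z_i \mid y]}{1 - \mathrm{E}[z_i \mid y]},
\qquad \text{e.g.} \qquad \frac{0.968}{0.032} \approx 30 .
\]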