2012
DOI: 10.1177/0962280212448973
Comparing variational Bayes with Markov chain Monte Carlo for Bayesian computation in neuroimaging

Abstract: In this article, we consider methods for Bayesian computation within the context of brain imaging studies. In such studies, the complexity of the resulting data often necessitates the use of sophisticated statistical models; however, the large size of these data can pose significant challenges for model fitting. We focus specifically on the neuroelectromagnetic inverse problem in electroencephalography, which involves estimating the neural activity within the brain from electrode-level data measured across the…

Cited by 8 publications (6 citation statements); references 30 publications.
“…As such the posterior variance for VB and INLA are again listed relative to that obtained from HMC in order to determine the extent to which these approaches under-estimate or over-estimate posterior variability. For VB we see that the marginal posterior variance is under-estimated which is in line with expectations from the literature [34, 32]. INLA I and II provide measures of variability that are closer to that of HMC, while INLA III and IV tend to over-estimate the marginal posterior variance, with this over-estimation being substantial when the smaller mesh size is used.…”
Section: Simulation Studies (supporting)
confidence: 88%
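The under-estimation of posterior variance by mean-field VB noted in this statement can be seen in a textbook toy case (an illustrative sketch, not an example from the cited paper): for a bivariate Gaussian with correlation rho, the optimal mean-field factor has variance 1/Lambda_ii = 1 - rho^2, below the true marginal variance of 1.

```python
import numpy as np

# Sketch: mean-field VB applied to a bivariate Gaussian N(0, Sigma).
# The optimal mean-field factor q(x_i) is Gaussian with variance
# 1 / Lambda_ii, where Lambda = inv(Sigma) is the precision matrix;
# with unit marginal variances this equals 1 - rho**2 < 1.
rho = 0.9
Sigma = np.array([[1.0, rho], [rho, 1.0]])
Lambda = np.linalg.inv(Sigma)

vb_var = 1.0 / Lambda[0, 0]   # variance of the optimal mean-field factor
true_var = Sigma[0, 0]        # true marginal variance

print(f"VB marginal variance: {vb_var:.3f}  (true: {true_var:.1f})")
```

The stronger the posterior correlation, the more severe the shrinkage, which matches the pattern the quoted simulation study attributes to VB relative to HMC.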
“…The posterior variance statistics obtained from VB are also fairly close to those obtained from HMC, with slightly larger values for the former. Thus the over-confidence problem sometimes associated with VB ([21], [4]) does not seem to be an issue in this case. Both algorithms are performing well in terms of point estimation as they achieve a high level of correlation (around 0.99) with the true values.…”
Section: Results (mentioning)
confidence: 87%
“…While this approach often leads to computational efficiency, there are potential concerns with its accuracy. [4] have discussed this issue and demonstrated examples with neuroimaging data where the mean field variational Bayes approximation can severely underestimate posterior variability and produce biased estimates of model hyper-parameters. [5] study the performance of VB in a simulation study based on fMRI and raise concerns that while VB reduces computational cost it can suffer from lower specificity and smaller coverage of the credible intervals.…”
Section: Introduction (mentioning)
confidence: 99%
“…Our goal is to infer the posterior distributions over the factors U and V depending on the data X. To avoid using the heavy machinery of MCMC (Nathoo et al, 2013) to infer the intractable posterior of the latent variables in our model, we use the framework of variational inference (Hoffman et al, 2013). In particular, we extend the version of the variational EM algorithm (Beal and Ghahramani, 2003) proposed by Dikmen and Févotte (2012) in the context of the standard Gamma-Poisson factor model to our sparse and zero-inflated GaP model.…”
Section: Model Inference Using a Variational EM Algorithm (mentioning)
confidence: 99%
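For the standard Gamma-Poisson factor model mentioned in this statement, the variational (CAVI / variational EM) updates admit a compact closed form. A minimal sketch follows; the dimensions, Gamma hyper-parameters (a, b), and synthetic data are assumptions for illustration, and this is the plain GaP model, not the sparse zero-inflated extension the authors develop.

```python
import numpy as np
from scipy.special import digamma

rng = np.random.default_rng(0)
n, m, K = 30, 20, 3
# Synthetic counts from a ground-truth Gamma-Poisson model:
# X_uv ~ Poisson(sum_k U_uk V_vk), U and V Gamma-distributed.
U_true = rng.gamma(1.0, 1.0, (n, K))
V_true = rng.gamma(1.0, 1.0, (m, K))
X = rng.poisson(U_true @ V_true.T)

a = b = 1.0  # Gamma prior shape and rate (assumed values)
# Variational Gamma parameters (shape, rate) for U and V.
Ua, Ub = np.full((n, K), a), np.full((n, K), b)
Va, Vb = np.full((m, K), a), np.full((m, K), b)

for _ in range(100):
    # Multinomial responsibilities for the auxiliary latent counts:
    # phi_uvk proportional to exp(E[log U_uk] + E[log V_vk]).
    logU = digamma(Ua) - np.log(Ub)
    logV = digamma(Va) - np.log(Vb)
    phi = np.exp(logU[:, None, :] + logV[None, :, :])
    phi /= phi.sum(axis=2, keepdims=True)
    # Closed-form CAVI updates for the Gamma factors.
    Ua = a + (X[:, :, None] * phi).sum(axis=1)
    Ub = b + (Va / Vb).sum(axis=0)   # b + sum_v E[V_vk]
    Va = a + (X[:, :, None] * phi).sum(axis=0)
    Vb = b + (Ua / Ub).sum(axis=0)   # b + sum_u E[U_uk]

EU, EV = Ua / Ub, Va / Vb
recon = EU @ EV.T  # posterior-mean reconstruction of the rate matrix
print("mean abs reconstruction error:", np.mean(np.abs(recon - X)))
```

Each update conditions only on the current variational moments, which is what makes this approach far cheaper than MCMC for such models, at the cost of the approximation issues discussed in the statements above.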