2015 IEEE 6th International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP)
DOI: 10.1109/camsap.2015.7383786
EEG source localization based on a structured sparsity prior and a partially collapsed Gibbs sampler

Abstract: In this paper, we propose a hierarchical Bayesian model approximating the ℓ2,0 mixed-norm regularization by a multivariate Bernoulli-Laplace prior to solve the EEG inverse problem by promoting spatially structured sparsity. The posterior distribution of this model is too complex to derive closed-form expressions of the standard Bayesian estimators. An MCMC method is proposed to sample this posterior and estimate the model parameters from the generated samples. The algorithm is based on a partially collapsed Gibbs…
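For readers unfamiliar with this family of samplers, the sketch below illustrates the general pattern of jointly sampling a coefficient's activity indicator and its amplitude inside a Gibbs loop. It is a minimal illustration under simplifying assumptions, not the paper's method: the slab here is Gaussian so that the collapsed indicator probability has a closed form, whereas the paper uses a multivariate Bernoulli-Laplace prior, and all names (H, y, w, s2x, s2n) are hypothetical.

```python
import numpy as np

def bernoulli_gaussian_pcgs(y, H, n_iter=500, w=0.05, s2x=1.0, s2n=0.1, rng=None):
    """Toy collapsed Gibbs sampler for y = H x + e with a Bernoulli-Gaussian
    (spike-and-slab) prior on x.  Each indicator z_i and amplitude x_i are
    drawn jointly: z_i from its collapsed conditional (x_i integrated out),
    then x_i given z_i.  This mirrors only the *structure* of a partially
    collapsed Gibbs sampler, not the paper's exact conditionals."""
    rng = np.random.default_rng() if rng is None else rng
    n, p = H.shape
    x = np.zeros(p)
    samples = np.zeros((n_iter, p))
    hTh = np.sum(H * H, axis=0)              # ||h_i||^2 for each column
    for it in range(n_iter):
        for i in range(p):
            # residual with the i-th contribution removed
            r = y - H @ x + H[:, i] * x[i]
            # posterior moments of x_i given z_i = 1 (Gaussian slab)
            v = 1.0 / (hTh[i] / s2n + 1.0 / s2x)
            m = v * (H[:, i] @ r) / s2n
            # collapsed posterior odds of z_i = 1 versus z_i = 0
            log_odds = (np.log(w) - np.log(1 - w)
                        + 0.5 * np.log(v / s2x)
                        + 0.5 * m * m / v)
            p1 = 1.0 / (1.0 + np.exp(-log_odds))
            if rng.random() < p1:            # z_i = 1: draw the amplitude
                x[i] = m + np.sqrt(v) * rng.standard_normal()
            else:                            # z_i = 0: spike at zero
                x[i] = 0.0
        samples[it] = x
    return samples
```

A posterior-mean estimate of x would then be `samples[burn_in:].mean(axis=0)` after discarding a burn-in period.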
Cited by 3 publications (4 citation statements) | References 9 publications

Citation statements
“…We have observed that the standard PCGS developed in [8] sometimes gets stuck around local maxima of the posterior (8). This section studies two kinds of moves that allow the sampler to escape from these local maxima and thus ensure a faster convergence of the PCGS.…”
Section: A. Multiple Dipole Shift Proposals (mentioning)
confidence: 95%
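The cited extension adds moves that relocate active dipoles so the sampler can escape local maxima. The fragment below sketches one plausible form of such a move, a Metropolis-Hastings swap of an active index for an inactive one; the proposal design and the `log_posterior` callable are assumptions for illustration, not the citing paper's actual implementation.

```python
import numpy as np

def dipole_shift_move(z, x, log_posterior, rng):
    """One Metropolis-Hastings 'shift' move: propose relocating an active
    dipole (a nonzero indicator in z) to a currently inactive position,
    carrying its amplitude along, and accept with the usual MH ratio.
    `log_posterior(z, x)` is a user-supplied (hypothetical) function;
    x holds per-dipole amplitudes, kept 1-D here for simplicity."""
    active = np.flatnonzero(z)
    inactive = np.flatnonzero(z == 0)
    if active.size == 0 or inactive.size == 0:
        return z, x
    i = rng.choice(active)                  # dipole to move
    j = rng.choice(inactive)                # proposed new location
    z_prop, x_prop = z.copy(), x.copy()
    z_prop[i], z_prop[j] = 0, 1
    x_prop[j], x_prop[i] = x_prop[i], 0.0   # carry the amplitude over
    # The swap keeps the numbers of active and inactive sites unchanged,
    # so the proposal is symmetric and the acceptance ratio reduces to
    # the posterior ratio.
    log_alpha = log_posterior(z_prop, x_prop) - log_posterior(z, x)
    if np.log(rng.random()) < log_alpha:
        return z_prop, x_prop
    return z, x
```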
“…To estimate the model parameters, we proposed in [8] to draw samples from (8) using a PCGS, sampling z_i and x_i jointly. The corresponding conditional distributions are summarized in Table I, where GIG, IG and Be are the generalized inverse Gaussian, inverse gamma and beta distributions (see also [8]). Note that X_{-i} denotes the matrix X whose i-th row has been set to zero and that the following notations have been used…”
Section: F. Partially Collapsed Gibbs Sampler (mentioning)
confidence: 99%
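The conditional families named in this statement (generalized inverse Gaussian, inverse gamma, beta) are all available in SciPy, so the draws needed by such a sampler can be assembled from standard routines. The snippet below only shows how to draw from each family; the parameter values are placeholders, since the actual expressions come from Table I of [8] and are not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Placeholder parameters -- in the actual sampler these would be the
# expressions given in Table I of [8] for each conditional distribution.
p_gig, b_gig, scale_gig = 0.5, 2.0, 1.0     # GIG(p, b) with a scale
a_ig, scale_ig = 3.0, 1.5                   # inverse gamma IG(a, b)
a_be, b_be = 1.0 + 4, 1.0 + 10              # beta Be(a, b)

# Generalized inverse Gaussian draw (e.g. a Laplace scale-mixture variable)
tau = stats.geninvgauss.rvs(p_gig, b_gig, scale=scale_gig, random_state=rng)

# Inverse gamma draw (e.g. a noise or prior variance)
sigma2 = stats.invgamma.rvs(a_ig, scale=scale_ig, random_state=rng)

# Beta draw (e.g. the Bernoulli activation probability)
omega = stats.beta.rvs(a_be, b_be, random_state=rng)

print(tau, sigma2, omega)
```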
“…The use of this effect has demonstrated its ability to provide flexible and adaptive regularizations for the resolution of ill-posed inverse problems. Generally, the use of mixture priors combining a Bernoulli distribution with a continuous distribution, such as the multivariate Bernoulli-Laplace [24,25], Bernoulli-Gaussian [26], Bernoulli-exponential [20] and Bernoulli-generalized-Gaussian-Laplace [27] priors, allows the desired image sparsity levels to be promoted directly in the original space. These Bayesian Bernoulli-based models make it possible to estimate the regularization parameters and hyperparameters directly from the observed data, which is not possible with other variational methods [6,13,17,28].…”
Section: Introduction (mentioning)
confidence: 99%
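To illustrate how such Bernoulli-continuous mixture priors promote structured sparsity directly in the original space, the sketch below draws a source matrix from a row-wise spike-and-slab prior: each row is zero with probability 1 - w and otherwise drawn from a heavy-tailed slab built as a Gaussian scale mixture. The exponential mixing used here is one common construction and an assumption; it is not necessarily the exact multivariate Laplace of [24,25].

```python
import numpy as np

def sample_bernoulli_laplace_rows(n_rows, n_cols, w=0.1, lam=1.0, rng=None):
    """Draw X with row-structured sparsity: each row is all-zero with
    probability 1 - w (spike); otherwise the row is drawn from a
    heavy-tailed slab obtained as a Gaussian scale mixture (variance drawn
    from an exponential distribution), one common way to build a
    multivariate Laplace-type distribution."""
    rng = np.random.default_rng() if rng is None else rng
    X = np.zeros((n_rows, n_cols))
    z = rng.random(n_rows) < w                       # Bernoulli indicators
    for i in np.flatnonzero(z):
        tau = rng.exponential(scale=1.0 / lam**2)    # mixing variance
        X[i] = rng.normal(0.0, np.sqrt(tau), size=n_cols)
    return X, z

X, z = sample_bernoulli_laplace_rows(n_rows=500, n_cols=20, w=0.05)
print(f"{z.sum()} active rows out of {len(z)}")
```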