2014
DOI: 10.1088/0266-5611/30/11/114020

Sequential Monte Carlo samplers for semi-linear inverse problems and application to magnetoencephalography

Abstract: We discuss the use of a recent class of sequential Monte Carlo methods for solving inverse problems characterized by a semi-linear structure, i.e. where the data depend linearly on a subset of variables and non-linearly on the remaining ones. In this type of problem, under proper Gaussian assumptions one can marginalize the linear variables. This means that the Monte Carlo procedure needs only to be applied to the non-linear variables, while the linear ones can be treated analytically; as a result, the Monte …
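The marginalization described in the abstract can be sketched as follows. This is a toy illustration, not the paper's implementation: the forward operator `forward`, the priors, and all parameter values are assumptions made here for demonstration. For a semi-linear model y = A(θ)x + ε with Gaussian x and ε, the linear variables x integrate out in closed form, leaving a marginal likelihood over the non-linear variable θ alone.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_marginal_likelihood(y, A, sigma_x=1.0, sigma_e=0.1):
    """Log of p(y | theta) after marginalizing the linear variables x.

    Semi-linear model: y = A(theta) @ x + noise, with priors
    x ~ N(0, sigma_x^2 I) and noise ~ N(0, sigma_e^2 I).
    Marginally, y | theta ~ N(0, C) with C = sigma_x^2 A A^T + sigma_e^2 I.
    """
    m = len(y)
    C = sigma_x**2 * (A @ A.T) + sigma_e**2 * np.eye(m)
    _, logdet = np.linalg.slogdet(C)
    quad = y @ np.linalg.solve(C, y)
    return -0.5 * (m * np.log(2 * np.pi) + logdet + quad)

def forward(theta, m=20, n=3):
    """Hypothetical forward operator depending non-linearly on theta."""
    grid = np.linspace(0.0, 1.0, m)[:, None] * np.arange(1, n + 1)
    return np.sin(theta * grid)

# Simulate data from the model, then scan the marginal likelihood in theta.
theta_true = 2.0
A_true = forward(theta_true)
x_true = rng.normal(size=A_true.shape[1])
y = A_true @ x_true + 0.1 * rng.normal(size=A_true.shape[0])

thetas = np.linspace(0.5, 4.0, 50)
lls = [log_marginal_likelihood(y, forward(t)) for t in thetas]
best = thetas[int(np.argmax(lls))]
```

Because `log_marginal_likelihood` depends only on θ, a Monte Carlo sampler can explore the non-linear variable alone, which is the dimension reduction the abstract refers to.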


Cited by 31 publications (51 citation statements)
References 35 publications
“…In [19] the authors assume a uniform prior for the number of dipoles, and use reversible-jump Markov chain Monte Carlo (MCMC) to approximate the posterior distribution. In [39,37], the authors assume a Poisson prior for the number of dipoles, and use sequential Monte Carlo (SMC) samplers [11] to approximate the posterior distribution; as SMC samplers employ multiple Markov chains running in parallel, they are less likely to remain trapped in local maxima.…”
Section: Bayesian Monte Carlo Methods for Static Dipoles
confidence: 99%
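The point about parallel chains escaping local maxima can be illustrated with a minimal tempered SMC sampler. This is a generic sketch on a toy bimodal target, not the samplers of [39,37]: the target, the prior, the tempering schedule, and all numerical settings are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    # Toy bimodal "posterior": mixture of Gaussians at -3 and +3.
    return np.logaddexp(-0.5 * (x - 3.0) ** 2, -0.5 * (x + 3.0) ** 2)

def smc_sampler(n_particles=500, n_steps=20, step=0.5):
    """Tempered SMC: bridge from a wide N(0, 5^2) prior to the target
    through intermediate densities pi_b ∝ prior^(1-b) * target^b."""
    log_prior = lambda x: -0.5 * (x / 5.0) ** 2
    particles = rng.normal(0.0, 5.0, size=n_particles)
    betas = np.linspace(0.0, 1.0, n_steps + 1)
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # Importance weights for the tempering increment.
        logw = (b - b_prev) * (log_target(particles) - log_prior(particles))
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # Multinomial resampling.
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]
        # One Metropolis move per particle at temperature b:
        # effectively many Markov chains advancing in parallel.
        def log_pi(x):
            return (1.0 - b) * log_prior(x) + b * log_target(x)
        prop = particles + step * rng.normal(size=n_particles)
        accept = np.log(rng.uniform(size=n_particles)) < log_pi(prop) - log_pi(particles)
        particles = np.where(accept, prop, particles)
    return particles

samples = smc_sampler()
```

Because the particle population starts spread over the prior and is only gradually cooled toward the target, both modes tend to stay populated, whereas a single Metropolis chain started near one mode can remain stuck there.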
“…The next sections explain how to sample from the conditional distributions of the unknown parameters and hyperparameters associated with the posterior of interest (8). The resulting algorithm is also summarized in Algorithm 1.…”
Section: Markov Chain Monte Carlo Methods
confidence: 99%
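Sampling each unknown from its conditional distribution, as this citing paper describes, is the Gibbs-sampling pattern. A minimal generic sketch on a standard bivariate normal with correlation ρ (not the citing paper's model; the target and all settings are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

def gibbs_bivariate_normal(rho=0.8, n_iter=5000):
    """Alternately draw each coordinate from its full conditional under a
    standard bivariate normal with correlation rho:
        x | y ~ N(rho * y, 1 - rho^2),  and symmetrically for y | x.
    """
    s = np.sqrt(1.0 - rho**2)
    x, y = 0.0, 0.0
    out = np.empty((n_iter, 2))
    for i in range(n_iter):
        x = rho * y + s * rng.normal()  # sample x from p(x | y)
        y = rho * x + s * rng.normal()  # sample y from p(y | x)
        out[i] = (x, y)
    return out

draws = gibbs_bivariate_normal()
```

Each sweep updates every parameter given the current values of the others, so closed-form conditionals are enough even when the joint posterior has no simple form.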
“…Because of the complexity of this posterior distribution, the Bayesian estimators of {θ, Φ} cannot be computed with simple closed-form expressions. Section IV studies an MCMC method that can be used to sample the joint posterior distribution (8) and build Bayesian estimators of the unknown model parameters using the generated samples.…”
Section: Posterior Distribution
confidence: 99%
“…These can be classified into two groups: (i) the dipole-fitting models that represent the brain activity as a small number of dipoles with unknown positions; and (ii) the distributed-source models that represent the brain activity as a large number of dipoles in fixed positions. Dipole-fitting models (Sommariva and Sorrentino, 2014; da Silva and Van Rotterdam, 1998) try to estimate the amplitudes, orientations and positions of a few dipoles that explain the measured data. Unfortunately, the corresponding estimators are very sensitive to the initial guess of the number of dipoles and their initial locations (Grech et al, 2008).…”
Section: Introduction
confidence: 99%