Complex Medical Engineering
DOI: 10.1007/978-4-431-30962-8_33

MEG Source Localization under Multiple Constraints: An Extended Bayesian Framework

Abstract: To use Electroencephalography (EEG) and Magnetoencephalography (MEG) as functional brain 3D imaging techniques, identifiable distributed source models are required. The reconstruction of EEG/MEG sources rests on inverting these models and is ill-posed because the solution does not depend continuously on the data and there is no unique solution in the absence of prior information or constraints. We have described a general framework that can account for several priors in a common inverse solution. An empirical…
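For orientation, the kind of distributed model and multiple-prior (covariance component) formulation this framework builds on can be sketched as follows; the notation is generic to this literature rather than transcribed from the paper itself:

    \[
    \mathbf{y} = \mathbf{L}\,\mathbf{j} + \boldsymbol{\varepsilon}, \qquad
    \boldsymbol{\varepsilon} \sim \mathcal{N}(\mathbf{0},\, \mathbf{C}_{\varepsilon}), \qquad
    \mathbf{j} \sim \mathcal{N}\!\Big(\mathbf{0},\, \sum_{i} \lambda_i \mathbf{Q}_i\Big),
    \]

where y are the sensor data, L is the lead-field (gain) matrix, j the distributed source amplitudes, each covariance component Q_i encodes one prior (e.g., smoothness, depth weighting, or an fMRI-derived constraint), and the hyperparameters λ_i weight the priors against one another and against the data.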

Cited by 50 publications (85 citation statements)
References: 29 publications
“…By allowing the prior precisions to be optimised as free parameters, we are effectively optimising the balance between data and priors. This is an important aspect of hierarchical Bayesian models, which we have exploited in the context of parametric empirical Bayes models previously (e.g., Mattout et al, 2006; Phillips et al, 2002). It was also used by Sato et al (2004) to implement automatic relevance determination (ARD) to 'switch off' redundant sources in an imaging context.…”
Section: A Note On Priors (mentioning, confidence: 99%)
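To make "optimising the balance between data and priors" concrete, here is a minimal, hypothetical numpy sketch (not the authors' implementation; dimensions and values are illustrative) in which a single prior precision is treated as a free parameter and chosen by maximising the log model evidence of a Gaussian linear model:

    import numpy as np

    def log_evidence(y, L, alpha, sigma2):
        """Log marginal likelihood of y ~ N(0, sigma2*I + L L^T / alpha),
        i.e. a zero-mean Gaussian source prior with precision alpha."""
        n = len(y)
        C = sigma2 * np.eye(n) + (L @ L.T) / alpha   # implied data covariance
        _, logdet = np.linalg.slogdet(C)
        return -0.5 * (n * np.log(2 * np.pi) + logdet + y @ np.linalg.solve(C, y))

    # Illustrative data: 32 sensors, 500 candidate sources (arbitrary numbers).
    rng = np.random.default_rng(0)
    L = rng.standard_normal((32, 500))
    y = L @ (0.05 * rng.standard_normal(500)) + 0.1 * rng.standard_normal(32)

    # Let the data decide how strong the prior should be: scan the prior
    # precision and keep the value with the highest evidence.
    alphas = np.logspace(-2, 4, 61)
    best_alpha = alphas[np.argmax([log_evidence(y, L, a, sigma2=0.01) for a in alphas])]

In practice such hyperparameters are usually optimised with restricted maximum likelihood or expectation-maximisation rather than a grid scan, but the objective being maximised is the same.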
“…The inversion of this model amounts to a nonlinear optimization problem, because the forward model is nonlinear in dipole location (Mosher et al, 1992). Recently, the source reconstruction problem has been addressed by placing many dipoles in brain space, and using constraints on the solution to make it unique; for example (Baillet and Garnero, 1997; Mattout et al, 2006; Phillips et al, 2005). This approach is attractive, because it produces images of brain activity comparable to other imaging modalities and it eschews subjective constraints on the inversion.…”
Section: Introduction (mentioning, confidence: 99%)
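As a rough illustration of the distributed (imaging) approach described above, the following hypothetical sketch computes a regularised minimum-norm inverse once many dipoles have been fixed in source space; the lead field and regularisation value are placeholders:

    import numpy as np

    def minimum_norm_inverse(L, y, lam, R=None):
        """Distributed linear inverse j = R L^T (L R L^T + lam*I)^{-1} y.

        L   : (n_sensors, n_sources) lead-field matrix (fixed dipole grid)
        y   : (n_sensors,) measured M/EEG data at one time point
        lam : regularisation parameter trading data fit against the prior
        R   : (n_sources, n_sources) source covariance; this is the constraint
              that renders the ill-posed problem unique (identity if None)
        """
        if R is None:
            R = np.eye(L.shape[1])
        gram = L @ R @ L.T + lam * np.eye(L.shape[0])
        return R @ L.T @ np.linalg.solve(gram, y)

    # Toy example with arbitrary dimensions.
    rng = np.random.default_rng(1)
    L = rng.standard_normal((64, 2000))
    y = rng.standard_normal(64)
    j_hat = minimum_norm_inverse(L, y, lam=1e-2)

Because the forward model is linear once dipole locations are fixed, the estimate reduces to a single matrix operation, in contrast to the nonlinear optimisation required when dipole positions themselves are free.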
“…Depending on different EEG source models, the fMRI map can be used to constrain the locations of multiple current dipoles, namely the fMRI-constrained dipole fitting (Ahlfors et al, 1999; Korvenoja et al, 1999; Fujimaki et al, 2002; Vanni et al, 2004), or to constrain the distributed source distribution over the folded cortical surface or in the 3-D brain volume, namely the fMRI-constrained current density imaging (George et al, 1995; Liu et al, 1998; Dale et al, 2000; Wagner et al, 2000; Babiloni et al, 2005; Ahlfors and Simpson, 2004; Sato et al, 2004; Phillips et al, 2005; Liu et al, 2006b; Mattout et al, 2006).…”
Section: Introduction (mentioning, confidence: 99%)
“…Existing methods for the fMRI-constrained current density imaging have been implemented under different frameworks such as Wiener estimation (Dale and Sereno, 1993; Liu et al, 1998; Dale et al, 2000), weighted minimum norm (Wagner et al, 2000; Babiloni et al, 2005; Ahlfors and Simpson, 2004), Bayesian estimation (Sato et al, 2004; Phillips et al, 2005; Mattout et al, 2006) and Twomey regularization (Liu et al, 2006b). …”
Section: Introduction (mentioning, confidence: 99%)
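A hypothetical sketch of the weighted-minimum-norm flavour of fMRI-constrained imaging mentioned above; the weighting values and threshold are assumptions for illustration, not figures from the cited papers:

    import numpy as np

    def fmri_weighted_inverse(L, y, fmri_active, lam, w_active=1.0, w_inactive=0.1):
        """Weighted minimum-norm estimate with an fMRI-informed source prior.

        fmri_active : boolean (n_sources,) array marking sources inside
                      fMRI-activated regions; these get larger prior variance.
        w_active, w_inactive : illustrative prior variances (assumed values).
        """
        r = np.where(fmri_active, w_active, w_inactive)   # diagonal of R
        RLt = (L * r).T                                    # R @ L.T for diagonal R
        gram = L @ RLt + lam * np.eye(L.shape[0])
        return RLt @ np.linalg.solve(gram, y)

    # Toy example: 64 sensors, 2000 sources, 5% marked active by the fMRI map.
    rng = np.random.default_rng(2)
    L = rng.standard_normal((64, 2000))
    y = rng.standard_normal(64)
    active = rng.random(2000) < 0.05
    j_hat = fmri_weighted_inverse(L, y, active, lam=1e-2)

The Wiener, Bayesian and Twomey formulations cited above differ in how the source covariance and regularisation are specified, but broadly speaking each uses the fMRI map to inform the spatial prior on the sources.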
“…In this scenario, candidate priors are distinguished by a set of flexible hyperparameters γ that must be estimated via a variety of data-driven iterative procedures. Examples include hierarchical covariance component models [12], [24], [31], automatic relevance determination (ARD) [22], [26], [33], [34], [41], and several related variational Bayesian methods [13], [14], [27], [29], [36], [38].…”
Section: Introduction (mentioning, confidence: 99%)
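For readers unfamiliar with this style of hyperparameter estimation, the following is a textbook-style sparse Bayesian learning (ARD) fixed-point loop for a generic linear model; it is a simplified sketch under standard Gaussian assumptions, not the specific algorithm of any of the cited papers:

    import numpy as np

    def ard_regression(Phi, y, n_iter=100, tol=1e-6):
        """Automatic relevance determination for y = Phi @ w + noise.

        Each weight w_i carries its own prior precision alpha_i (playing the
        role of the hyperparameters referred to above); components whose
        alpha_i grows very large are effectively 'switched off'.
        """
        n, m = Phi.shape
        alpha = np.ones(m)          # per-component prior precisions
        sigma2 = 0.1 * np.var(y)    # heuristic initial noise variance
        for _ in range(n_iter):
            # Posterior over weights given the current hyperparameters.
            Sigma = np.linalg.inv(Phi.T @ Phi / sigma2 + np.diag(alpha))
            mu = Sigma @ Phi.T @ y / sigma2
            # Classical fixed-point updates for the hyperparameters.
            fit = 1.0 - alpha * np.diag(Sigma)   # how well each weight is determined
            alpha_new = fit / (mu ** 2 + 1e-12)
            sigma2 = np.sum((y - Phi @ mu) ** 2) / max(n - fit.sum(), 1e-12)
            if np.max(np.abs(alpha_new - alpha)) < tol:
                alpha = alpha_new
                break
            alpha = alpha_new
        return mu, alpha, sigma2

    # Toy usage: only the first three columns of Phi actually generate y.
    rng = np.random.default_rng(3)
    Phi = rng.standard_normal((50, 20))
    y = Phi[:, :3] @ np.array([1.0, -2.0, 0.5]) + 0.05 * rng.standard_normal(50)
    mu, alpha, sigma2 = ard_regression(Phi, y)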