2011
DOI: 10.1198/jasa.2011.tm10465
Bayesian Inference for General Gaussian Graphical Models With Application to Multivariate Lattice Data

Abstract: We introduce efficient Markov chain Monte Carlo methods for inference and model determination in multivariate and matrix-variate Gaussian graphical models. Our framework is based on the G-Wishart prior for the precision matrix associated with graphs that can be decomposable or non-decomposable. We extend our sampling algorithms to a novel class of conditionally autoregressive models for sparse estimation in multivariate lattice data, with a special emphasis on the analysis of spatial data. These models embed a…
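For readers unfamiliar with the prior named in the abstract: the G-Wishart distribution W_G(δ, D) places a Wishart-type density on precision matrices constrained by the graph G. The display below is a sketch using the common (δ, D) parametrization (degrees of freedom δ, location matrix D); the notation is an assumption for illustration, not a quotation from the paper.

```latex
% G-Wishart density over the cone P_G of positive definite matrices K
% with K_{ij} = 0 whenever (i,j) is not an edge of G:
p(K \mid G) \;=\; \frac{1}{I_G(\delta, D)}\,
  |K|^{(\delta - 2)/2}\,
  \exp\!\left\{ -\tfrac{1}{2}\,\operatorname{tr}(K D) \right\},
  \qquad K \in P_G .
```

The normalizing constant I_G(δ, D) has a closed form only when G is decomposable, which is what makes sampling and model determination for general graphs nontrivial.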

Cited by 102 publications (138 citation statements) | References 61 publications (153 reference statements)
“…Bayesian approaches to graphical models which enforce exact zeros in the precision matrix have been proposed by Roverato (2002), Jones et al (2005), and Dobra et al (2011). In Bayesian analysis of multivariate normal data, the standard conjugate prior for the precision matrix Ω is the Wishart distribution.…”
Section: Introduction (mentioning)
confidence: 99%
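The conjugacy noted in this excerpt can be made concrete with a short sketch. The snippet below (hypothetical data and prior parameters, zero-mean Gaussian likelihood assumed) performs the standard unrestricted Wishart update for the precision matrix with scipy; it illustrates only the full-Wishart case, without the graph constraints that the G-Wishart adds.

```python
import numpy as np
from scipy.stats import wishart

# Hypothetical zero-mean Gaussian data in p dimensions.
rng = np.random.default_rng(0)
p, n = 4, 200
true_cov = np.diag([1.0, 2.0, 0.5, 1.5])
X = rng.multivariate_normal(np.zeros(p), true_cov, size=n)

# Conjugate Wishart prior on the precision matrix: Omega ~ W_p(nu0, V0)
# (scipy's parametrization: df = nu0, scale = V0, so E[Omega] = nu0 * V0).
nu0 = p + 2
V0 = np.eye(p) / nu0

# With x_i ~ N(0, Omega^{-1}), the posterior is again Wishart:
#   Omega | X ~ W_p(nu0 + n, (V0^{-1} + S)^{-1}),  S = sum_i x_i x_i^T.
S = X.T @ X
nu_n = nu0 + n
V_n = np.linalg.inv(np.linalg.inv(V0) + S)

# Draw posterior samples of the precision matrix.
omega_draws = wishart.rvs(df=nu_n, scale=V_n, size=1000)
print("Posterior mean precision:\n", omega_draws.mean(axis=0))
```

Enforcing exact zeros, as in the Bayesian approaches cited above, replaces this closed-form update with sampling under the constraints imposed by the graph G.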
“…Jones et al (2005) and Lenkoski and Dobra (2011) simplify the problem by integrating out the precision matrix. Dobra et al (2011) propose a reversible jump algorithm to sample over the joint space of graphs and precision matrices that does not scale well to large graphs. Wang and Li (2012) propose a sampler which does not require proposal tuning and circumvents computation of the prior normalizing constant through the use of the exchange algorithm, improving both the accuracy and efficiency of computation.…”
Section: Introduction (mentioning)
confidence: 99%
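The "integrating out the precision matrix" route attributed to Jones et al (2005) and Lenkoski and Dobra (2011) rests on a closed-form marginal likelihood expressed through the G-Wishart normalizing constant. The display below is a sketch under the same (δ, D) parametrization as above and assumes zero-mean observations x_1, …, x_n in p dimensions.

```latex
% Marginal likelihood of the data given a graph G, after integrating
% out K ~ W_G(\delta, D) against the Gaussian likelihood:
p(x_{1:n} \mid G)
  \;=\; (2\pi)^{-np/2}\,
        \frac{I_G(\delta + n,\; D + S)}{I_G(\delta, D)},
  \qquad S = \sum_{i=1}^{n} x_i x_i^{\top} .
```

Because I_G is intractable for general non-decomposable graphs, it must be approximated, and that is precisely the cost that the exchange-type sampler of Wang and Li (2012) avoids.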
“…The scope of the modeling framework could be extended to multivariate spatial data, which may or may not have the same support at each site. It would be equivalent to applying the general Gaussian graphical model of Dobra et al [22] to parameters instead of data in a hierarchical model. Such a framework would then enable jointly modeling of the count data and binary data in our snail abundance application.…”
Section: Parameters (mentioning)
confidence: 99%
“…Berrocal et al [21] used it with binary and gamma margins, respectively, to construct random fields for precipitation occurrence and precipitation amount. For lattice or areal data, which is the context of our application, the TGRF or TGMRF is related to the general Gaussian graphical model of Dobra et al [22], except that they make inferences about the graph structure. We apply the TGMRF to model parameters that are continuous, to avoid the complexities associated with discrete data [23]; this is in contrast to the existing works where it is applied to the data directly.…”
Section: Introduction (mentioning)
confidence: 99%
“…Given the observations { x^(j) : ξ_j = l } that currently belong to cluster l, we update K_l based on the G-Wishart distribution (4.3) by employing the Metropolis-Hastings algorithm of Dobra et al [17]. Given the updated K_l, we update the graph G_l given K_l using the reversible jump Markov chain method of Dobra et al [17] instead of using equation (4.2). Once an edge is changed in G_l, the corresponding element of K_l must also be updated, as it either becomes constrained to zero (if the edge is deleted) or becomes free (if the edge is added).…”
Section: Posterior Inference for Mixtures of Gaussian Graphical Models (mentioning)
confidence: 99%
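The bookkeeping described in the last sentence of this excerpt, where a precision entry becomes constrained to zero when its edge is deleted or free when the edge is added, can be sketched as follows. The function name, the Gaussian proposal for a newly freed entry, and the toy example are illustrative assumptions; the reversible jump accept/reject step of Dobra et al is not shown.

```python
import numpy as np

def toggle_edge(G, K, i, j, rng, proposal_sd=0.1):
    """Flip edge (i, j) in the adjacency matrix G and keep the precision
    matrix K consistent with the new graph: the (i, j) entry is set to
    zero when the edge is removed and given a proposed free value when
    the edge is added. Illustrative bookkeeping only."""
    G = G.copy()
    K = K.copy()
    if G[i, j]:                      # edge currently present -> delete it
        G[i, j] = G[j, i] = 0
        K[i, j] = K[j, i] = 0.0      # entry becomes constrained to zero
    else:                            # edge currently absent -> add it
        G[i, j] = G[j, i] = 1
        K[i, j] = K[j, i] = rng.normal(0.0, proposal_sd)  # entry becomes free
    return G, K

# Hypothetical 4-node example.
rng = np.random.default_rng(1)
G = np.eye(4, dtype=int)             # start with no off-diagonal edges
K = np.eye(4)                        # identity precision is consistent with G
G, K = toggle_edge(G, K, 0, 2, rng)  # propose adding edge (0, 2)
print(G, K, sep="\n")
```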