2021
DOI: 10.1214/20-ba1242

Multilevel Linear Models, Gibbs Samplers and Multigrid Decompositions (with Discussion)

Abstract: We study the convergence properties of the Gibbs Sampler in the context of posterior distributions arising from Bayesian analysis of conditionally Gaussian hierarchical models. We develop a multigrid approach to derive analytic expressions for the convergence rates of the algorithm for various widely used model structures, including nested and crossed random effects. Our results apply to multilevel models with an arbitrary number of layers in the hierarchy, while most previous work was limited to the two-level…
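
To make the setting concrete, below is a minimal sketch of a vanilla Gibbs sampler for a balanced two-level conditionally Gaussian model with known variances and a flat prior on the global mean. The model choice, the priors and all variable names are illustrative assumptions for this sketch, not the exact setup analysed in the paper.

import numpy as np

def gibbs_two_level(y, tau2, sigma2, n_iter=5000, rng=None):
    # Vanilla Gibbs sampler for the balanced two-level model
    #   y[j, i] = theta[j] + noise,  theta[j] ~ N(mu, tau2),
    # with known variances tau2, sigma2 and a flat prior on mu
    # (illustrative assumptions, not the paper's exact setup).
    rng = np.random.default_rng() if rng is None else rng
    J, n = y.shape                      # J groups, n observations per group
    ybar = y.mean(axis=1)               # group sample means
    mu = 0.0
    mu_draws = np.empty(n_iter)
    for t in range(n_iter):
        # theta | mu, y: precision-weighted combination of prior and data
        prec = 1.0 / tau2 + n / sigma2
        theta = rng.normal((mu / tau2 + n * ybar / sigma2) / prec,
                           np.sqrt(1.0 / prec))
        # mu | theta: with a flat prior, mu | theta ~ N(mean(theta), tau2 / J)
        mu = rng.normal(theta.mean(), np.sqrt(tau2 / J))
        mu_draws[t] = mu
    return mu_draws

The autocorrelation of mu_draws gives a Monte Carlo view of the convergence rate that the paper characterises analytically via its multigrid decomposition.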

Cited by 11 publications (7 citation statements)
References 40 publications
“…The first of these is implementing constraints on the and parameters to improve model convergence. While this may seem counterintuitive, such identifiability constraints have been shown to dramatically improve convergence of Gibbs samplers for Gaussian mixture models (Zanella and Roberts 2021). The other changes included updating and with fixed calculations and editing the underlying code to improve computational efficiency.…”
Section: Methods (mentioning)
confidence: 99%
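
As a minimal sketch of the kind of identifiability constraint referred to above, assuming a sum-to-zero constraint on a vector of effects with a hypothetical helper name, rather than the specific parametrization used in the citing work:

import numpy as np

def recenter(mu, alpha):
    # Impose a sum-to-zero identifiability constraint on the effects `alpha`
    # by moving their common mean into the global intercept `mu`; the fitted
    # values mu + alpha are unchanged. (Hypothetical helper, illustrative only.)
    shift = alpha.mean()
    return mu + shift, alpha - shift

# Example: applied after each Gibbs update of the effects
mu, alpha = recenter(1.3, np.array([0.5, -0.1, 0.2]))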
“…Posterior approximation via gradient-based MCMC methods encounters issues when sampling from non-differentiable distributions, such as the double exponential (Brooks et al., 2011). Additionally, it is straightforward to reparametrize the posterior distribution to improve the rates of convergence of either Gibbs or Hamiltonian Monte Carlo based MCMC samplers in the context of conditionally normal multilevel models (Bürkner, 2017; Zanella and Roberts, 2021).…”
Section: Derivation of the R2-D2-M2 Prior (mentioning)
confidence: 99%
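
A standard example of such a reparametrization, assumed here purely for illustration since the cited works treat it in more generality, is the switch between the centered and non-centered parametrizations of a conditionally normal multilevel model:

Centered:      \theta_j \mid \mu, \tau \sim \mathcal{N}(\mu, \tau^2), \qquad y_{ij} \mid \theta_j \sim \mathcal{N}(\theta_j, \sigma^2).
Non-centered:  \tilde{\theta}_j \sim \mathcal{N}(0, 1), \qquad \theta_j = \mu + \tau \tilde{\theta}_j, \qquad y_{ij} \mid \mu, \tau, \tilde{\theta}_j \sim \mathcal{N}(\mu + \tau \tilde{\theta}_j, \sigma^2).

The centered form tends to mix well when the data are informative about each \theta_j, and the non-centered form in the opposite regime, which is one reason such reparametrizations are routinely exposed by software such as brms (Bürkner, 2017).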
“…Nevertheless, from the point of view of this article it is worth considering their impact on algorithmic performance. Theorem 6 of Zanella and Roberts (2020) derives the relaxation times of the vanilla Gibbs sampler for crossed effect models with Gaussian likelihood. Some constraints can make the vanilla Gibbs sampler scalable, whereas others cannot.…”
Section: Related Literature and Alternative Approaches (mentioning)
confidence: 99%
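
For reference, using the usual Markov-chain definition rather than the exact statement of the cited theorem: writing \rho for the L^2 rate of convergence of the Gibbs sampler, the relaxation time is

T_{\mathrm{rel}} = \frac{1}{1 - \rho},

so the sampler is scalable when \rho stays bounded away from 1 as the model grows, whereas \rho \to 1 means the number of sweeps required blows up.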
“…Both such extensions are non-trivial and would lead to a significant generalization of the result. In order to consider unbalanced designs, one needs to extend proof techniques based on multigrid decompositions (Zanella and Roberts, 2020;Papaspiliopoulos et al, 2020), which exploit the exact independence between appropriate reparametrizations of θ, to cases of weak dependence. In this direction, the proof strategy of Ghosh et al (2020) can be interesting, which exploits appropriate concentration inequalities to provide upper bounds on the convergence rate of coordinate-wise optimization for computing the MAP estimator for crossed effects with unbalanced levels.…”
Section: Random Graphs Weak Dependence and Complexitymentioning
confidence: 99%
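
For context, a minimal instance of the multigrid decomposition mentioned above, assuming a balanced one-way layout rather than the general crossed designs of the cited works: the group-level parameters are rewritten as a global average plus centred residuals,

\bar{\theta} = \frac{1}{K} \sum_{j=1}^{K} \theta_j, \qquad \delta_j = \theta_j - \bar{\theta}, \quad j = 1, \dots, K,

and in the balanced conditionally Gaussian case the Gibbs sampler acts on \bar{\theta} and (\delta_1, \dots, \delta_K) independently, so its overall convergence rate is the slower of the rates of these lower-dimensional chains. The weak-dependence extension discussed in the statement above concerns unbalanced designs, where this exact independence no longer holds.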