2019
DOI: 10.1080/01621459.2019.1585255
Informed Proposals for Local MCMC in Discrete Spaces

Abstract: There is a lack of methodological results to design efficient Markov chain Monte Carlo (MCMC) algorithms for statistical models with discrete-valued high-dimensional parameters. Motivated by this consideration, we propose a simple framework for the design of informed MCMC proposals (i.e. Metropolis-Hastings proposal distributions that appropriately incorporate local information about the target) which is naturally applicable to both discrete and continuous spaces. We explicitly characterize the class of optima…

Cited by 71 publications (117 citation statements)

References 42 publications
“…The choice of the number of variables to select would then be related to a cost-per-iteration versus mixing trade-off. See section 6.4 of Zanella () for a discussion of similar blockwise implementations. Also, computing p_i(x) exactly may be infeasible in some contexts, and thus it would be interesting to design a version of TGS where the terms p_i(x) are replaced by unbiased estimators while preserving the correct invariant distribution.…”
Section: Discussion
confidence: 99%
“…The resulting proposal distributions are called globally-balanced in Zanella (2020). To understand why, consider the situation where the size of K is small and it is feasible to switch from any model to any other one, and thus we can set N(k) = K for all k.…”
Section: Specification of the Model Proposal Distributions
confidence: 99%
“…We recommend setting h to the identity only when it is feasible to set N(k) = K for all k. Otherwise, it seems better to use locally-balanced proposal distributions; this is the case for instance in Zanella (2020) and our moderate-size (both in n and dimension) variable-selection example. Two choices of locally-balanced functions h are considered in Zanella (2020): h(x) = √x and h(x) = x/(1 + x).…”
Section: Specification of the Model Proposal Distributions
confidence: 99%
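To make the locally-balanced construction quoted above concrete, here is a minimal sketch of a nearest-neighbor informed MCMC sampler in Python. The toy target (a discretized Gaussian on a finite lattice) and all names below are illustrative assumptions, not drawn from the paper; only the structure — weighting neighbors by h(π(y)/π(x)) with a balancing function h, then applying a Metropolis-Hastings correction — follows the locally-balanced recipe discussed in the citation statements.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy target: a discretized Gaussian on {0, ..., K-1}
# (unnormalized; the normalizing constant cancels in the ratios below).
K = 50
pi = np.exp(-0.5 * ((np.arange(K) - 25) / 5.0) ** 2)

def h(t):
    # Balancing function: h(t) = sqrt(t) satisfies h(t) = t * h(1/t),
    # the locally-balanced condition; h(t) = t / (1 + t) also works.
    return np.sqrt(t)

def neighbors(x):
    # Nearest-neighbor moves on the lattice, clipped at the boundary.
    return [y for y in (x - 1, x + 1) if 0 <= y < K]

def proposal(x):
    # Informed proposal: weight each neighbor y by h(pi(y) / pi(x)).
    ys = neighbors(x)
    w = np.array([h(pi[y] / pi[x]) for y in ys])
    return ys, w / w.sum()

def step(x):
    ys, w = proposal(x)
    j = rng.choice(len(ys), p=w)
    y = ys[j]
    ys_back, w_back = proposal(y)
    # Metropolis-Hastings correction keeps pi exactly invariant.
    accept = min(1.0, (pi[y] * w_back[ys_back.index(x)]) / (pi[x] * w[j]))
    return y if rng.random() < accept else x

x, samples = 0, []
for _ in range(30000):
    x = step(x)
    samples.append(x)
```

With h the identity and the neighborhood taken to be the whole space, the same construction reduces to the globally-balanced case mentioned above: the proposal then samples models with probability proportional to π, which is only practical when the space is small enough to enumerate.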
“…This result is achieved by leveraging the efficiency of an approximate multi-object filtering method. The use of MCMC in discrete spaces is discussed more generally in Zanella (2019). MCMC has also been used in conjunction with, or as a replacement of, sequential Monte Carlo in the context of filtering for single-object systems (Gilks and Berzuini 2001) and multi-object systems (Khan et al. 2005; Septier et al. 2009; Carmi et al. 2012; Maroulas and Stinis 2012; Bao and Maroulas 2017); however, this type of approach is less directly related to the method proposed in this article.…”
Section: Introduction
confidence: 99%