Optimal acceptance rates for Metropolis algorithms: Moving beyond 0.234
2008 · DOI: 10.1016/j.spa.2007.12.005

Abstract: Recent optimal scaling theory has produced a condition for the asymptotically optimal acceptance rate of Metropolis algorithms to be the well-known 0.234 when applied to certain multidimensional target distributions. These d-dimensional target distributions are formed of independent components, each of which is scaled according to its own function of d. We show that when the condition is not met the limiting process of the algorithm is altered, yielding an asymptotically optimal acceptance rate which might drastically differ from the usual 0.234. […]
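To make the 0.234 heuristic concrete, here is a minimal sketch (not taken from the paper) of a random-walk Metropolis sampler on a d-dimensional standard normal target, using the usual proposal scale ℓ/√d; the function name and parameter choices are illustrative assumptions.

```python
import numpy as np

def rwm_acceptance_rate(d=50, ell=2.38, n_iter=20000, seed=0):
    """Random-walk Metropolis on a standard normal target in R^d.

    Proposal: x' = x + (ell / sqrt(d)) * N(0, I_d).  With ell near 2.38 the
    long-run acceptance rate should sit close to the classical 0.234.
    """
    rng = np.random.default_rng(seed)
    sigma = ell / np.sqrt(d)
    log_pi = lambda z: -0.5 * np.dot(z, z)      # log-density up to a constant
    x = rng.standard_normal(d)
    accepted = 0
    for _ in range(n_iter):
        prop = x + sigma * rng.standard_normal(d)
        if np.log(rng.uniform()) < log_pi(prop) - log_pi(x):
            x, accepted = prop, accepted + 1
    return accepted / n_iter

print(rwm_acceptance_rate())   # typically prints a value in the 0.2-0.3 range
```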

Cited by 82 publications (82 citation statements) · References 15 publications
“…In a related direction, Bédard (2006, 2007, 2008; see also Bédard and Rosenthal, 2008) considered the case where the target distribution π has independent coordinates with vastly different scalings (i.e., different powers of d as d → ∞). She proved that if each individual component is dominated by the sum of all components, then the optimal acceptance rate of 0.234 still holds.…”
Section: Inhomogeneous Target Distributions
Citation type: mentioning
confidence: 99%
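As a rough illustration of the kind of inhomogeneous target discussed above (one coordinate scaled by a different power of d than the rest), the sketch below runs random-walk Metropolis over a grid of proposal scales and reports the acceptance rate together with the mean squared jump distance of the small-scale coordinate. The function name, scale choices, and efficiency proxy are assumptions for illustration, not the construction used in the cited work.

```python
import numpy as np

def rwm_scan(scales, sigma, n_iter=20000, seed=1):
    """RWM on N(0, diag(scales^2)): returns (acceptance rate,
    mean squared jump distance of the first coordinate)."""
    rng = np.random.default_rng(seed)
    d = scales.size
    log_pi = lambda z: -0.5 * np.sum((z / scales) ** 2)
    x = scales * rng.standard_normal(d)
    acc, sq_jump = 0, 0.0
    for _ in range(n_iter):
        prop = x + sigma * rng.standard_normal(d)
        if np.log(rng.uniform()) < log_pi(prop) - log_pi(x):
            sq_jump += (prop[0] - x[0]) ** 2
            x = prop
            acc += 1
    return acc / n_iter, sq_jump / n_iter

d = 50
scales = np.ones(d)
scales[0] = d ** -1.0     # one coordinate on a much finer scale than the rest
for sigma in (0.05, 0.1, 0.2, 0.3, 0.5):
    rate, msjd = rwm_scan(scales, sigma)
    print(f"sigma={sigma:4.2f}  acceptance={rate:.3f}  MSJD(x1)={msjd:.2e}")
```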
“…Thus the simple rule of thumb of tuning σ_d such that one in four proposed moves are accepted holds quite generally. In [4] and [17], examples where the aoar (asymptotically optimal acceptance rate) is strictly less than 0.234 are given. These correspond to different orders of magnitude being appropriate for the scaling of the proposed moves in different components.…”
Citation type: mentioning
confidence: 99%
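In practice, the "accept about one in four" rule of thumb quoted above is often implemented with a simple diminishing-adaptation update of the proposal scale. The sketch below is one standard recipe of that kind (a Robbins-Monro update of log σ); the function name, step-size exponent, and use of 0.234 as the target are illustrative choices, not something prescribed by the cited works.

```python
import numpy as np

def adapt_rwm_scale(log_pi, x0, target=0.234, n_iter=50000, seed=2):
    """Tune the RWM proposal scale so the acceptance rate approaches `target`,
    using a diminishing Robbins-Monro update of log(sigma)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    d = x.size
    log_sigma = 0.0
    for t in range(1, n_iter + 1):
        sigma = np.exp(log_sigma)
        prop = x + sigma * rng.standard_normal(d)
        log_alpha = min(0.0, log_pi(prop) - log_pi(x))
        if np.log(rng.uniform()) < log_alpha:
            x = prop
        # Raise sigma when accepting too often, lower it when accepting too rarely.
        log_sigma += t ** -0.6 * (np.exp(log_alpha) - target)
    return np.exp(log_sigma)

log_pi = lambda z: -0.5 * np.dot(z, z)          # standard normal target
print(adapt_rwm_scale(log_pi, np.zeros(20)))    # close to 2.38 / sqrt(20) ~ 0.53
```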
“…A number of authors have attempted to weaken and generalize the original strong assumptions; see e.g., Bédard (2007, 2008), Bédard and Rosenthal (2008), Beskos et al. (2009), and Sherlock and Roberts (2009). Corresponding results have been developed for Langevin MCMC algorithms (Roberts and Rosenthal, 1998), and for simulated tempering algorithms (Atchadé et al., 2011; Roberts and Rosenthal, 2013).…”
Section: Optimal Scaling
Citation type: mentioning
confidence: 99%
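For the Langevin case mentioned in that quote, the analogous optimal-scaling results point to a higher acceptance rate, around 0.574 rather than 0.234. Below is a minimal Metropolis-adjusted Langevin (MALA) sketch on a standard normal target; the step-size constant ℓ ≈ 1.65, the d^(-1/3) scaling, and the function name are standard illustrative choices assumed here, not details taken from the cited works.

```python
import numpy as np

def mala_acceptance_rate(d=50, ell=1.65, n_iter=20000, seed=3):
    """MALA on a standard normal target in R^d.

    Proposal: x' = x + (h/2) * grad log pi(x) + sqrt(h) * N(0, I_d),
    with step size h = ell^2 / d^(1/3).
    """
    rng = np.random.default_rng(seed)
    h = ell ** 2 / d ** (1.0 / 3.0)
    log_pi = lambda z: -0.5 * np.dot(z, z)
    grad = lambda z: -z
    def log_q(to, frm):                      # proposal log-density q(to | frm)
        mean = frm + 0.5 * h * grad(frm)
        return -np.dot(to - mean, to - mean) / (2.0 * h)
    x = rng.standard_normal(d)
    acc = 0
    for _ in range(n_iter):
        prop = x + 0.5 * h * grad(x) + np.sqrt(h) * rng.standard_normal(d)
        log_alpha = (log_pi(prop) + log_q(x, prop)) - (log_pi(x) + log_q(prop, x))
        if np.log(rng.uniform()) < log_alpha:
            x = prop
            acc += 1
    return acc / n_iter

print(mala_acceptance_rate())   # typically in the neighbourhood of 0.5-0.6
```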