2008
DOI: 10.1007/s11222-008-9070-2

Adaptive independence samplers

Abstract: Markov chain Monte Carlo (MCMC) is an important computational technique for generating samples from non-standard probability distributions. A major challenge in the design of practical MCMC samplers is to achieve efficient convergence and mixing properties. One way to accelerate convergence and mixing is to adapt the proposal distribution in light of previously sampled points, thus increasing the probability of acceptance. In this paper, we propose two new adaptive MCMC algorithms based on the Independent Metr…
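Purely as an illustration of the idea sketched in the abstract (and not the two algorithms proposed in the paper, which use cross-entropy-fitted mixture proposals), the following is a minimal sketch of an independence Metropolis-Hastings sampler whose Gaussian proposal is periodically refit to the chain's history by moment matching. All function and parameter names are invented for the example, and the target is a toy stand-in for a real posterior:

```python
# Minimal sketch of an adaptive independence Metropolis-Hastings sampler.
# The proposal ignores the current state, and its mean/covariance are
# refit to the accumulated samples every `adapt_every` iterations.
import numpy as np

def log_target(x):
    # Toy target: a standard 2-D Gaussian (stand-in for a real posterior).
    return -0.5 * np.dot(x, x)

def adaptive_independence_sampler(n_iter=5000, adapt_every=500, dim=2, seed=0):
    rng = np.random.default_rng(seed)
    mu, cov = np.zeros(dim), np.eye(dim)       # initial proposal parameters
    x = rng.multivariate_normal(mu, cov)       # current state
    samples = [x]
    for i in range(1, n_iter):
        y = rng.multivariate_normal(mu, cov)   # independence proposal

        def log_q(z):
            # Log proposal density up to a constant; the (2*pi)^{-d/2}
            # term cancels in the ratio because cov is shared by x and y.
            d = z - mu
            return -0.5 * d @ np.linalg.solve(cov, d) \
                   - 0.5 * np.linalg.slogdet(cov)[1]

        # Independence M-H ratio: pi(y) q(x) / (pi(x) q(y)), in logs.
        log_alpha = (log_target(y) - log_target(x)) + (log_q(x) - log_q(y))
        if np.log(rng.uniform()) < log_alpha:
            x = y                               # accept
        samples.append(x)
        if i % adapt_every == 0:                # adapt proposal to history
            hist = np.asarray(samples)
            mu = hist.mean(axis=0)
            cov = np.cov(hist.T) + 1e-6 * np.eye(dim)  # regularize
    return np.asarray(samples)

if __name__ == "__main__":
    draws = adaptive_independence_sampler()
    print("posterior mean estimate:", draws[len(draws) // 2:].mean(axis=0))
```

Note that on-the-fly adaptation of this kind breaks the Markov property, so in practice the adaptation must diminish or stop after a burn-in phase to guarantee ergodicity (see, e.g., Andrieu and Thoms (2008)).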

Cited by 27 publications (22 citation statements)
References 32 publications

“…We adopt the so-called cross-entropy adaptive independence sampler introduced in Keith, Kroese, and Sofronov (2008). Specifically, the proposal density is chosen such that the Kullback-Leibler divergence, or the cross-entropy (CE) distance, between the proposal density and the target (the posterior density) is minimal, where the CE distance between the densities g_1 and g_2 is defined as:…”
Section: Collapsed Sampling With the Cross-entropy Methods (citation type: mentioning; confidence: 99%)
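The snippet cuts off before the formula. For reference, the standard cross-entropy (Kullback-Leibler) distance between densities g_1 and g_2, presumably the definition the quoted passage goes on to give, is:

```latex
\mathcal{D}(g_1, g_2)
  = \mathbb{E}_{g_1}\!\left[\ln \frac{g_1(\mathbf{X})}{g_2(\mathbf{X})}\right]
  = \int g_1(\mathbf{x}) \,\ln \frac{g_1(\mathbf{x})}{g_2(\mathbf{x})}\, d\mathbf{x}.
```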
“…Other mixtures from the exponential family can also be considered, such as a discrete/continuous mixture of Student's t-distributions (see Andrieu and Thoms (2008) for details). Andrieu and Moulines (2006) suggest fitting these parameters using a maximum likelihood approach or, equivalently, by maximizing the cross-entropy between the proposal distribution and the target; this algorithm shares some similarities with the so-called adaptive independence sampler developed in Keith et al (2008). In this framework, the parameters are fitted using a sequential version of the EM algorithm (Cappe and Moulines, 2009) (several improvements on this basic scheme are presented in Andrieu and Thoms (2008)).…”
[Figure 1.5 of the citing work: adaptive fit of a mixture of three Gaussian distributions with arbitrary means and covariances using the maximum likelihood approach developed in Andrieu and Moulines (2006).]
Section: Internal Adaptive MCMC (citation type: mentioning; confidence: 99%)
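As a rough illustration of the mixture-refit step this snippet describes, one can fit a Gaussian-mixture independence proposal to the chain's history by batch EM maximum likelihood. This is a stand-in for the sequential EM of Cappe and Moulines (2009), not a reproduction of it; the sketch assumes scikit-learn is available, and the function name is illustrative:

```python
# Sketch: refit a Gaussian-mixture independence proposal to accumulated
# MCMC samples via (batch) EM maximum likelihood.
import numpy as np
from sklearn.mixture import GaussianMixture

def refit_mixture_proposal(history, n_components=3, seed=0):
    """Refit the proposal to past samples; returns a fitted GaussianMixture."""
    gm = GaussianMixture(n_components=n_components,
                         covariance_type="full", random_state=seed)
    return gm.fit(np.asarray(history))

# Inside an independence M-H loop one would then draw y, _ = gm.sample(1)
# and evaluate log q(z) = gm.score_samples(z.reshape(1, -1))[0]
# when forming the acceptance ratio.
```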
“…Keith et al (2008) developed adaptive independence samplers by minimizing the Kullback-Leibler (KL) divergence in order to provide the best candidate density, which consists of a mixture of Gaussian densities. The minimization of the KL-divergence is done by applying the EM algorithm of Dempster et al (1977) and the number of mixture components is selected through information criteria like AIC (Akaike (1974)), BIC (Schwarz (1978)) or DIC (Gelman et al (2003)).…”
Section: Introduction (citation type: mentioning; confidence: 99%)
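The component-selection step mentioned in this snippet can be sketched with information criteria. A simple version, assuming scikit-learn and an illustrative function name, fits mixtures of increasing order by EM and keeps the one minimizing BIC (AIC is analogous; DIC, as in Gelman et al (2003), would additionally require posterior draws):

```python
# Sketch: choose the number of mixture components for the candidate
# density by minimizing an information criterion (BIC here).
import numpy as np
from sklearn.mixture import GaussianMixture

def select_mixture_order(samples, max_components=5, seed=0):
    samples = np.asarray(samples)
    best_gm, best_bic = None, np.inf
    for k in range(1, max_components + 1):
        gm = GaussianMixture(n_components=k, random_state=seed).fit(samples)
        bic = gm.bic(samples)          # lower BIC is better
        if bic < best_bic:
            best_gm, best_bic = gm, bic
    return best_gm                     # fitted mixture with the lowest BIC
```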