2017
DOI: 10.3150/14-bej680

Nonasymptotic analysis of adaptive and annealed Feynman–Kac particle models

Abstract: Sequential and quantum Monte Carlo methods, as well as genetic-type search algorithms, can be interpreted as mean field and interacting particle approximations of Feynman–Kac models in distribution spaces. The performance of these population Monte Carlo algorithms is strongly related to the stability properties of nonlinear Feynman–Kac semigroups. In this paper, we analyze these models in terms of Dobrushin ergodic coefficients of the reference Markov transitions and the oscillations of the potential function…
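To fix ideas, the selection/mutation recursion underlying such interacting particle approximations can be sketched as follows. This is a minimal illustration, not the adaptive or annealed algorithms analysed in the paper; the Gaussian potential and random-walk kernel at the end are hypothetical placeholders.

```python
import numpy as np

def feynman_kac_particle_step(particles, potential, markov_kernel, rng):
    """One selection/mutation step of a mean-field particle approximation.

    particles     : array of shape (N, d), the current particle cloud
    potential     : function x -> nonnegative weight G_n(x)
    markov_kernel : function (x, rng) -> one draw from the Markov transition M_n(x, .)
    """
    weights = np.array([potential(x) for x in particles])
    weights = weights / weights.sum()                                  # normalised G_n-weights
    idx = rng.choice(len(particles), size=len(particles), p=weights)   # selection (multinomial resampling)
    return np.array([markov_kernel(particles[i], rng) for i in idx])   # mutation

# Illustrative placeholder model: Gaussian potential, random-walk mutation.
rng = np.random.default_rng(0)
cloud = rng.normal(size=(1000, 1))
G = lambda x: float(np.exp(-0.5 * x @ x))
M = lambda x, rng: x + 0.5 * rng.normal(size=x.shape)
for n in range(10):
    cloud = feynman_kac_particle_step(cloud, G, M, rng)
```

At each step the weighted particle cloud approximates the updated Feynman–Kac distribution, and the resampling step keeps the particles interacting rather than evolving independently.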

Cited by 13 publications (10 citation statements); references 36 publications.
Citation statements, ordered by relevance:
“…, $\lceil N\alpha\rceil\}$ are i.i.d. random vectors (see equation (5.1), page 22), and that with similar arguments they are independent of the subsample strictly below $L^N_q$, it is clear that $(Z^N_j)_{0\le j\le (p+1)(\lceil N\alpha\rceil+1)-1}$ is a triangular array of martingale increments adapted to the filtration $\mathcal{J}$. It is then straightforward to check that…”
Section: Proof of Theorem 3.2 (mentioning, confidence: 83%)
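For context, the property the excerpt appeals to is the standard defining one for a triangular array of martingale increments; written in the excerpt's notation (the index convention $\mathcal{J}^N_{j}$ for the filtration is our assumption), it reads:

```latex
% Defining property of a triangular array of martingale increments:
% zero conditional mean given the past of the filtration, for every N.
\[
  \mathbb{E}\bigl[\, Z^N_j \;\big|\; \mathcal{J}^N_{j-1} \bigr] = 0,
  \qquad 0 \le j \le (p+1)\bigl(\lceil N\alpha\rceil + 1\bigr) - 1 .
\]
```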
“…As a consequence, even if the global goal here is roughly the same as in [16,3], the techniques developed for establishing our convergence results are quite different. Note also that in the context of adaptive tempering (a context considered in [3]), Giraud and Del Moral give non-asymptotic bounds on the error in [22].…”
Section: Introduction (mentioning, confidence: 99%)
“…To construct the particle approximation $\eta^N_n$, the practical SMC algorithm exploits summary statistics $\xi_n : E_{n-1} \to \mathbb{R}^d$ by reweighting and propagating the particle approximation $\eta^N_{n-1}$ through the potential $G_{n,\eta^N_{n-1}(\xi_n)}$ and the Markov kernel $M_{n,\eta^N_{n-1}(\xi_n)}$. This is a substitute for the perfect algorithm (as also used by [16], and which cannot be implemented), which employs the Markov kernel $M_{n,\eta_{n-1}(\xi_n)}$ and weight function $G_{n,\eta_{n-1}(\xi_n)}$. We prove a WLLN and a CLT for both the approximation of the probability distribution $\eta_n$ and its normalising constant.…”
Section: Results and Structure (mentioning, confidence: 99%)
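The adaptive mechanism described in this excerpt can be sketched as follows. This is a simplified illustration in which $\eta^N_{n-1}(\xi_n)$ is taken to be the particle average of the summary statistic, and the factory functions make_potential and make_kernel are hypothetical names standing in for the parametrised families $G_{n,\cdot}$ and $M_{n,\cdot}$.

```python
import numpy as np

def adaptive_smc_step(particles, xi, make_potential, make_kernel, rng):
    """One step of the 'practical' algorithm: the summary statistic is estimated
    from the current particle cloud and then parametrises both the potential
    and the Markov kernel used for reweighting and propagation."""
    stat = np.mean([xi(x) for x in particles], axis=0)    # particle estimate of eta_{n-1}(xi_n)
    G = make_potential(stat)                              # stands in for G_{n, eta^N_{n-1}(xi_n)}
    M = make_kernel(stat)                                 # stands in for M_{n, eta^N_{n-1}(xi_n)}
    w = np.array([G(x) for x in particles])
    w = w / w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)   # reweight and resample
    return np.array([M(particles[i], rng) for i in idx])         # propagate
```

The "perfect" algorithm referred to in the excerpt would replace the particle estimate stat with the exact value $\eta_{n-1}(\xi_n)$, which is not available in practice.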
“…Some preliminary results can be found, under exceptionally strong conditions, in [8,17]. Proof sketches are given in [12], with some more realistic but limited analysis in [16]. We are not aware of any other asymptotic analysis of this particular class of algorithms in the literature.…”
Section: Introduction (mentioning, confidence: 99%)
“…In Algorithm 2 we use m iterations of (20) with $Q_{n,l}$ specified as above. The corresponding m-iterate of the MCMC transition kernel is denoted $K^m_{n,l}$ and is presented in Algorithm 3 in an algorithmic form.…”
Section: Adding Particle Diversity with MCMC Kernels (mentioning, confidence: 99%)
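As an illustration of the m-iterate idea, here is a generic sketch in which a random-walk Metropolis kernel is applied m times to every particle. The kernel $K^m_{n,l}$ and proposal $Q_{n,l}$ of the excerpt (equation (20) and Algorithms 2–3 of the citing paper) are not reproduced here, so the proposal and target below are placeholder choices.

```python
import numpy as np

def iterate_mcmc_kernel(particles, log_target, m, step, rng):
    """Apply m iterations of a random-walk Metropolis kernel to every particle.

    Iterating a kernel that leaves the current target invariant adds diversity
    to a resampled particle cloud without changing the targeted distribution.
    """
    out = particles.copy()
    for _ in range(m):
        proposal = out + step * rng.normal(size=out.shape)            # random-walk proposal
        log_acc = np.array([log_target(p) - log_target(x)
                            for p, x in zip(proposal, out)])
        accept = np.log(rng.uniform(size=len(out))) < log_acc         # Metropolis accept/reject
        out[accept] = proposal[accept]
    return out
```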