2016
DOI: 10.3150/14-bej666
On the role of interaction in sequential Monte Carlo algorithms

Abstract: We introduce a general form of sequential Monte Carlo algorithm defined in terms of a parameterized resampling mechanism. We find that a suitably generalized notion of the Effective Sample Size (ESS), widely used to monitor algorithm degeneracy, appears naturally in a study of its convergence properties. We are then able to phrase sufficient conditions for time-uniform convergence in terms of algorithmic control of the ESS, in turn achievable by adaptively modulating the interaction between particles. This lea…
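As a concrete reference point, the standard (non-generalized) ESS monitored in SMC is computed from normalized particle weights; a minimal Python sketch of that textbook formula, not the paper's generalized ESS (function name is illustrative):

```python
import numpy as np

def effective_sample_size(weights):
    """Standard ESS: 1 / sum of squared normalized weights.

    Ranges from 1 (total degeneracy, one particle holds all
    weight) to N (perfectly uniform weights).
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()          # normalize to sum to 1
    return 1.0 / np.sum(w ** 2)

# Uniform weights over N = 4 particles give ESS = 4;
# a single dominant weight drives ESS toward 1.
print(effective_sample_size([0.25, 0.25, 0.25, 0.25]))  # → 4.0
```

Adaptive resampling schemes typically trigger a resampling step whenever this quantity falls below a chosen threshold such as N/2.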

Cited by 46 publications (99 citation statements)
References 24 publications
“…This is the price to pay for sparse communication. Lastly, we observe that running parallel PFs without communication is not stable, as noted in [2]. If the nodes do not communicate with each other, eventually one node dominates the others by holding all the weights, and the ESS decreases below N/M.…”
Section: Effective Sample Size
confidence: 85%
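The weight degeneracy this statement describes is easy to reproduce: repeatedly reweighting particles without ever resampling collapses the ESS far below the particle count. A hedged sketch with synthetic random log-weights (illustrative only, not the cited experiment):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100                          # number of particles
logw = np.zeros(N)               # log-weights, initially uniform
for _ in range(50):              # 50 reweighting steps, no resampling
    logw += rng.normal(size=N)   # synthetic log-incremental weights

# Normalize stably via the log-sum-exp shift.
w = np.exp(logw - logw.max())
w /= w.sum()
ess = 1.0 / np.sum(w ** 2)       # collapses to a small fraction of N
```

After 50 steps the log-weights have standard deviation about 7, so a handful of particles dominate and the ESS is a tiny fraction of N; resampling (i.e., interaction) is what resets this collapse.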
“…However, in the resampling stage all the particles must interact with each other, which causes communication overhead. Although resampling is necessary for the stability of PFs [2], it is evidently the bottleneck in distributed implementations of particle filters. Therefore, the main consideration in designing distributed resampling algorithms is to minimize communication between the nodes; otherwise, the communication overhead prevents PFs from speeding up.…”
Section: Introduction
confidence: 99%
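The all-to-all interaction this statement refers to can be seen in textbook multinomial resampling, where every particle's weight influences every draw; a minimal sketch (illustrative, not the distributed scheme the citing work develops):

```python
import numpy as np

def multinomial_resample(particles, weights, rng=None):
    """Draw len(particles) indices i.i.d. in proportion to the weights.

    Every draw depends on the full normalized weight vector, which is
    why a naive distributed implementation requires global communication.
    """
    rng = np.random.default_rng() if rng is None else rng
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                      # normalize over ALL particles
    idx = rng.choice(len(w), size=len(w), p=w)
    return np.asarray(particles)[idx]
```

Usage: with weights concentrated on one particle, every resampled particle is a copy of it, e.g. `multinomial_resample([10, 20, 30], [0.0, 1.0, 0.0])` returns three copies of `20`.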
“…In this section, we overview relevant aspects of the general methodology proposed in the cited work. An HMM with measurable state space (X, 𝒳) and observation space (Y, 𝒴) is a process {(X_n, Y_n); n ≥ 0} where {X_n; n ≥ 0} is a Markov chain on X, the observations {Y_n; n ≥ 0}, valued in Y, are conditionally independent given {X_n; n ≥ 0}, and the conditional distribution of each Y_n depends on {X_n; n ≥ 0} only through X_n.…”
Section: αSMC
confidence: 99%
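The HMM structure described in this statement can be made concrete with a simple linear-Gaussian instance; the model coefficients below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def simulate_hmm(n_steps, rng=None):
    """Simulate a linear-Gaussian HMM.

    Hidden chain:  X_n = 0.9 * X_{n-1} + V_n,  V_n ~ N(0, 1)
    Observations:  Y_n = X_n + W_n,            W_n ~ N(0, 1)

    Each Y_n depends on the hidden path only through X_n, and the
    Y_n are conditionally independent given the hidden chain.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.zeros(n_steps)
    y = np.zeros(n_steps)
    x[0] = rng.normal()
    y[0] = x[0] + rng.normal()
    for n in range(1, n_steps):
        x[n] = 0.9 * x[n - 1] + rng.normal()
        y[n] = x[n] + rng.normal()
    return x, y
```

Filtering for such a model means approximating the conditional law of X_n given Y_0, …, Y_n, which is exactly the task the SMC algorithms above address.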
“…This idea has been indirectly and implicitly employed in different Monte Carlo schemes: parallel particle filters [9], [10], particle island and related methods [11]–[13], tracking and model selection algorithms [14], and nested sequential Monte Carlo schemes [15], [16] are some examples.…”
Section: Introduction
confidence: 99%