2015
DOI: 10.1214/14-aap1061
Can local particle filters beat the curse of dimensionality?

Abstract: The discovery of particle filtering methods has enabled the use of nonlinear filtering in a wide array of applications. Unfortunately, the approximation error of particle filters typically grows exponentially in the dimension of the underlying model. This phenomenon has rendered particle filters of limited use in complex data assimilation problems. In this paper, we argue that it is often possible, at least in principle, to develop local particle filtering algorithms whose approximation error is dimension-free…

Cited by 182 publications (269 citation statements) · References 20 publications · Citing publications: 2016–2023
“…Standard sequential Monte Carlo (SMC) methods fail to track high dimensional systems due to exponentially degenerate importance weights. However, while there have only been some suggestions for a solution to this problem in SMCs, such as in [18], one can alter the above localization scheme in the ETPF to solve this problem swiftly [5]. It is also needed to reduce the computational cost of likelihood evaluations when the dimension of the state space is greater than the sample size…”
Section: The Multilevel Monte Carlo Method
Citation type: mentioning
confidence: 99%
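The weight degeneracy described in the quote above can be seen in a few lines of code. The sketch below is illustrative only and not taken from any of the cited papers: it assumes a standard-normal prior, an independent Gaussian likelihood per coordinate, and a hypothetical `effective_sample_size` helper. Because the log-weights accumulate one term per coordinate, the effective sample size collapses as the dimension grows while the particle count stays fixed.

```python
# Minimal sketch (assumptions above) of importance-weight degeneracy in high dimension.
import numpy as np

rng = np.random.default_rng(0)

def effective_sample_size(log_w):
    """ESS = 1 / sum(normalized_weights^2), computed stably in log space."""
    log_w = log_w - log_w.max()
    w = np.exp(log_w)
    w /= w.sum()
    return 1.0 / np.sum(w ** 2)

n_particles = 1000
for dim in (1, 5, 20, 100):
    x = rng.standard_normal((n_particles, dim))   # particles from the prior
    y = rng.standard_normal(dim)                  # a synthetic observation
    # log N(y | x, I): one quadratic term per coordinate, so the spread of the
    # log-weights grows linearly with the dimension.
    log_w = -0.5 * np.sum((y - x) ** 2, axis=1)
    print(f"dim={dim:4d}  ESS={effective_sample_size(log_w):8.1f} / {n_particles}")
```

With 1000 particles the ESS is close to the particle count in one dimension but drops to a handful of particles by dimension 100, which is the exponential degeneracy the quote refers to.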
“…In the following examples, the proposed sequential Langevin and Hamiltonian based MCMC algorithms will be compared to three different variants of SMC-based algorithms: standard SIR algorithm, the block SIR [6] (with a block size of 4) and a Resample-Move algorithm, denoted by SIR-RMK, for which K MCMC moves with the mHMC kernel described in Section IV-B is applied on each particle (for x n -i.e. L = 1) after the resampling stage.…”
Section: Numerical Simulations: Large Spatial Sensor Network
Citation type: mentioning
confidence: 99%
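The Resample-Move scheme referenced in the quote, resampling followed by MCMC moves applied to each particle, can be sketched briefly. The sketch is a simplification and not the quoted paper's algorithm: it uses a plain random-walk Metropolis kernel in place of the mHMC kernel, and `resample_move_step`, `log_prior`, and `log_lik` are hypothetical names standing in for the actual model.

```python
# Minimal Resample-Move sketch (simplified; see assumptions above).
import numpy as np

rng = np.random.default_rng(1)

def resample_move_step(particles, log_prior, log_lik, n_moves=2, step=0.2):
    """One update for particles drawn from the prior: weight, resample, then move."""
    n, d = particles.shape
    # Importance weights against the likelihood of the current observation.
    log_w = np.array([log_lik(p) for p in particles])
    log_w -= log_w.max()
    w = np.exp(log_w)
    w /= w.sum()
    particles = particles[rng.choice(n, size=n, p=w)].copy()   # resample
    # Move: Metropolis steps that leave the posterior invariant, restoring
    # diversity among the duplicated particles.
    log_post = lambda x: log_prior(x) + log_lik(x)
    for _ in range(n_moves):
        prop = particles + step * rng.standard_normal((n, d))
        log_a = (np.array([log_post(q) for q in prop])
                 - np.array([log_post(p) for p in particles]))
        accept = np.log(rng.random(n)) < log_a
        particles[accept] = prop[accept]
    return particles

# Toy usage: standard-normal prior, Gaussian likelihood around an observation y.
d, n = 4, 500
y = np.ones(d)
log_prior = lambda x: -0.5 * np.sum(x ** 2)
log_lik = lambda x: -0.5 * np.sum((y - x) ** 2)
analysis = resample_move_step(rng.standard_normal((n, d)), log_prior, log_lik)
```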
“…Secondly, we can cite the Block Sequential Importance Resampling (Block SIR) approach in which the underlying idea is to partition the state space into separate subspaces of small dimensions and run one SMC algorithm on each subspace [6], [9]- [11]. However, this strategy introduces in the final estimates a difficult to quantify bias which depends on the position along the split state vector elements.…”
Citation type: mentioning
confidence: 99%
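The block-SIR idea in the quote, partitioning the state into low-dimensional subspaces and running the importance-resampling step separately on each, is essentially the localization analyzed in the paper above. Below is a minimal, hypothetical sketch of one such step; the Gaussian observation model, `block_sir_step`, and the block layout are illustrative assumptions, not the exact algorithm of [6] or [9]-[11]. Resampling each block independently is what keeps the weights non-degenerate, and it is also the source of the boundary bias the quote mentions.

```python
# Minimal block-SIR sketch (illustrative assumptions above).
import numpy as np

rng = np.random.default_rng(2)

def block_sir_step(particles, y, blocks, obs_var=1.0):
    """One block-SIR assimilation step.

    particles: (N, d) forecast ensemble; y: (d,) observation, one value per
    coordinate; blocks: list of index arrays partitioning range(d)."""
    n = particles.shape[0]
    updated = particles.copy()
    for idx in blocks:
        # Weights use only this block's coordinates, so they stay low-dimensional
        # and non-degenerate even when the full state dimension d is large.
        log_w = -0.5 * np.sum((y[idx] - particles[:, idx]) ** 2, axis=1) / obs_var
        log_w -= log_w.max()
        w = np.exp(log_w)
        w /= w.sum()
        ancestors = rng.choice(n, size=n, p=w)     # resample this block only
        updated[:, idx] = particles[ancestors][:, idx]
    return updated

# Toy usage: a 12-dimensional state split into blocks of size 4.
d, n = 12, 500
blocks = [np.arange(i, i + 4) for i in range(0, d, 4)]
particles = rng.standard_normal((n, d))   # forecast ensemble
y = rng.standard_normal(d)                # synthetic observation
analysis = block_sir_step(particles, y, blocks)
```

Because each block draws its own ancestors, coordinates on opposite sides of a block boundary can end up copied from different particles, which is the difficult-to-quantify, position-dependent bias described in the quote.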
“…These methods are a good-choice from a theoretical point of view. Unfortunately, in practice, PFs do suffer from a degeneracy problem [33] (Section 1.4) and even more, many challenges have to be overcome before they can be considered under operational DA scenarios [34,35]. For these reasons, PFs are not considered any further in this paper.…”
Section: Gaussian Mixture Models Based Filters
Citation type: mentioning
confidence: 99%