2019
DOI: 10.1137/17m1125340

How to Avoid the Curse of Dimensionality: Scalability of Particle Filters with and without Importance Weights

Abstract: Particle filters are a popular and flexible class of numerical algorithms to solve a large class of nonlinear filtering problems. However, standard particle filters with importance weights have been shown to require a sample size that increases exponentially with the dimension D of the state space in order to achieve a certain performance, which precludes their use in very high-dimensional filtering problems. Here, we focus on the dynamic aspect of this curse of dimensionality (COD) in continuous-time filtering […]
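The weight collapse the abstract alludes to is easy to reproduce numerically. The sketch below is purely illustrative and not taken from the paper (the linear-Gaussian model, particle count, and function names are assumptions): it performs a single importance-weighting step of a bootstrap particle filter and prints the effective sample size (ESS) as the state dimension D grows, showing how quickly the weights degenerate.

```python
import numpy as np

def ess(weights):
    """Effective sample size 1 / sum(w_i^2) for normalized weights."""
    return 1.0 / np.sum(weights ** 2)

def one_weighting_step(D, N=1000, obs_noise=1.0, seed=0):
    """One importance-weighting step of a bootstrap particle filter.

    Assumed toy model: particles drawn from a N(0, I_D) prior, observation
    y = x_true + N(0, obs_noise^2 I_D) noise. Returns the ESS of the weights.
    """
    rng = np.random.default_rng(seed)
    x_true = rng.standard_normal(D)
    y = x_true + obs_noise * rng.standard_normal(D)
    particles = rng.standard_normal((N, D))          # samples from the prior
    # Gaussian log-likelihood of y given each particle (product over D coordinates)
    log_w = -0.5 * np.sum((y - particles) ** 2, axis=1) / obs_noise ** 2
    log_w -= log_w.max()                             # stabilize before exponentiating
    w = np.exp(log_w)
    w /= w.sum()
    return ess(w)

if __name__ == "__main__":
    for D in (1, 2, 5, 10, 20, 50):
        print(f"D = {D:3d}  ESS ~ {one_weighting_step(D):8.1f}  (out of 1000 particles)")
```

In this toy setting the ESS drops from nearly the full particle count at D = 1 to a handful of effective particles by D = 50, which is the static face of the curse of dimensionality the paper analyzes.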

Cited by 42 publications (43 citation statements) | References 27 publications (59 reference statements)

Citation statements (ordered by relevance):
“…This makes it possible to leverage existing and future approaches to gain estimation in the FPF. As an unweighted filter, the ppFPF is expected to scale to high-dimensional problems [3].…”
Section: Discussion (citation type: mentioning; confidence: 99%)
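For readers unfamiliar with the feedback particle filter (FPF) mentioned in the excerpt above, the following is a minimal sketch of one Euler step of a diffusion-observation FPF under the common constant-gain approximation. It is a generic textbook-style illustration, not the ppFPF or the gain-estimation method of the citing work; the unit process noise, scalar observation, and function names are assumptions.

```python
import numpy as np

def fpf_constant_gain_step(particles, dy, h, f, dt, rng):
    """One Euler step of a feedback particle filter with the constant-gain
    approximation (an unweighted, resampling-free update).

    particles: (N, D) state ensemble; dy: scalar observation increment over dt;
    h: observation function R^D -> R; f: drift function R^D -> R^D.
    Assumes unit process-noise and unit observation-noise intensities.
    """
    N, D = particles.shape
    hx = np.array([h(x) for x in particles])              # h evaluated at each particle
    h_bar = hx.mean()
    x_bar = particles.mean(axis=0)
    # Constant-gain approximation: empirical cross-covariance between state and h(X)
    K = ((particles - x_bar) * (hx - h_bar)[:, None]).sum(axis=0) / N   # shape (D,)
    drift = np.array([f(x) for x in particles])
    noise = np.sqrt(dt) * rng.standard_normal((N, D))      # process noise increment
    innovation = dy - 0.5 * (hx + h_bar) * dt               # per-particle innovation
    return particles + drift * dt + noise + innovation[:, None] * K[None, :]
```

Because every particle is corrected through the gain term rather than reweighted, no importance weights appear, which is the property the excerpt appeals to when expecting scalability to high dimensions.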
“…by resampling the particles from the weight distribution and resetting the weights to 1/M. The time scale on which this weight degeneracy happens depends on the number of observable dimensions; in other words, it is accelerated as the dimensionality of the system is increased (Surace et al., 2019b). This is a form of the so-called 'curse of dimensionality', a common nuisance in weighted particle filters.…”
Section: Particle Filtering in Continuous Time (citation type: mentioning; confidence: 99%)
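The resampling step described in this excerpt (redraw particles in proportion to their weights, then reset all weights to 1/M) can be sketched as a generic multinomial resampler; the array shapes and names below are assumptions, not code from the cited works.

```python
import numpy as np

def multinomial_resample(particles, weights, rng=None):
    """Resample particles from the weight distribution and reset weights to 1/M.

    particles: array of shape (M, D); weights: normalized, shape (M,).
    Returns (resampled_particles, uniform_weights).
    """
    rng = np.random.default_rng() if rng is None else rng
    M = len(weights)
    idx = rng.choice(M, size=M, p=weights)        # draw indices in proportion to weights
    return particles[idx], np.full(M, 1.0 / M)    # weights reset to 1/M
```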
“…Unweighted particle filters therefore hold the promise of avoiding the curse of dimensionality (see Surace et al., 2019b).…”
Section: Particle Filtering in Continuous Time (citation type: mentioning; confidence: 99%)
“…The issue has become even more severe in the era of big data, where the dimensionality of the processes is very large. The main approach to approximating the conditional distribution, sequential Monte Carlo or particle filtering (see [2] for a survey and pointers to the literature), is known to exhibit a curse of dimensionality as the number of dimensions of the observations grows [3]–[6]. The problem can be traced back to the use of importance weights and their increasing degeneracy as time progresses (see [6] and the references therein).…”
Section: Introduction (citation type: mentioning; confidence: 99%)
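The progressive weight degeneracy over time that this excerpt points to, as opposed to the purely static collapse with dimension, can be illustrated by a sequential importance sampling loop that never resamples; the random-walk model, dimensions, and names below are assumed for illustration only.

```python
import numpy as np

def sis_ess_over_time(D=5, N=1000, T=50, obs_noise=1.0, seed=0):
    """Sequential importance sampling (no resampling) on a toy random-walk model.

    State: x_t = x_{t-1} + N(0, I_D); observation: y_t = x_t + N(0, obs_noise^2 I_D).
    Returns the effective sample size after each of T time steps.
    """
    rng = np.random.default_rng(seed)
    x_true = np.zeros(D)
    particles = np.zeros((N, D))
    log_w = np.zeros(N)
    ess_trace = []
    for _ in range(T):
        x_true = x_true + rng.standard_normal(D)
        y = x_true + obs_noise * rng.standard_normal(D)
        particles = particles + rng.standard_normal((N, D))   # propagate with the prior
        log_w += -0.5 * np.sum((y - particles) ** 2, axis=1) / obs_noise ** 2
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        ess_trace.append(1.0 / np.sum(w ** 2))                # ESS after this step
    return ess_trace

if __name__ == "__main__":
    for t, e in enumerate(sis_ess_over_time(), start=1):
        if t % 10 == 0:
            print(f"t = {t:3d}  ESS ~ {e:8.1f}")
```

Without resampling, the ESS decays toward one as time progresses, and the decay is faster in higher-dimensional toy models, consistent with the dynamic aspect of the curse of dimensionality that the cited paper studies.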