2019
DOI: 10.1109/tsp.2019.2905816

Invertible Particle-Flow-Based Sequential MCMC With Extension to Gaussian Mixture Noise Models

Abstract: Sequential state estimation in non-linear and non-Gaussian state spaces has a wide range of applications in statistics and signal processing. One of the most effective non-linear filtering approaches, particle filtering, suffers from weight degeneracy in high-dimensional filtering scenarios. Several avenues have been pursued to address high-dimensionality. Among these, particle flow particle filters construct effective proposal distributions by using invertible flow to migrate particles continuously from the p…
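The weight degeneracy the abstract refers to is easy to reproduce numerically. Below is a minimal illustrative sketch, not the paper's method: a single bootstrap-particle-filter weight update under a standard-normal prior and a Gaussian likelihood, showing the effective sample size (ESS) collapsing as the state dimension grows. The toy model, observation, and all names are assumptions chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def effective_sample_size(log_w):
    """ESS = 1 / sum(w_normalized^2), computed stably in log space."""
    log_w = log_w - np.max(log_w)
    w = np.exp(log_w)
    w /= w.sum()
    return 1.0 / np.sum(w ** 2)

def bootstrap_weights(dim, n_particles=1000):
    """One bootstrap-PF weight update: particles drawn from a standard-normal
    prior, weighted by a unit-variance Gaussian likelihood at the observation."""
    particles = rng.standard_normal((n_particles, dim))
    obs = np.ones(dim)  # arbitrary toy observation
    # log N(obs | particle, I), up to an additive constant
    log_w = -0.5 * np.sum((particles - obs) ** 2, axis=1)
    return effective_sample_size(log_w)

for dim in (1, 10, 100):
    print(f"dim={dim:4d}  ESS ~ {bootstrap_weights(dim):.1f}")
```

With 1000 particles the ESS is large in one dimension but collapses toward a handful of particles by dimension 100, which is the degeneracy that motivates particle-flow proposals.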

Cited by 18 publications (12 citation statements). References 45 publications.
“…Reference [40] proposes to employ invertible particle flow methods to construct the first proposal distribution in the composite MH kernel, and draws two conclusions: 1) the mapping between η_0^i and η_1^i is invertible; 2) C is invertible, i.e., its determinant is non-zero. Thus, the first proposal distribution q_{t,1}(·) can be constructed by the invertible particle flow mapping and follows equation (13):…”
Section: Invertible Particle Flow
Citation type: mentioning (confidence: 99%)
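Equation (13) from the citing paper is not reproduced in the excerpt above. As a hedged sketch of the mechanism the statement describes: when the particle flow reduces to an affine mapping η_1 = C η_0 + c (as in exact-flow variants for linear-Gaussian models), invertibility of C with non-zero determinant is exactly what lets the proposal density be evaluated via the change-of-variables formula. The function and variable names below are illustrative assumptions; only η_0, η_1, and C come from the cited text.

```python
import numpy as np

def affine_flow_proposal_logpdf(eta1, C, c, prior_logpdf):
    """Evaluate log q(eta1) for the affine flow eta1 = C @ eta0 + c via
    change of variables:
        log q(eta1) = log p0(C^{-1}(eta1 - c)) - log|det C|.
    Requires C invertible (non-zero determinant)."""
    sign, logdet = np.linalg.slogdet(C)
    if sign == 0:
        raise ValueError("C is singular; the flow mapping is not invertible")
    eta0 = np.linalg.solve(C, eta1 - c)  # invert the affine mapping
    return prior_logpdf(eta0) - logdet

def std_normal_logpdf(x):
    """Log-density of a standard multivariate normal base distribution."""
    return -0.5 * (x @ x) - 0.5 * len(x) * np.log(2 * np.pi)

# toy 2-D check: push a point through the flow, then evaluate its density
C = np.array([[2.0, 0.3], [0.0, 1.5]])   # det = 3, so invertible
c = np.array([0.1, -0.2])
eta0 = np.array([0.5, 1.0])
eta1 = C @ eta0 + c
lp = affine_flow_proposal_logpdf(eta1, C, c, std_normal_logpdf)
```

The singularity check mirrors the second conclusion quoted above: a zero determinant would make the inverse mapping, and hence the proposal density, undefined.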
“…Step 1: Generate a candidate sample set with an MCMC algorithm [31]. According to the function U, the prediction uncertainty increases as the value of U decreases.…”
Section: B. A New Strategy of Determining the Optimal Sample
Citation type: mentioning (confidence: 99%)
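The excerpt does not spell out which MCMC algorithm [31] uses for candidate generation. As a minimal sketch of one common choice, here is a random-walk Metropolis-Hastings sampler targeting an assumed log-density; the target, step size, and sample count are placeholders, not values from the cited work.

```python
import numpy as np

def random_walk_mh(log_target, x0, n_samples, step=0.5, seed=0):
    """Random-walk Metropolis-Hastings: propose x' = x + step * N(0, I),
    accept with probability min(1, target(x') / target(x))."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_samples, x.size))
    lp = log_target(x)
    for i in range(n_samples):
        prop = x + step * rng.standard_normal(x.size)
        lp_prop = log_target(prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # MH acceptance test
            x, lp = prop, lp_prop
        samples[i] = x  # rejected proposals repeat the current state
    return samples

# example: candidate samples from a 1-D standard normal target
samples = random_walk_mh(lambda x: -0.5 * float(x @ x), np.zeros(1), 5000)
```

The returned chain can then be screened by whatever selection criterion (such as the function U above) the downstream method applies.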
“…Novel training algorithms that aim for stability [21,23] and increased efficiency [15,20] are actively being developed. Markov chain Monte Carlo (MCMC) methods have also been employed for fitting GMMs to account for input variation [12,26,36]. Tree-based methods have likewise been proposed for efficient inference in k-means [18,33].…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
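As context for the GMM-fitting approaches surveyed in that introduction, here is a minimal sketch of the standard EM baseline for a two-component 1-D Gaussian mixture. This is a deliberately simplified illustration; the cited works use MCMC and other schemes, not this exact routine, and the quantile-based initialization is an assumption for robustness.

```python
import numpy as np

def em_gmm_1d(x, n_iter=50):
    """Minimal EM for a two-component 1-D Gaussian mixture.
    Returns (mixture weights, means, standard deviations)."""
    mu = np.quantile(x, [0.25, 0.75])   # spread-out initial means
    sigma = np.full(2, x.std())
    pi = np.full(2, 0.5)
    for _ in range(n_iter):
        # E-step: per-point responsibilities (shared constants cancel)
        dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / sigma
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: weighted parameter updates
        nk = resp.sum(axis=0)
        pi = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return pi, mu, sigma

# synthetic two-cluster data: N(-3, 1) and N(3, 1)
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-3, 1, 500), rng.normal(3, 1, 500)])
pi, mu, sigma = em_gmm_1d(x)
```

MCMC-based fitting, as in the cited works, replaces these point updates with posterior sampling over the same parameters, which is what lets it quantify input variation.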