2015
DOI: 10.1109/tpami.2015.2392754

Accelerating Particle Filter Using Randomized Multiscale and Fast Multipole Type Methods

Abstract: The particle filter is a powerful method that tracks the state of a target based on non-linear observations. We present a multiscale-based method that accelerates the computation of particle filters. Unlike the conventional approach, which calculates weights over all particles in each cycle of the algorithm, we sample a small subset of the source particles using matrix decomposition methods. Then, we apply a function extension algorithm that uses the particle subset to recover the density function for all the …
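As a rough illustration of the idea sketched in the abstract, the Python snippet below evaluates the observation likelihood on only a small subset of particles and extends it to the full particle set with a Gaussian-kernel (Nyström-style) interpolation. This is a hedged stand-in for the paper's matrix-decomposition-based subset selection and function extension, not the authors' actual algorithm; the function name, the random subset selection, and all parameters are assumptions made for illustration.

```python
import numpy as np

def accelerated_weights(particles, observation, likelihood, n_sub=64,
                        kernel_width=1.0, reg=1e-8, seed=None):
    """Sketch: exact likelihood on a small particle subset, kernel
    extension to all particles (stand-in for the paper's scheme)."""
    rng = np.random.default_rng(seed)
    n = len(particles)                      # particles: (n, d) array
    idx = rng.choice(n, size=min(n_sub, n), replace=False)
    sub = particles[idx]

    # Exact likelihood evaluations only on the sampled subset.
    w_sub = np.array([likelihood(observation, p) for p in sub])

    # Gaussian kernel between all particles and the subset.
    d2 = ((particles[:, None, :] - sub[None, :, :]) ** 2).sum(-1)
    K_all = np.exp(-d2 / (2.0 * kernel_width ** 2))
    K_sub = K_all[idx]                      # subset-to-subset kernel block

    # Interpolation coefficients from the subset, extended to all particles.
    coeff = np.linalg.solve(K_sub + reg * np.eye(len(idx)), w_sub)
    w = np.clip(K_all @ coeff, 0.0, None)
    return w / w.sum()
```

In this sketch the exact likelihood is evaluated roughly n_sub times instead of n times per cycle, which is where the speed-up comes from; the accuracy of the extension depends on the kernel width and the subset size.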

Cited by 13 publications (6 citation statements)
References 34 publications
“…Efficient matrix decomposition serves as a basis for many studies and algorithm designs for data analysis and applications. Fast randomized matrix decomposition algorithms are used for tracking objects in videos [39], multiscale extensions for data [4] and detecting anomalies in network traffic for finding cyber attacks [13], to name a few. There are randomized versions of many different matrix factorization algorithms [24], compressed sensing [16] and least squares [3].…”
Section: Related Work
confidence: 99%
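Several of the applications cited in this statement build on randomized low-rank factorization. For readers unfamiliar with the technique, here is a minimal sketch of a basic randomized SVD (Gaussian range sketch plus a few power iterations); it is a generic textbook variant, not the specific algorithms referenced above, and the function name and defaults are illustrative.

```python
import numpy as np

def randomized_svd(A, rank, n_oversample=10, n_iter=2, seed=None):
    """Basic randomized SVD: sketch the range of A, project, then
    factor the small projected matrix exactly."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    k = min(rank + n_oversample, min(m, n))

    # Range sketch: Y = A @ Omega captures the dominant column space.
    Omega = rng.standard_normal((n, k))
    Y = A @ Omega
    for _ in range(n_iter):                 # power iterations sharpen the sketch
        Y = A @ (A.T @ Y)
    Q, _ = np.linalg.qr(Y)

    # Factor the small k x n matrix B = Q^T A and lift back.
    B = Q.T @ A
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :rank], s[:rank], Vt[:rank]
```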
“…As the size of the data grows exponentially, feasible methods for the analysis of large datasets have gained increasing interest. Such an analysis can involve a factorization step of the input data, given as a large sample-by-feature matrix or as a sample affinity matrix [45,12,39]. High memory consumption and the computational complexity of the factorization step are two main reasons for the difficulties in analyzing huge data structures.…”
Section: Introduction
confidence: 99%
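To make the memory and complexity concern in the statement above concrete, the small helper below (hypothetical, not taken from the cited works) compares the storage of a dense n-by-n affinity matrix in float64 with rough flop counts for a full factorization versus a rank-k randomized one.

```python
def affinity_factorization_cost(n_samples, rank):
    """Back-of-the-envelope costs: 8*n^2 bytes to store a dense float64
    affinity matrix, ~n^3 flops for a full eigendecomposition, and
    ~n^2*k flops for a rank-k randomized factorization."""
    dense_bytes = 8 * n_samples ** 2
    full_flops = n_samples ** 3
    randomized_flops = n_samples ** 2 * rank
    return dense_bytes, full_flops, randomized_flops

# Example: 10^5 samples already need ~80 GB just to hold the affinity matrix.
print(affinity_factorization_cost(100_000, 20))
```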
“…The core idea of the particle filter (Wang et al., 2014; Shabat et al., 2015) is to use a cluster of weighted random samples $\{\mathbf{X}_k^i, w_k^i\}_{i=1}^{N}$ (also known as particles) to approximate the target posterior probability distribution $p(\mathbf{X}_k \mid \mathbf{Z}_{1:k})$, and then the target state is extracted as: where $\mathbf{X}_k$ represents the target state at time $k$, $\mathbf{X}_k^i$ represents the target state of the $i$-th particle, $\mathbf{Z}_{1:k}$ represents the observations from the initial time to the current time $k$, $N$ is the number of particles and $w_k^i$ is the weight of the particle at time $k$, which can be computed as: where $p(\mathbf{X}_k^i \mid \mathbf{X}_{k-1}^i)$ is the target transition model. If the target state transition follows a Markov process, the above equation can be rewritten as: where $\mathbf{Z}_k^i$ is the observation of the $i$-th particle at time $k$ and $p(\mathbf{Z}_k^i \mid \mathbf{X}…”
Section: Proposed Methods
confidence: 99%
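For readers who want the mechanics behind the quoted description, the sketch below implements one cycle of a generic bootstrap particle filter: resample, propagate through the transition model $p(\mathbf{X}_k \mid \mathbf{X}_{k-1})$, reweight by the observation likelihood $p(\mathbf{Z}_k \mid \mathbf{X}_k)$, and extract the state as the weighted mean of the particles. It is a standard textbook formulation with assumed helper callables `transition` and `likelihood`, not the accelerated scheme of the paper under discussion.

```python
import numpy as np

def bootstrap_pf_step(particles, weights, z_k, transition, likelihood, seed=None):
    """One bootstrap particle-filter cycle on an (n, d) particle array."""
    rng = np.random.default_rng(seed)
    n = len(particles)

    # Multinomial resampling according to the previous weights.
    idx = rng.choice(n, size=n, p=weights)
    particles = np.array([transition(particles[i], rng) for i in idx])

    # Reweight by the likelihood of the new observation z_k.
    w = np.array([likelihood(z_k, p) for p in particles])
    w = w / w.sum()

    # State estimate: weighted mean, X_hat_k = sum_i w_k^i * X_k^i.
    x_hat = (w[:, None] * particles).sum(axis=0)
    return particles, w, x_hat
```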
“…Other ways of factoring the computation exist, e.g. [6,10], and a hierarchy of feature encodings can be used [13]. However, to the best of our knowledge, no prior method allows tracking a process on multiple scales at once while accepting evidence at variable resolutions.…”
Section: Introduction
confidence: 99%