2007 IEEE International Conference on Multimedia and Expo (ICME)
DOI: 10.1109/icme.2007.4285028

Automatically Tuning Background Subtraction Parameters using Particle Swarm Optimization

Year Published: 2010–2022
Cited by 64 publications (39 citation statements)
References 5 publications
“…Other measures for fitness quantification, in the context of background subtraction techniques, have been proposed in the literature (Rosin & Ioannidis, 2003; White & Shah, 2007; Ilyas et al., 2009). The following are some examples:…”
Section: Quantitative Performance Analysis
confidence: 99%
“…Optimal tuning of this algorithm's parameter set is considered a non-trivial issue. In White & Shah (2007), an automatic tuning strategy based on particle swarm optimization is proposed. Another set of algorithms lies in the category of non-parametric algorithms.…”
confidence: 99%
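The tuning strategy this excerpt refers to can be pictured with a short particle swarm optimization loop. The sketch below is not White & Shah's implementation: the fitness function is a hypothetical stand-in surface (their paper scores candidate parameters by comparing foreground masks against ground truth), and the two parameters, a history length and a variance threshold, are merely illustrative background-subtraction coefficients.

```python
import numpy as np

# Hypothetical fitness: in White & Shah's setting this would be a mask-quality
# score (e.g. F-measure) against ground-truth frames. A simple stand-in surface
# is used here so the sketch runs on its own.
def fitness(params):
    target = np.array([500.0, 16.0])  # assumed "good" (history, varThreshold)
    return -np.sum(((params - target) / target) ** 2)

def pso(n_particles=20, n_iters=50, bounds=((50, 1000), (4, 64)), seed=0):
    rng = np.random.default_rng(seed)
    lo = np.array([b[0] for b in bounds], dtype=float)
    hi = np.array([b[1] for b in bounds], dtype=float)
    x = rng.uniform(lo, hi, size=(n_particles, len(bounds)))  # positions
    v = np.zeros_like(x)                                      # velocities
    pbest = x.copy()                                          # per-particle bests
    pbest_f = np.array([fitness(p) for p in x])
    gbest = pbest[np.argmax(pbest_f)].copy()                  # swarm best
    w, c1, c2 = 0.72, 1.49, 1.49                              # common PSO constants
    for _ in range(n_iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([fitness(p) for p in x])
        improved = f > pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[np.argmax(pbest_f)].copy()
    return gbest

print(pso())  # best (history, varThreshold) found on the stand-in surface
```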
“…Another measure for fitness quantification in the context of background subtraction is the F-measure (Fm) [20]–[22], which combines precision (PR) and recall (RC) into a single score that is more representative than either measure alone.…”
Section: Quantitative Performance
confidence: 99%
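For concreteness, the combination the excerpt describes is the harmonic mean of precision and recall. A minimal computation over binary foreground masks (the function name is illustrative):

```python
import numpy as np

def f_measure(pred, gt):
    # pred, gt: boolean foreground masks of the same shape
    tp = np.logical_and(pred, gt).sum()    # true positives
    fp = np.logical_and(pred, ~gt).sum()   # false positives
    fn = np.logical_and(~pred, gt).sum()   # false negatives
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    # harmonic mean of precision and recall
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0
```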
“…Indeed, it does not need labeled data. Recently, White et al. [1] showed that the Gaussian Mixture Model (GMM) [19] gives better results when some of its coefficients are determined in a supervised way. Following this idea, we propose to use supervised subspace learning for background modeling.…”
Section: Introduction
confidence: 99%
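The GMM background model the excerpt mentions is available in OpenCV as the MOG2 subtractor; a minimal usage sketch follows. The video path is hypothetical, and the parameter values shown are OpenCV's defaults, i.e. exactly the kind of coefficients that a tuner such as White & Shah's would set automatically instead of fixing by hand.

```python
import cv2

subtractor = cv2.createBackgroundSubtractorMOG2(
    history=500,         # frames used to model the background
    varThreshold=16,     # squared Mahalanobis distance threshold for foreground
    detectShadows=True,  # mark shadows as gray (value 127) in the mask
)

cap = cv2.VideoCapture("video.avi")  # hypothetical input path
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)   # 0 = background, 255 = foreground
    # ...evaluate `mask` against ground truth here to drive a parameter tuner...
cap.release()
```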
“…Their main advantage is that they do not need labeled data during the training and running phases. Recently, White et al. [1] showed that a supervised approach can significantly improve robustness in background modeling. Following this idea, we propose to model the background via a supervised subspace learning method called Incremental Maximum Margin Criterion (IMMC).…”
confidence: 99%
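IMMC itself is an incremental, supervised projection; as a rough picture of subspace background modeling in general, here is a minimal unsupervised PCA (eigenbackground) sketch, not the IMMC algorithm: training frames are projected onto a low-dimensional background subspace, and pixels with a large reconstruction residual are flagged as foreground. Function names and the threshold are illustrative.

```python
import numpy as np

def eigenbackground(frames, k=5):
    # frames: (n, h*w) matrix of flattened grayscale training frames
    mean = frames.mean(axis=0)
    # top-k principal components of the centered frames span the
    # background subspace
    _, _, vt = np.linalg.svd(frames - mean, full_matrices=False)
    return mean, vt[:k]                    # mean (h*w,), basis (k, h*w)

def foreground_mask(frame, mean, basis, thresh=30.0):
    # project the frame onto the subspace and back; a large residual
    # means the pixel is poorly explained by the background model
    centered = frame - mean
    recon = centered @ basis.T @ basis
    return np.abs(centered - recon) > thresh   # threshold is illustrative
```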