2009 IEEE 12th International Conference on Computer Vision Workshops, ICCV Workshops 2009
DOI: 10.1109/iccvw.2009.5457447
On-line Random Forests

Abstract: Random Forests (RFs) are frequently used in many computer vision and machine learning applications. Their popularity is mainly driven by their high computational efficiency during both training and evaluation while achieving state-of-the-art results. However, in most applications RFs are used off-line. This limits their usability for many practical problems, for instance, when training data arrives sequentially or the underlying distribution is continuously changing. In this paper, we propose a novel on-line ra…

Cited by 442 publications (323 citation statements)
References 19 publications
“…In particular, the online RF of Saffari et al. [16] trains decision trees of fixed depth and has a fixed structure which does not change with the observation of new cases once the tree depth limit is reached. We address this issue in our online RF algorithm with the use of primed off-line learning to speed up convergence to a reasonable accuracy.…”
Section: Online RF (mentioning, confidence: 99%)
“…Similar to Saffari et al. [16] and different to the Hoeffding tree, a split is simply generated after observing a certain specified number of instances (40 is the default value, normally leading to the best performance). For each feature, a Gaussian distribution is assumed and tracked online, and a split threshold which maximizes the Gini index value is selected.…”
Section: Online RF (mentioning, confidence: 99%)
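The split mechanism this excerpt describes can be sketched roughly as follows. This is an illustrative reconstruction, not code from the paper: the class and method names are invented, and approximating the left/right class masses with the Gaussian CDF and using midpoints between class means as candidate thresholds are assumptions layered on the excerpt's description (per-feature Gaussians tracked online, a split generated after 40 observed instances, threshold chosen by Gini).

```python
import math
from collections import defaultdict

class OnlineFeatureSplit:
    """Online split selection for a single feature (illustrative sketch):
    per-class feature statistics are updated incrementally (Welford), and
    once `min_samples` instances have been observed, a threshold is chosen
    by Gini gain under the per-class Gaussian assumption."""

    def __init__(self, min_samples=40):
        self.min_samples = min_samples
        self.n = 0
        self.count = defaultdict(int)   # per-class sample counts
        self.mean = defaultdict(float)  # per-class running means
        self.m2 = defaultdict(float)    # per-class sums of squared deviations

    def update(self, x, label):
        """Incorporate one (feature value, class label) observation."""
        self.n += 1
        self.count[label] += 1
        d = x - self.mean[label]
        self.mean[label] += d / self.count[label]
        self.m2[label] += d * (x - self.mean[label])

    @staticmethod
    def _cdf(x, mu, sigma):
        # Normal CDF; guards against a degenerate (zero) variance.
        sigma = max(sigma, 1e-9)
        return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

    @staticmethod
    def _gini(masses):
        total = sum(masses)
        if total <= 0:
            return 0.0
        return 1.0 - sum((m / total) ** 2 for m in masses)

    def best_split(self):
        """Return (threshold, gini_gain), or None until enough samples arrive."""
        if self.n < self.min_samples or len(self.count) < 2:
            return None
        labels = sorted(self.count)
        sigmas = {c: math.sqrt(self.m2[c] / max(self.count[c] - 1, 1))
                  for c in labels}
        # Candidate thresholds: midpoints between consecutive class means.
        means = sorted(self.mean[c] for c in labels)
        candidates = [(a + b) / 2.0 for a, b in zip(means, means[1:])]
        parent = self._gini([self.count[c] for c in labels])
        best = None
        for t in candidates:
            # Expected class masses on each side, via the Gaussian CDF.
            left = [self.count[c] * self._cdf(t, self.mean[c], sigmas[c])
                    for c in labels]
            right = [self.count[c] - l for c, l in zip(labels, left)]
            nl, nr = sum(left), sum(right)
            gain = parent - (nl * self._gini(left) + nr * self._gini(right)) / self.n
            if best is None or gain > best[1]:
                best = (t, gain)
        return best
```

With two well-separated classes, no split is proposed before the 40-sample threshold, after which the midpoint between the class means is selected with a large Gini gain.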
“…The author compared TLD with some other algorithms, like IVT (Iterative Visual Tracking [8]), ODV (Online Discriminative Features [9]), ET (Ensemble Tracking [10]), MIL (Multiple Instance Learning [11]), Co-trained Generative Discriminative Tracking [12], OB (Online Boosting [13]), ORF (Online Random Forests [14]), PROST (Parallel Robust Online Simple Tracking [15]) etc. The performance is evaluated using P, R and F. P represents the number of true positives divided by the number of all responses and R represents the number of true positives divided by the number of object occurrences that should have been detected.…”
Section: Methods (mentioning, confidence: 99%)
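The evaluation measures in this excerpt can be written out directly. P and R follow the definitions quoted above; taking F as the harmonic mean of P and R is an assumption here, since the excerpt does not define F, and the function name is illustrative:

```python
def precision_recall_f(true_positives, responses, occurrences):
    """Detection metrics as defined in the excerpt above:
    P = true positives / all responses,
    R = true positives / object occurrences that should have been detected,
    F = harmonic mean of P and R (assumed; the excerpt leaves F undefined)."""
    p = true_positives / responses if responses else 0.0
    r = true_positives / occurrences if occurrences else 0.0
    f = 2 * p * r / (p + r) if (p + r) else 0.0
    return p, r, f
```

For example, a tracker that fires 100 times, 80 of them correct, on an object visible in 160 frames scores P = 0.8 and R = 0.5.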
“…Usually a bounding box is propagated to later frames. Boosting-based classifiers [17] and random forest classifiers [16] are popular choices for learning and updating the template model. Godec et al. [10] also give a rough segmentation of the object being tracked.…”
Section: Related Work (mentioning, confidence: 99%)