2004 International Conference on Image Processing, 2004. ICIP '04.
DOI: 10.1109/icip.2004.1419474

Robust shape based two hand tracker

Abstract: This paper presents a robust shape-based on-line tracker for simultaneously tracking the motion of both hands, which is robust to background clutter, other moving objects, occlusions of one hand by the other, and a wide range of illumination variations. The tracker is based on an on-line predictive EigenTracking framework. This framework allows efficient tracking of articulated objects, which change in appearance across views. We show results of successful tracking across all possible cases…
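The appearance model at the heart of EigenTracking-style methods is a learned eigenspace of hand images: a candidate image region is scored by how well it can be reconstructed from that subspace. Below is a minimal sketch of that idea in Python/NumPy; the function names and the choice of k are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def learn_eigenbasis(patches, k=10):
    """patches: (N, H*W) array of vectorized training hand patches."""
    mean = patches.mean(axis=0)
    centered = patches - mean
    # SVD of the centered data gives the eigen-images of the appearance subspace.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:k]                  # mean: (H*W,), basis: (k, H*W)

def reconstruction_error(patch, mean, basis):
    """Distance of a candidate patch from the learned appearance subspace."""
    x = patch.ravel() - mean
    coeffs = basis @ x                   # project onto the eigenbasis
    recon = basis.T @ coeffs             # reconstruct from the subspace
    return np.linalg.norm(x - recon)

# A tracker in this style evaluates reconstruction_error over candidate warps of
# the current frame and keeps the warp with the lowest error as the hand estimate.
```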



Cited by 8 publications (7 citation statements). References 12 publications.
“…(Patwardhan, Dutta Roy, Chaudhuri, and Chaudhury, 2004), which models all possible cases of hand-hand interactions.…”
Section: Discussion (mentioning)
confidence: 99%
“…We also address the problem of choosing a gesture set that models the upper bound on gesture recognition. Further extensions of this work include applying this framework to two-handed gestures, possibly using our robust two-hand tracker (Barhate et al., 2004). (This models all possible cases of hand-hand interactions.)…”
Section: Conclusion and Scope for Future Work (mentioning)
confidence: 99%
“…Barhate et al. [3] apply the ideas of the EigenTracking framework instead. The main disadvantage of this method is its limited robustness to background clutter, which is essential in multiuser scenarios.…”
Section: Hand Tracking (mentioning)
confidence: 99%
“…Using another approach, Barhate et al. [3] integrated a second motion cue into the detection framework, where color and motion are used to initialize an EigenTracker.…”
Section: Hand Detection (mentioning)
confidence: 99%
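As a rough illustration of the colour-plus-motion initialization described in this snippet, the sketch below seeds a tracking window only where skin-coloured pixels are also moving. The OpenCV calls are standard, but the thresholds and the helper name are assumptions for illustration, not taken from the cited papers.

```python
import cv2
import numpy as np

def init_hand_window(prev_frame, frame):
    """Return a bounding box (x0, y0, x1, y1) seeded by skin colour AND motion."""
    # Skin-colour mask in HSV; the threshold values below are illustrative only.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    skin = cv2.inRange(hsv, (0, 40, 60), (25, 180, 255))

    # Motion mask from simple frame differencing.
    gray_now = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray_prev = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    _, motion = cv2.threshold(cv2.absdiff(gray_now, gray_prev), 20, 255,
                              cv2.THRESH_BINARY)

    # Keep only pixels that are both skin-coloured and moving.
    both = cv2.bitwise_and(skin, motion)
    ys, xs = np.nonzero(both)
    if xs.size == 0:
        return None                      # nothing to initialize on this frame
    # Bounding box of the combined mask seeds the appearance-based tracker.
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())
```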