2012
DOI: 10.1016/j.proeng.2012.01.441
A Vision Based Method to Distinguish and Recognize Static and Dynamic Gesture

Cited by 6 publications (3 citation statements)
References 1 publication
“…From this standpoint, Zhao et al. [8] introduced a vision-based system by which both dynamic and static gestures can be interpreted. Their work is based on Haar-like features [19] and an AdaBoost classifier [20].…”
Section: Related Work (mentioning)
confidence: 99%
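The combination named in this excerpt, Haar-like features with an AdaBoost classifier, is the same one OpenCV's cascade classifiers implement. Below is a minimal detection sketch under that assumption; the file name hand_cascade.xml and the detectMultiScale parameters are illustrative choices, not values from the cited paper.

```python
# Minimal sketch: Haar-cascade hand detection with OpenCV.
# OpenCV cascades are AdaBoost ensembles over Haar-like features,
# matching the feature/classifier pair described in the citation.
import cv2

# "hand_cascade.xml" is a hypothetical pre-trained cascade file.
cascade = cv2.CascadeClassifier("hand_cascade.xml")

def detect_hands(frame):
    """Return bounding boxes (x, y, w, h) of detected hands in a BGR frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.equalizeHist(gray)  # normalize illumination before detection
    return cascade.detectMultiScale(gray, scaleFactor=1.1,
                                    minNeighbors=5, minSize=(40, 40))
```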
“…The former is the recognition of a particular type of hand pose in an image or a video sequence while the hand remains immobile for a period of time. The latter focuses on the hand's position across an image sequence and determines the meaning of the gesture by analyzing the different hand poses in the frame sequence [8]. A dynamic gesture contains the spatiotemporal information of a static gesture sequence.…”
Section: Introduction (mentioning)
confidence: 99%
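The static/dynamic distinction described in this excerpt (an immobile pose versus motion across frames) can be illustrated with a simple motion test over tracked hand centroids. The sketch below is an assumption-laden illustration, not the paper's method: the displacement threshold, the window size, and the use of mean centroid displacement are all hypothetical choices.

```python
# Sketch: label a gesture as static or dynamic from hand-centroid motion.
import numpy as np

MOTION_THRESHOLD = 8.0  # assumed pixel-displacement threshold
WINDOW = 15             # assumed number of recent frames to inspect

def classify_gesture(centroids):
    """Label a sequence of hand centroids as 'static' or 'dynamic'.

    centroids: list of (x, y) hand positions, one per frame.
    The gesture is treated as static when the mean frame-to-frame
    displacement over the trailing window stays below the threshold.
    """
    pts = np.asarray(centroids[-WINDOW:], dtype=float)
    if len(pts) < 2:
        return "static"  # too little history to observe motion
    motion = np.linalg.norm(np.diff(pts, axis=0), axis=1).mean()
    return "static" if motion < MOTION_THRESHOLD else "dynamic"
```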
“…This trend has become even more prominent with developments in everyday technology and intelligent computing (Simões et al. 2015). Basically, hand gesture recognition is divided into two categories: data-glove-based approaches (Oz and Leu 2011; De Marsico et al. 2014) and computer-vision-based approaches (Zhao et al. 2012). The first type of approach relies on extra hardware sensors attached to the hand region to identify hand shape and trajectories, thus providing hand and finger locations.…”
Section: Introduction (mentioning)
confidence: 99%