2012 12th International Conference on Intelligent Systems Design and Applications (ISDA)
DOI: 10.1109/isda.2012.6416668
Image-based gesture recognition for user interaction with mobile companion-based assistance systems

Abstract: In this paper, we present image-based methods for robust real-time recognition of static and dynamic hand gestures. These methods are used for intuitive interaction with an assistance system in which skin tones are used to segment the hands. This segmentation forms the basis of feature extraction for both static and dynamic gestures. For static gestures, the activation of a particular region triggers the associated action, whereas an HMM classifier is used for the dynamic gestures dependent upon t…

Cited by 1 publication (1 citation statement); references 15 publications.
“…Nowadays, hand gestures are usually dynamic, which means they do not consist of static pointing only (Köpsel and Huckauf 2013; Villamor, Willis, and Wroblewski 2010; Wobbrock, Morris, and Wilson 2009). Gestures have been used for controlling smartphones, home electronics (Lenman, Bretzner, and Thuresson 2002; Shan 2010), factory automation (Heimonen et al. 2013), human-robot interaction (Alvarez-Santos et al. 2014), multitouch surfaces (Wobbrock, Morris, and Wilson 2009) and many other electronic devices (Baudel and Beaudouin-Lafon 1993; Bhuiyan and Picking 2009; Garzotto and Valoriani 2012), even head-up displays (Saxen et al. 2012). Whereas earlier systems needed utilities such as gloves (Baudel and Beaudouin-Lafon 1993) or other tracking targets (Tsukadaa and Yasumura 2002; Zimmerman et al. 1987), these days no equipment attached to the body is necessary (Shan 2010).…”
Section: Previous Work
Confidence: 99%