2013
DOI: 10.1007/978-3-642-40480-1_13
Designing Gesture-Based Control for Factory Automation

Abstract: We report the development and evaluation of a gesture-based interaction prototype for controlling the loading station of a factory automation system. In this context, gesture-based interaction has the potential to free users from tedious physical controls, but it must also account for safety considerations and users' perceptions. We evaluated the gesture interaction concept in the field to understand its applicability to industrial settings. Our findings suggest that gesture-based interaction is a…

Cited by 10 publications (4 citation statements)
References 11 publications (9 reference statements)
“…Participants were asked to perform gestures in front of a loading station simulator, which consisted of a large computer screen (40 inches) and a gesture detector (Microsoft Kinect). The simulator was created in the Tampere Unit for Computer-Human Interaction (TAUCHI), and its specifics are reported by Heimonen et al. (2013). Gesture-based interaction was a novel concept in the factory environment, and the participants had no experience of using gestures to control factory automation.…”
Section: Methods (mentioning)
confidence: 99%
“…An example of a modern HMI modality is automatic speech recognition [119]. Another is visual gesture control using expressive and meaningful body motions [123,128]. Further, physical human-robot interaction is possible: from the perspective of a multimodal interface, physical contact itself may be used to create an interaction interface for CPS.…”
Section: Data Analysis: This Cluster Discusses How Information… (mentioning)
confidence: 99%
“…Nowadays, hand gestures are usually dynamic, which means they are not limited to static pointing (Köpsel and Huckauf 2013; Villamor, Willis, and Wroblewski 2010; Wobbrock, Morris, and Wilson 2009). Gestures have been used for controlling smartphones, home electronics (Lenman, Bretzner, and Thuresson 2002; Shan 2010), factory automation (Heimonen et al. 2013), human-robot interaction (Alvarez-Santos et al. 2014), multitouch surfaces (Wobbrock, Morris, and Wilson 2009), many other electronic devices (Baudel and Beaudouin-Lafon 1993; Bhuiyan and Picking 2009; Garzotto and Valoriani 2012), and even head-up displays (Saxen et al. 2012). Whereas earlier systems needed utilities such as gloves (Baudel and Beaudouin-Lafon 1993) or other tracking targets (Tsukada and Yasumura 2002; Zimmerman et al. 1987), these days no equipment attached to the body is necessary (Shan 2010).…”
Section: Previous Work (mentioning)
confidence: 98%