2019 24th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA)
DOI: 10.1109/etfa.2019.8869270

Gaze-based Human Factors Measurements for the Evaluation of Intuitive Human-Robot Collaboration in Real-time

Cited by 11 publications (10 citation statements). References 13 publications.
“…Understand and exploit the peculiarities of Human-AI interactions (see Fuchs and Belardinelli, Czeszumski et al., and Paletta et al., 2019).…”
Section: Discussion (mentioning)
confidence: 99%
“…Vogel et al. [95], Paletta et al. [103], Kyjanek et al. [104] and Chan et al. [88] made use of Kuka collaborative robots. Vogel et al. [95] use a Kuka iiwa LBR 14 to assist the operator in screwing mounting plates to a ground plate in a cooperative assembly task, while Paletta et al. [103] use a Kuka iiwa LBR 7 on a toy problem in which the cobot assists the operator in a pick-and-place task whose goal is to assemble a tangram puzzle. Kyjanek et al. [104] employ a Kuka cobot to support the human operator on a non-standard wooden construction system, where the cobot positions and holds each part in the correct location while the operator attaches it.…”
Section: Systematic Literature Review Analysis (mentioning)
confidence: 99%
“…Liu and Wang [94] use a depth sensor for monitoring the distance between the robot and the operator, but no AR cues are displayed to the user. Paletta et al. [103] track the user's head and hands with a motion-capture system and compute the distance to the nearest robot part; based on that distance and on the participant's stress level, the robot arm's speed is adjusted downwards, to a potential full stop. In addition, Dimitropoulos et al. [114] use the vibration feature of a smartwatch as a cue for safety alerts.…”
Section: Systematic Literature Review Analysis (mentioning)
confidence: 99%
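The mechanism described in the statement above (human-to-robot distance monitoring combined with a stress-dependent speed reduction) can be sketched as a simple scaling policy. The snippet below is a minimal illustration only: the linear ramp, the thresholds, and the function name are assumptions for clarity, not the implementation used by Paletta et al. [103].

```python
# Illustrative sketch only (not the implementation from Paletta et al. [103]):
# derive a cobot speed factor from the distance to the nearest robot part
# and a normalized stress estimate. Thresholds and the linear policy are assumed.

def speed_scale(distance_m: float, stress_level: float,
                stop_dist: float = 0.3, full_speed_dist: float = 1.5) -> float:
    """Return a speed factor in [0, 1].

    distance_m      -- distance from the tracked head/hands to the nearest robot part (m)
    stress_level    -- normalized stress estimate in [0, 1]
    stop_dist       -- at or below this distance the robot fully stops
    full_speed_dist -- at or above this distance the distance term allows full speed
    """
    if distance_m <= stop_dist:
        return 0.0  # potential full stop when the human is too close

    # Linear ramp of the distance term between stop_dist and full_speed_dist.
    distance_term = min(1.0, (distance_m - stop_dist) / (full_speed_dist - stop_dist))

    # Higher stress further reduces the allowed speed (clamped to [0, 1]).
    stress_term = 1.0 - min(max(stress_level, 0.0), 1.0)

    return distance_term * stress_term


if __name__ == "__main__":
    # Example: operator 0.8 m from the nearest robot part, moderate stress.
    print(speed_scale(distance_m=0.8, stress_level=0.4))  # ~0.25
```

Multiplying the two terms means either a close human or a high stress estimate alone is enough to slow the arm, which matches the downward-only adjustment described in the citation statement.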