2018
DOI: 10.3390/sym10120680
Symmetric Evaluation of Multimodal Human–Robot Interaction with Gaze and Standard Control

Abstract: Control of robot arms is often required in engineering and can be performed by using different methods. This study examined and symmetrically compared the use of a controller, eye gaze tracker and a combination thereof in a multimodal setup for control of a robot arm. Tasks of different complexities were defined and twenty participants completed an experiment using these interaction modalities to solve the tasks. More specifically, there were three tasks: the first was to navigate a chess piece from a square t…

Cited by 3 publications (12 citation statements)
References 31 publications
“…In the experiment by Jones et al (2018) , the robot was used to play chess at different levels of difficulty. The two input modalities, eye tracking and joystick, were compared.…”
Section: Results
confidence: 99%
“…In this AR environment, colored rectangles were presented as visual feedback to help sort colored objects. Others were visualizations of virtual button presses ( Kim et al, 2001 ), tactile feedback in a multimodal shoulder control ( Bien et al, 2004 ), and highlighting as well as text descriptions ( Iáñez et al, 2010 ; McMullen et al, 2014 ; Jones et al, 2018 ), as shown in Figure 3 .…”
Section: Results
confidence: 99%
“…[9][10][11][12] Fruitful results of eye-tracking-based CAI systems for ALS patients have been obtained. [13][14][15][16][17][18] For example, scientists at the Technical University of Denmark presented a robotic teleoperation system based on gaze interaction, in which users can utilize gaze to remotely control the telerobot to achieve path planning in a maze map. [17] An eye-control-based wheelchair system was developed by researchers from the University of New York, in which eye movement is interpreted as control information used for performing functions such as navigation and object recognition.…”
Section: Introduction
confidence: 99%
“…To do this, the authors used dwell-time interaction, which distinguishes the interaction gaze from the observing gaze by the time the user gazes on an object or area. Jones et al [6] conducted a study that compared the performance of an eye gaze tracker with that of two other methods for control of robot arms. Some other researchers used gaze for controlling the view when looking at the edges of the screen or at virtual arrow buttons [10,11].…”
Section: Introduction
confidence: 99%