2017 International Conference on Rehabilitation Robotics (ICORR)
DOI: 10.1109/icorr.2017.8009388
Towards free 3D end-point control for robotic-assisted human reaching using binocular eye tracking

Abstract: Eye movements are the only directly observable behavioural signals that are highly correlated with actions at the task level; they precede body movements and thus reflect action intentions. Moreover, eye movements are preserved in many movement disorders that lead to paralysis or limb loss, including stroke, spinal cord injury, Parkinson's disease, multiple sclerosis, and muscular dystrophy. Despite this benefit, eye tracking is not widely used as a control interface for robotic interfaces in movement …

Cited by 29 publications (23 citation statements)
References 24 publications
“…This enables the user to trigger actions without having to interact with a display unit, and thus lets them look freely at the location of the intended action, making it ideal for direct control of robotic actuators and exoskeletons. Our aim is to combine a simple "mouse click" control, here of the opening and closing of a wearable robotic hand, with our 3D eye-tracking-based end-point control of robotic actuators or exoskeletons [12], such as robotic arm support systems [16]. Here, a single wearable set of binocular eye-tracking glasses can provide both command-execution triggers ("clicks") and high-resolution end-point control in 3D.…”
Section: Introduction
confidence: 99%
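The binocular end-point control described above rests on a simple geometric idea: each eye defines a gaze ray, and the 3D fixation point lies where the two rays (nearly) intersect. The paper does not publish its decoding code, so the following is only a minimal illustrative sketch of one standard way to do this, taking the midpoint of the shortest segment between two skew rays; the function name and inputs are hypothetical, not from the source.

```python
import numpy as np

def gaze_3d_endpoint(left_origin, left_dir, right_origin, right_dir):
    """Estimate a 3D fixation point from two gaze rays (one per eye)
    as the midpoint of the shortest segment between the rays."""
    o1, d1 = np.asarray(left_origin, float), np.asarray(left_dir, float)
    o2, d2 = np.asarray(right_origin, float), np.asarray(right_dir, float)
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    # Closest-approach parameters t1, t2 minimising |(o1+t1*d1)-(o2+t2*d2)|
    b = d1 @ d2
    w = o1 - o2
    d, e = d1 @ w, d2 @ w
    denom = 1.0 - b * b  # zero when the rays are parallel (no vergence)
    if abs(denom) < 1e-9:
        raise ValueError("gaze rays are (near-)parallel; no vergence point")
    t1 = (b * e - d) / denom
    t2 = (e - b * d) / denom
    p1 = o1 + t1 * d1  # closest point on the left-eye ray
    p2 = o2 + t2 * d2  # closest point on the right-eye ray
    return 0.5 * (p1 + p2)
```

With noisy tracker data the two rays rarely intersect exactly, which is why the midpoint (rather than a true intersection) is the usual estimate; accuracy degrades with distance as the vergence angle shrinks.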
“…These results sit at one end of the spectrum of solutions for controlling an augmentative device, which runs from substitution all the way to direct augmentation via higher-level control, either brain-machine interfacing or cognitive interfaces such as eye-gaze decoding. We previously showed that the end-point of visual attention (where one looks) can control the spatial end-point of a robotic actuator with centimetre-level precision (Tostado et al, 2016; Maimon-Mor et al, 2017; Shafti et al, 2019b). From a user perspective, this direct control modality is more effective as a natural control interface than voice or neuromuscular signals (Noronha et al, 2017).…”
Section: Discussion
confidence: 99%
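The "mouse click" triggering mentioned in the Introduction snippet is commonly implemented as a dwell: the click fires once gaze has remained near one point for long enough. The source does not specify its trigger mechanism, so this is a hedged sketch of a generic dwell detector; the function name, thresholds, and input format are all assumptions for illustration.

```python
import numpy as np

def detect_dwell_clicks(gaze_xy, timestamps, radius=0.02, dwell_s=0.5):
    """Return sample indices at which a dwell 'click' fires: gaze has
    stayed within `radius` of an anchor point for at least `dwell_s`."""
    clicks = []
    anchor_i = 0   # first sample of the current fixation candidate
    fired = False  # at most one click per uninterrupted dwell
    for i in range(1, len(gaze_xy)):
        offset = np.linalg.norm(np.asarray(gaze_xy[i], float)
                                - np.asarray(gaze_xy[anchor_i], float))
        if offset > radius:
            anchor_i = i   # gaze moved away: restart the dwell timer
            fired = False
        elif not fired and timestamps[i] - timestamps[anchor_i] >= dwell_s:
            clicks.append(i)  # dwell threshold reached: emit one click
            fired = True
    return clicks
```

A latch (`fired`) prevents repeated clicks during one long fixation; in practice the dwell time trades off responsiveness against the "Midas touch" problem of unintended activations.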
“…However, to eliminate the need for constant interaction with the interface, advanced navigation techniques mediated by inputs derived from natural user behaviour are the optimal solution for this problem [11]. Taking into account the 3D end-point decoding advances [4], [12]-[16], here we take a Human-in-the-AI *This work was supported by EPSRC (EP/N509486/1) and a Toyota Mobility Foundation Discovery Prize. We acknowledge the support of our end-user volunteers TN and PM.…”
Section: Introduction
confidence: 99%