2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2019
DOI: 10.1109/iros40897.2019.8968203
Inference of user-intention in remote robot wheelchair assistance using multimodal interfaces

Cited by 6 publications (4 citation statements)
References 16 publications
“…However, data were collected using natural eye-hand coordination, with participants executing the task themselves, which represents a different situation from a teleoperation scenario on both a perceptual and an action-control level. In the context of assistive robotics, a number of other studies have also considered gaze information (at times combined with multimodal interfaces, such as BCI and haptic feedback) to operate robotic limbs and wheelchairs (Schettino and Demiris, 2019; Zeng et al., 2020). Often in these cases, the gaze is used to implicitly but actively point the system to the object the impaired user wants the robot to interact with (Wang et al., 2018; Shafti et al., 2019).…”
Section: Related Work
confidence: 99%
“…Manipulators that recognize the intention of the user’s movement have also been presented to make controlling the wheelchair easier [43]. Control devices that implement haptic [44] and visual [45] feedback have likewise been proposed. Unfortunately, this technology is not designed to rehabilitate this population but is limited to assisting it.…”
Section: Related Work
confidence: 99%
“…[5] shows how touch and tangible interfaces can be used to facilitate the remote control of robots in hazardous situations, and [6] shows how multimodal interfaces based on gaze estimation, SLAM, and haptic feedback can be integrated to help a remote assistant infer the intention of a wheelchair driver.…”
Section: Introduction
confidence: 99%