2019 20th International Carpathian Control Conference (ICCC)
DOI: 10.1109/carpathiancc.2019.8765982
Wheelchair control by head motion using a noncontact method in relation to the patient

Cited by 11 publications (8 citation statements). References 18 publications.
“…Manta et al. [26] implemented a wheelchair command interface based on head movements. The system has two control modes: simple commands and head movements.…”
Section: Background and Related Work
Confidence: 99%
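The quote above describes the interface only at a high level. Below is a minimal Python sketch of how such a two-mode head-motion interface could be structured; the angle thresholds, command names, and pitch/yaw sign conventions are illustrative assumptions, not details from Manta et al.

from dataclasses import dataclass

@dataclass
class HeadPose:
    pitch: float  # degrees; positive when the head tilts forward (assumed convention)
    yaw: float    # degrees; positive when the head turns right (assumed convention)

def command_mode(pose, dead_zone=10.0):
    # Mode 1 (simple commands): a head gesture outside the dead zone
    # selects one discrete command.
    if pose.pitch > dead_zone:
        return "FORWARD"
    if pose.pitch < -dead_zone:
        return "STOP"
    if pose.yaw > dead_zone:
        return "RIGHT"
    if pose.yaw < -dead_zone:
        return "LEFT"
    return "IDLE"

def continuous_mode(pose, max_angle=30.0):
    # Mode 2 (head movements): head angles scale linear and angular
    # velocity proportionally, clamped to [-1, 1].
    clamp = lambda v: max(-1.0, min(1.0, v))
    return clamp(pose.pitch / max_angle), clamp(pose.yaw / max_angle)

pose = HeadPose(pitch=18.0, yaw=-4.0)
print(command_mode(pose))     # -> FORWARD
print(continuous_mode(pose))  # -> (0.6, -0.133...)

The dead zone keeps small involuntary head movements from triggering commands, which is a common design choice in hands-free control interfaces.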
“…There are non-invasive instrumentation systems placed on the user, such as those based on cerebral activity, like Brain-Computer Interfaces (BCIs) [12, 13, 14], those based on inertial and magnetic sensors that measure head or hand movements [15, 16, 17], and those that implement Electrooculography (EOG) as well as Electromyography (EMG) [18, 19, 20]. On the other hand, there are controller systems placed on the wheelchair, like those that use distance sensors to detect obstacles or operate the wheelchair in closed environments [21, 22, 23], besides those that use artificial vision techniques [24, 25, 26]. Other types of instrumentation depend on the user's characteristics, the wheelchair navigation, or the environment, whether outdoors or indoors.…”
Section: Introduction
Confidence: 99%
“…Of course, these control devices are correlated with the user's level of disability; automating the user control interface implicitly increases the system's complexity and cost. However, eye-tracking devices and facial expression recognition systems are still in development, with current models offering only a few practical features compared to the theoretical studies [5][6][7][8][9][10][11][12]. For example, Timofei I. et al. worked on a robotic wheelchair designed for patients with severe disorders of the musculoskeletal system and other body functions.…”
Section: Introduction
Confidence: 99%
“…Eye-tracking devices and facial expression recognition systems are still in development, with current models offering only a few practical features compared to the theoretical studies [8][9][10][11][12][13][14][15][16][17][18][19][20][21][22][23][24]. In [16], Lee Y. et al. propose an extended method for measuring saccadic eye movement using an eye-tracking module in a virtual reality head-mounted display.…”
Section: Introduction
Confidence: 99%
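The specific method of Lee et al. is not reproduced here. For context, saccades are commonly detected from sampled gaze angles with a velocity threshold (the classic I-VT approach); the following Python sketch assumes that standard technique, with an illustrative threshold and synthetic data.

import math

def detect_saccades(gaze, dt, velocity_threshold=30.0):
    # gaze: list of (x_deg, y_deg) gaze angles sampled every dt seconds.
    # Returns (start_index, end_index) spans whose angular speed exceeds
    # velocity_threshold (deg/s); the threshold value is an assumption.
    speeds = []
    for (x0, y0), (x1, y1) in zip(gaze, gaze[1:]):
        speeds.append(math.hypot(x1 - x0, y1 - y0) / dt)
    spans, start = [], None
    for i, s in enumerate(speeds):
        if s > velocity_threshold and start is None:
            start = i
        elif s <= velocity_threshold and start is not None:
            spans.append((start, i))
            start = None
    if start is not None:
        spans.append((start, len(speeds)))
    return spans

# Synthetic trace: fixation, a fast 10-degree rightward saccade, fixation.
samples = [(0.0, 0.0)] * 5 + [(2.0, 0.0), (6.0, 0.0), (10.0, 0.0)] + [(10.0, 0.0)] * 5
print(detect_saccades(samples, dt=1 / 250))  # -> [(4, 7)]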
“…Most of the applications presented above assume, by their nature, the existence of collisions and the influence of friction. The robotic structures can be controlled by people with special needs by manipulating a joystick, but also by eye gaze [48], by gesture [49], by head motion [50], or by signals made with the fingers. In these applications it is absolutely necessary to treat collisions between the controlled robotic structures and obstacles seriously.…”
Section: Introduction
Confidence: 99%