2016
DOI: 10.1109/jbhi.2015.2412125

Facial Position and Expression-Based Human–Computer Interface for Persons With Tetraplegia

Abstract: A human–computer interface (the Facial position and expression Mouse system, FM) for persons with tetraplegia, based on a monocular infrared depth camera, is presented in this paper. The proposed algorithm detects the nose position along with the mouth status (open/closed) to control and navigate the cursor as computer user input. The algorithm is based on an improved Randomized Decision Tree, which detects the facial information efficiently and accurately. A more comfortable user ex…
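
The abstract describes the full control loop: the nose position steers the cursor and the mouth's open/closed state acts as the selection input. The paper's own implementation is not reproduced in this excerpt; the following is a minimal sketch of that mapping, assuming a hypothetical per-frame detector output (the `FaceState` fields, gain, and dead-zone values are illustrative, not from the paper).

```python
# Hedged sketch of the control mapping described in the abstract (not the
# authors' code): nose offset from a calibrated rest position drives cursor
# velocity; a closed->open mouth transition fires a click.

from dataclasses import dataclass

@dataclass
class FaceState:
    nose_x: float     # nose position in the depth image, pixels (hypothetical detector output)
    nose_y: float
    mouth_open: bool  # mouth status classified as open/closed

class FMCursor:
    def __init__(self, rest_x: float, rest_y: float,
                 gain: float = 0.5, dead_zone: float = 5.0):
        self.rest_x, self.rest_y = rest_x, rest_y  # calibrated neutral head pose
        self.gain = gain                           # cursor pixels per image pixel of offset
        self.dead_zone = dead_zone                 # ignore small involuntary head motion
        self.was_open = False

    def step(self, face: FaceState):
        """Return (cursor_motion, click) for one frame."""
        dx = face.nose_x - self.rest_x
        dy = face.nose_y - self.rest_y
        if abs(dx) < self.dead_zone:
            dx = 0.0
        if abs(dy) < self.dead_zone:
            dy = 0.0
        # Click only on the closed->open transition so a mouth held open
        # does not fire repeatedly.
        click = face.mouth_open and not self.was_open
        self.was_open = face.mouth_open
        return (self.gain * dx, self.gain * dy), click

# Example frame: nose 20 px right of rest, mouth just opened.
controller = FMCursor(rest_x=320, rest_y=240)
move, click = controller.step(FaceState(nose_x=340.0, nose_y=240.0, mouth_open=True))
# move == (10.0, 0.0), click == True
```

A relative (joystick-style) mapping is used here because it tolerates small calibration drift; an absolute nose-to-screen mapping is equally plausible, and the paper's actual choice is not stated in this excerpt.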

Cited by 30 publications (11 citation statements); References 26 publications.
“…[Fig. 1 caption: comparison of face images taken at the same moment in unfavourable lighting conditions using visible (left) and thermal (right) cameras] …GUI using head movements. In [5,23] and [37] the position of the user's nostrils relative to the face region is used. Interestingly, in [5], a depth imaging technique is adapted.…”
Section: Head Operated Interfaces
confidence: 99%
“…In [5,23] and [37] the position of the user's nostrils relative to the face region is used. Interestingly, in [5], a depth imaging technique is adapted. From a depth image, the nose position and the mouth status are detected and used for steering.…”
Section: Head Operated Interfaces
confidence: 99%
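
The snippet above, like the abstract, attributes the detection to an improved Randomized Decision Tree over depth images. The excerpt gives no tree details, but per-pixel classification with randomized trees on depth images commonly uses the depth-difference split feature popularized by Shotton et al.'s Kinect body-part work. The sketch below illustrates that general technique with a hypothetical dict-based tree; it is not the paper's actual model.

```python
import numpy as np

BACKGROUND = 1e6  # large "far" value for probes that fall outside the image

def depth_feature(depth: np.ndarray, px: int, py: int, u, v) -> float:
    """Depth-invariant split feature: compare the depth at two offsets u, v,
    each normalized by the depth at (px, py)."""
    d = float(depth[py, px])
    if d <= 0:  # invalid sensor reading; treat the feature as uninformative
        return 0.0
    def probe(off):
        ox, oy = int(px + off[0] / d), int(py + off[1] / d)
        h, w = depth.shape
        return float(depth[oy, ox]) if (0 <= oy < h and 0 <= ox < w) else BACKGROUND
    return probe(u) - probe(v)

def classify_pixel(node: dict, depth: np.ndarray, px: int, py: int) -> str:
    """Walk one tree to a leaf label (e.g. 'nose', 'mouth', 'other').
    Internal nodes hold u, v, threshold, left, right; leaves hold 'label'."""
    while "label" not in node:
        f = depth_feature(depth, px, py, node["u"], node["v"])
        node = node["left"] if f < node["threshold"] else node["right"]
    return node["label"]
```

In a full pipeline, a forest of such trees would typically vote per pixel, with the nose position then taken from the centroid of pixels labelled as nose.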
“…Kocejko et al [8] and Lupu et al [9] have controlled the mouse cursor by tracking the eye gaze movements of the user. Betke et al [10], Epstein et al [11], Nabati et al [12], Chareonsuk et al [13], Varona et al [14], Bian et al [15], Gorodnichy et al [16], Gyawal et al [17] and Morris et al [18] have avoided the overhead of head-mounted devices and high-cost hardware systems by capturing the user's head motions through a web camera to control the mouse pointer. Fathi et al [19] achieved this by tracking the eye movement, whereas Sugano et al [20], Sambrekar et al [21] and M. Nasor et al [22] have attempted tracking the eye gazes.…”
Section: Introduction
confidence: 99%
“…Fathi et al [19] achieved this by tracking the eye movement, whereas Sugano et al [20], Sambrekar et al [21] and M. Nasor et al [22] have attempted tracking the eye gazes. Betke et al [10], Nabati et al [12], Chareonsuk et al [13], Varona et al [14], Bian et al [15], Gorodnichy et al [16], Fathi et al [19], Hegde et al [23] and Arai et al [24] have developed camera-based mouse replacement solutions implementing mouse click events such as single- and double-clicking and dragging. Accurately tracking and converting the user's facial expression into mouse operations is still acknowledged as a research challenge and opportunity.…”
Section: Introduction
confidence: 99%
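
The surveyed systems above implement single clicks, double clicks, and dragging from facial input. Purely as an illustration of how a single binary gesture signal (e.g., mouth open) can be expanded into that event set, here is a hold-duration state machine; the thresholds and names are hypothetical and not taken from any cited paper.

```python
class ClickEventMapper:
    """Hypothetical mapping from one binary gesture signal to mouse events,
    based on how long the gesture is held and how quickly it repeats."""

    DOUBLE_GAP = 0.4  # max seconds between releases to count as a double click
    DRAG_HOLD = 0.8   # hold the gesture longer than this to start a drag

    def __init__(self):
        self.pressed_at = None    # time the gesture became active
        self.last_release = -1e9  # time of the previous release
        self.dragging = False

    def update(self, active: bool, now: float) -> list:
        """Feed one (gesture_active, timestamp) sample; get mouse events."""
        events = []
        if active and self.pressed_at is None:
            self.pressed_at = now  # gesture just started
        elif active and not self.dragging and now - self.pressed_at >= self.DRAG_HOLD:
            self.dragging = True
            events.append("drag_start")  # long hold -> drag
        elif not active and self.pressed_at is not None:
            if self.dragging:
                events.append("drag_end")
                self.dragging = False
            elif now - self.last_release <= self.DOUBLE_GAP:
                events.append("double_click")  # quick second release
            else:
                events.append("single_click")
            self.last_release = now
            self.pressed_at = None
        return events
```

A production implementation would defer the single-click decision by DOUBLE_GAP seconds so that a double click does not also emit a single click; this sketch keeps the simpler immediate behaviour.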