Eye movements can be used as alternative inputs for human-computer interface (HCI) systems, such as virtual or augmented reality systems, as well as to provide new means of communication for patients with locked-in syndrome. In this study, we developed a real-time electrooculogram (EOG)-based eye-writing recognition system, with which users can write predefined symbolic patterns through volitional eye movements. For "eye-writing" recognition, the proposed system first reconstructs the eye-written traces from EOG waveforms in real time; it then recognizes the intended symbolic inputs with a reliable recognition rate by matching the input traces against trained eye-written traces of diverse input patterns. Experiments with 20 participants showed an average recognition rate of 87.38% (F1 score) for 29 different symbolic patterns (26 lowercase alphabet characters and three functional input patterns representing the Space, Backspace, and Enter keys), demonstrating the promise of our EOG-based eye-writing recognition system in practical scenarios.
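The abstract describes matching a reconstructed input trace against trained template traces, but does not name the matching algorithm. As an illustrative sketch only, one common choice for comparing variable-length 2-D traces is dynamic time warping (DTW) with nearest-template classification; the function names and template format below are assumptions, not the paper's method:

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two 2-D traces,
    each an array of shape (n_samples, 2) holding x/y gaze coordinates."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])  # local point distance
            # extend the cheapest of the three admissible warping steps
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

def recognize(trace, templates):
    """Return the label of the trained template closest to `trace` under DTW."""
    return min(templates, key=lambda label: dtw_distance(trace, templates[label]))
```

DTW tolerates differences in writing speed between the input and the stored template, which is why it is a natural baseline for eye-written trace matching.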
Background: Electrooculogram (EOG) can be used to continuously track eye movements and can thus be considered an alternative to conventional camera-based eye trackers. Although many EOG-based eye tracking systems have been studied with the ultimate goal of providing a new way of communication for individuals with amyotrophic lateral sclerosis (ALS), most of them were tested with healthy people only. In this paper, we investigated the feasibility of EOG-based eye-writing as a new mode of communication for individuals with ALS. Methods: We developed an EOG-based eye-writing system and tested it with 18 healthy participants and three participants with ALS. We also applied a new method for removing crosstalk between horizontal and vertical EOG components. All study participants were asked to eye-write specially designed patterns of 10 Arabic numbers three times after a short practice session. Results: Our system achieved a mean recognition rate of 95.93% for healthy participants and recognition rates of 95.00%, 66.67%, and 93.33% for the three participants with ALS. The lower recognition rate for one of the participants with ALS was mainly due to miswritten letters, the number of which decreased as the experiment proceeded. Conclusion: Our proposed eye-writing system is a feasible human-computer interface (HCI) tool for enabling practical communication by individuals with ALS.
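The abstract mentions removing crosstalk between the horizontal and vertical EOG channels without detailing the method. As a hedged sketch of the general idea only (not the paper's algorithm), one simple approach estimates a scalar leakage coefficient from a calibration recording of horizontal-only saccades and subtracts the leaked component; all names and the calibration procedure here are assumptions:

```python
import numpy as np

def remove_crosstalk(h, v, calib_h, calib_v_leak):
    """Subtract the fraction of the horizontal channel that leaks into
    the vertical channel.

    calib_h, calib_v_leak: the two channels recorded while the user makes
    purely horizontal saccades, so any vertical deflection is leakage.
    """
    # least-squares fit of calib_v_leak ≈ k * calib_h
    k = np.dot(calib_v_leak, calib_h) / np.dot(calib_h, calib_h)
    return v - k * h
```

A single coefficient assumes the leakage is linear and stationary; a real system would likely re-estimate it per session.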
The increase in the number of adolescents with internet gaming disorder (IGD), a type of behavioral addiction, is becoming an issue of public concern. Teaching adolescents to suppress their craving for gaming in daily-life situations is one of the core strategies for treating IGD. Recent studies have demonstrated that computer-aided treatment methods, such as neurofeedback therapy, are effective in relieving the symptoms of a variety of addictions. When a computer-aided treatment strategy is applied to IGD, detecting whether an individual is currently experiencing a craving for gaming is important. We aroused a craving for gaming in 57 adolescents with mild to severe IGD using short video clips of gameplay from three addictive games. At the same time, a variety of biosignals were recorded, including photoplethysmogram, galvanic skin response, and electrooculogram measurements. After observing the changes in these biosignals during the craving state, we classified each individual participant's craving/non-craving states using a support vector machine. When video clips edited to arouse a craving for gaming were played, significant decreases in the standard deviation of the heart rate, the number of eye blinks, and saccadic eye movements were observed, along with a significant increase in the mean respiratory rate. Based on these results, we were able to classify whether an individual participant felt a craving for gaming with an average accuracy of 87.04%. This is the first study to attempt to detect a craving for gaming in individuals with IGD using multimodal biosignal measurements, and the first to show that the electrooculogram can provide useful biosignal markers for detecting a craving for gaming.
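The abstract reports support-vector-machine classification of craving versus non-craving states from biosignal features. The sketch below shows this pipeline shape with scikit-learn on synthetic data; the feature layout (heart-rate standard deviation, blink count, saccade count, mean respiratory rate) follows the markers the abstract reports, but the numbers, window sizes, and kernel choice are illustrative assumptions:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical per-window feature vectors:
# [HR std, blink count, saccade count, mean respiratory rate].
# Per the abstract, craving windows show lower HR std, fewer blinks and
# saccades, and a higher respiratory rate; the cluster centers are made up.
rng = np.random.default_rng(0)
craving = rng.normal([2.0, 5.0, 8.0, 18.0], 0.5, size=(40, 4))
baseline = rng.normal([4.0, 12.0, 15.0, 14.0], 0.5, size=(40, 4))

X = np.vstack([craving, baseline])
y = np.array([1] * 40 + [0] * 40)  # 1 = craving, 0 = non-craving

# standardize features, then fit an RBF-kernel SVM
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)
```

In practice the per-participant classifiers in the study would be trained and evaluated with cross-validation rather than on the full dataset as shown here.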
In traditional brain-computer interface (BCI) studies, binary communication systems have generally been implemented using two mental tasks arbitrarily assigned to "yes" or "no" intentions (e.g., mental arithmetic for "yes"). A recent pilot study performed with one paralyzed patient showed the possibility of a more intuitive paradigm for binary BCI communication, in which the patient's internal yes/no intentions were directly decoded from functional near-infrared spectroscopy (fNIRS). We investigated whether such an "fNIRS-based direct intention decoding" paradigm can be reliably used for practical BCI communication. Eight healthy subjects participated in this study, and each participant was administered 70 disjunctive questions. Brain hemodynamic responses were recorded using a multichannel fNIRS device while the participants internally expressed a "yes" or "no" intention to each question. Different feature types, feature numbers, and time window sizes were tested to investigate the optimal conditions for classifying the internal binary intentions. About 75% of the answers were correctly classified when the individual best feature set was employed (75.89% ± 1.39 and 74.08% ± 2.87 for oxygenated and deoxygenated hemoglobin responses, respectively), which was significantly higher than the random chance level (68.57%, p < 0.001). The kurtosis feature showed the highest mean classification accuracy among all feature types. The grand-averaged hemodynamic responses showed that broad brain regions are associated with the processing of binary implicit intentions. Our experimental results demonstrate that direct decoding of internal binary intentions has the potential to enable more intuitive and user-friendly communication systems for patients with motor disabilities.
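The abstract identifies kurtosis as the best-performing feature type for the windowed hemodynamic responses. A minimal sketch of extracting that feature from a single-channel signal is shown below; the sampling rate, window length, and function names are illustrative assumptions, not values from the study:

```python
import numpy as np
from scipy.stats import kurtosis

def window_kurtosis(signal, fs, win_s):
    """Slice a single-channel hemodynamic signal into non-overlapping
    windows of `win_s` seconds and compute the (Fisher) kurtosis of each
    window, yielding one feature value per window."""
    step = int(fs * win_s)
    n_windows = len(signal) // step
    return np.array(
        [kurtosis(signal[i * step:(i + 1) * step]) for i in range(n_windows)]
    )
```

Kurtosis is sensitive to transient, peaked deflections within a window, which may explain why it separates task-evoked hemodynamic responses from flatter baseline activity.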