Emotion estimation systems based on brain and physiological signals such as electroencephalography (EEG), blood-volume pressure (BVP), and galvanic skin response (GSR) have gained special attention in recent years due to the possibilities they offer. The field of human–robot interaction (HRI) could benefit from a broader understanding of how the brain and body encode emotion, combined with lightweight software and inexpensive wearable devices, thereby improving the ability of robots to engage with users' emotional reactions. In this paper, a previously developed methodology for real-time emotion estimation, intended for use in HRI, is tested under realistic conditions using a self-generated database of dynamically evoked emotions. Other state-of-the-art real-time approaches estimate emotion using constant stimuli to simplify the analysis of the evoked responses, which remains far from real scenarios, where emotions are evoked dynamically. The proposed approach studies the feasibility of the previously developed emotion estimation methodology under an experimental paradigm that imitates a more realistic scenario, using a dramatic film to evoke emotions dynamically. The methodology has been shown to meet real-time constraints while maintaining high emotion estimation accuracy on the self-produced multi-signal database of dynamically evoked emotions.
Affective human–robot interaction requires lightweight software and inexpensive wearable devices that could advance this field. However, real-time emotion estimation poses a problem that has not yet been solved efficiently. An optimization of the emotion estimation methodology is proposed, covering artifact removal, feature extraction, feature smoothing, and brain pattern classification. This work attempts the challenge of filtering artifacts and extracting features while reducing processing time and maintaining high accuracy. First, two different real-time electro-oculographic artifact removal techniques are tested and compared in terms of information loss and processing time. Second, an emotion estimation methodology is proposed based on a set of stable and meaningful features, a carefully chosen set of electrodes, and smoothing of the feature space. The methodology has been shown to meet real-time constraints while maintaining high emotion estimation accuracy on the SEED database, under both subject-dependent and subject-independent paradigms, using a discrete emotional model with three affective states.
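The abstract above mentions smoothing of the feature space before classification but does not specify the smoother. As a minimal sketch, assuming a causal moving average over consecutive sliding-window feature vectors (the function name, window length, and smoothing method are illustrative assumptions, not the paper's stated algorithm):

```python
import numpy as np

def smooth_features(feature_matrix, window=5):
    """Causal moving-average smoother over the time axis.

    feature_matrix: array of shape (n_windows, n_features), one row per
    sliding EEG window. Each output row is the mean of the current and
    up to `window - 1` preceding rows, so no future samples are used
    (compatible with real-time operation).
    """
    smoothed = np.empty(feature_matrix.shape, dtype=float)
    for t in range(feature_matrix.shape[0]):
        start = max(0, t - window + 1)
        smoothed[t] = feature_matrix[start:t + 1].mean(axis=0)
    return smoothed
```

A causal filter is chosen here because a real-time estimator cannot average over feature vectors that have not yet been computed.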
The large range of potential applications of affective BCIs (aBCIs), not only for patients but also for healthy people, makes the need for a commonly accepted protocol for real-time EEG-based emotion recognition ever more pressing. Using wavelet packet decomposition for spectral feature extraction, suited to the nature of the EEG signal, we have specified some of the main parameters needed to implement robust positive- and negative-emotion classification. A 12-second sliding window proved the most appropriate; from it, a set of 20 target frequency–location variables has been proposed as the most relevant features carrying the emotional information. Lastly, QDA and KNN classifiers and a population rating criterion for stimulus labeling have been suggested as the most suitable approaches for EEG-based emotion recognition. The proposed model reached mean accuracies of 98% (s.d. 1.4) and 98.96% (s.d. 1.28) in a subject-dependent approach for the QDA and KNN classifiers, respectively. This new model represents a step forward towards real-time classification. Although results were not conclusive, new insights regarding the subject-independent approximation have been discussed.
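The 12-second sliding window above implies segmenting the continuous EEG recording into overlapping epochs before spectral features are extracted. A minimal sketch of that segmentation step, assuming a multichannel recording and a 1-second hop (the hop size, function name, and array layout are assumptions; only the 12 s window length comes from the text):

```python
import numpy as np

def sliding_windows(eeg, fs, win_s=12.0, step_s=1.0):
    """Segment a multichannel EEG recording into overlapping windows.

    eeg:    array of shape (n_channels, n_samples)
    fs:     sampling rate in Hz
    win_s:  window length in seconds (12 s, per the text)
    step_s: hop between consecutive windows (illustrative assumption)

    Returns an array of shape (n_windows, n_channels, win_samples);
    each window would then feed the wavelet-packet feature extraction.
    """
    win = int(win_s * fs)
    step = int(step_s * fs)
    n_samples = eeg.shape[1]
    starts = range(0, n_samples - win + 1, step)
    return np.stack([eeg[:, s:s + win] for s in starts])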