Abstract: The work presented in this paper assesses human emotion recognition through analysis of heart rate variability (HRV) with spectral bands that vary according to the respiratory frequency (RF). Three emotional states are compared: a calm, neutral state (Relax), positive elicitation (Joy) and negative elicitation (Fear). Standard HRV analysis is performed in the time and frequency domains. To better characterize the HRV component related to respiratory sinus arrhythmia, the high frequency (HF) band is centered on the RF. Results reveal that the power in the low frequency band (PLF), the normalized power in the HF band (PHFn) and the sympathovagal ratio (LF/HF) can be suitable indices for distinguishing Relax from Joy. Mean heart rate and RF differ significantly between Relax and Fear. Several HRV indices, such as pNN50, PLF, PHFn and LF/HF, show significant differences between Joy and Fear. Statistical analysis of the HRV indices with the HF band centered on the RF yields lower p-values than with the standard HF band.
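The key idea above, estimating LF power and HF power with the HF band centered on the respiratory frequency rather than fixed at the conventional 0.15–0.4 Hz limits, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the evenly resampled RR series, the Welch estimator settings and the ±0.125 Hz half-width of the RF-centered band are all assumptions for the example.

```python
import numpy as np
from scipy.signal import welch

def hrv_band_powers(rr_resampled, fs, resp_freq, hf_half_width=0.125):
    """Sketch of HRV spectral indices with the HF band centered on the
    respiratory frequency (RF).

    rr_resampled : evenly resampled RR-interval series (seconds)
    fs           : resampling rate of the RR series (Hz)
    resp_freq    : estimated respiratory frequency (Hz)
    """
    # Welch power spectral density of the mean-removed RR series
    f, pxx = welch(rr_resampled - np.mean(rr_resampled), fs=fs,
                   nperseg=min(256, len(rr_resampled)))

    def band_power(lo, hi):
        mask = (f >= lo) & (f < hi)
        return np.trapz(pxx[mask], f[mask])

    p_lf = band_power(0.04, 0.15)                    # standard LF band
    p_hf = band_power(resp_freq - hf_half_width,     # HF band centered on RF
                      resp_freq + hf_half_width)
    p_hf_n = p_hf / (p_lf + p_hf)                    # normalized HF power (PHFn)
    return p_lf, p_hf, p_hf_n, p_lf / p_hf           # last value: LF/HF ratio
```

With a respiratory rate of, say, 0.3 Hz, the HF band would become 0.175–0.425 Hz, so power driven by respiratory sinus arrhythmia is captured even when breathing falls outside the standard HF limits.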
Objective: Interest in emotion recognition has increased in recent years as a useful tool for diagnosing psycho-neural illnesses. In this study, the auto-mutual information function (AMIF) and the cross-mutual information function (CMIF) are used for human emotion recognition. Approach: The AMIF technique was applied to heart rate variability (HRV) signals to study complex interdependencies, and the CMIF technique was used to quantify the complex coupling between HRV and respiratory signals. Both algorithms were adapted to short-term RR time series. Traditional band-pass filtering was applied to the RR series in the low frequency (LF) and high frequency (HF) bands, and a respiration-based filter bandwidth (HF SCHF) was also investigated. Both the AMIF and the CMIF were calculated over different time scales as specific complexity measures. The ability of the parameters derived from the AMIF and the CMIF to discriminate emotions was evaluated on a database of video-induced emotion elicitation. Five elicited states were considered: relax (neutral), joy (positive valence), and fear, sadness and anger (negative valences). Main results: The results revealed that the AMIF applied to the RR time series filtered in the HF SCHF band was able to discriminate relax from joy and fear, joy from each negative-valence condition, and fear from sadness and anger, all with a statistical significance level p ≤ 0.05; sensitivity, specificity and accuracy higher than 70%; and an area under the receiver operating characteristic curve AUC ≥ 0.70. Furthermore, the parameters derived from the AMIF and the CMIF characterized the low signal complexity observed during fear relative to each of the other elicited states. Significance: Based on these results, human emotion manifested in the HRV and respiratory signal responses could be characterized by means of information-content complexity.
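The auto-mutual information function central to this study can be sketched as below: the mutual information between an RR series and a lagged copy of itself, evaluated over a range of lags. This is an illustrative plug-in (histogram) estimator under assumed settings (bin count, lag range, normalization by the zero-lag value), not the adapted short-term algorithm the abstract describes.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Plug-in (histogram) estimate of mutual information in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0                          # avoid log(0) on empty cells
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def amif(rr, max_lag=20, bins=8):
    """Auto-mutual information function of an RR series over lags 1..max_lag,
    normalized by the zero-lag value so that the curve starts near 1 and
    decays toward 0 as the series decorrelates with itself."""
    mi0 = mutual_information(rr, rr, bins)
    return np.array([mutual_information(rr[:-k], rr[k:], bins) / mi0
                     for k in range(1, max_lag + 1)])
```

A slowly decaying AMIF curve indicates strong nonlinear self-dependence (low complexity), while a rapid drop indicates a more irregular, less predictable series; the CMIF follows the same construction with the respiratory signal in place of the lagged RR copy.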