This study analyzed five decomposition algorithms for separating electrodermal activity (EDA) into tonic and phasic components to identify different emotions using machine learning algorithms. We used EDA signals from the Continuously Annotated Signals of Emotion dataset for this analysis. First, we decomposed the EDA signals into tonic and phasic components using five decomposition methods: continuous deconvolution analysis, discrete deconvolution analysis, convex optimization-based EDA (cvxEDA), nonnegative sparse deconvolution (SparsEDA), and BayesianEDA. We extracted time-, frequency-, and time-frequency-domain features from each decomposition method's tonic and phasic components. Finally, various machine learning algorithms, such as logistic regression (LR), support vector machine, random forest, extreme gradient boosting, and multilayer perceptron, were applied to evaluate the performance of the decomposition methods. Our results show that the considered decomposition methods successfully split the EDA signal into tonic and phasic components. The SparsEDA decomposition method outperformed the other decomposition methods considered in the study. In addition, LR with features extracted from the tonic component of SparsEDA achieved the highest average classification accuracy of 95.83%. This study can be used to identify decomposition methods suitable for emotion recognition applications.
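The decomposition step shared by all of these methods splits a skin-conductance recording into a slow-varying tonic level and a fast-varying phasic response. As a minimal illustration of that tonic/phasic split, here is a simple low-pass-filter baseline on a synthetic signal; this is not one of the five methods compared in the study (CDA, DDA, cvxEDA, SparsEDA, BayesianEDA), and the sampling rate and cutoff are assumptions chosen for the example.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def decompose_eda(eda, fs, tonic_cutoff_hz=0.05):
    """Illustrative tonic/phasic split via zero-phase low-pass filtering.

    Note: a simple baseline, not any of the decomposition methods
    evaluated in the study.
    """
    b, a = butter(2, tonic_cutoff_hz / (fs / 2), btype="low")
    tonic = filtfilt(b, a, eda)   # slow-varying skin conductance level
    phasic = eda - tonic          # fast-varying skin conductance responses
    return tonic, phasic

# Synthetic example: slow drift plus one transient response at t = 30 s
fs = 32                                            # assumed sampling rate
t = np.arange(0, 60, 1 / fs)
eda = 2 + 0.01 * t + 0.5 * np.exp(-((t - 30) ** 2) / 2)
tonic, phasic = decompose_eda(eda, fs)
```

By construction the two components sum back to the original signal, and the transient response lands in the phasic component while the drift stays in the tonic one.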
In this study, we evaluated the performance of the tonic and phasic components of electrodermal activity (EDA) using machine learning algorithms for accurately recognizing emotions. The EDA signals considered for this study were obtained from the Continuously Annotated Signals of Emotion (CASE) dataset. Initially, we pre-processed and decomposed the EDA into tonic and phasic components using the cvxEDA method. Further, we extracted temporal and morphological features from both the tonic and phasic components. Finally, we tested the performance of various combinations of features using machine learning algorithms such as logistic regression, support vector machine (SVM), and random forest. Our results revealed that the tonic component contributes significant information for emotional state classification. Further, the temporal features of the phasic component were able to discriminate most of the emotions. In particular, the scary emotion was well discriminated from the other emotions. Results of classification revealed that SVM performed best in classifying emotional states. Our pipeline, which incorporated the tonic component, temporal features, and SVM, achieved average accuracy, sensitivity, specificity, precision, and F1-score of 78.96%, 57.92%, 85.97%, 62.32%, and 56.48%, respectively. Our findings indicate that the proposed models could potentially be used to detect positive and negative emotions in healthcare settings.
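Temporal features of the kind this abstract describes are computed directly from each decomposed component. The sketch below computes a few common time-domain EDA features; the paper's exact feature set is not listed in the abstract, so this particular selection (and the feature names) is an illustrative assumption.

```python
import numpy as np

def temporal_features(x, fs):
    """A few common time-domain EDA features.

    Illustrative only: the study's actual feature set is not
    specified in the abstract.
    """
    dx = np.diff(x) * fs  # first derivative, in units per second
    return {
        "mean": float(np.mean(x)),
        "std": float(np.std(x)),
        "max": float(np.max(x)),
        "range": float(np.ptp(x)),
        "mean_first_derivative": float(np.mean(dx)),
    }

# Example on a short synthetic phasic segment sampled at 4 Hz
feats = temporal_features(np.array([0.0, 0.2, 0.5, 0.4, 0.1]), fs=4)
```

Feature dictionaries like this, computed per trial, would then be stacked into a matrix and passed to the classifiers (logistic regression, SVM, random forest).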
In this study, we attempted to classify categorical emotional states using electrodermal activity (EDA) signals and a configurable convolutional neural network (cCNN). The EDA signals from the publicly available Continuously Annotated Signals of Emotion dataset were down-sampled and decomposed into phasic components using the cvxEDA algorithm. The phasic component of EDA was subjected to short-time Fourier transform-based time-frequency representation to obtain spectrograms. These spectrograms were input to the proposed cCNN, which automatically learns prominent features and discriminates varied emotions such as amusing, boring, relaxing, and scary. Nested k-fold cross-validation was used to evaluate the robustness of the model. The results indicated that the proposed pipeline could discriminate the considered emotional states with average classification accuracy, recall, specificity, precision, and F-measure of 80.20%, 60.41%, 86.8%, 60.05%, and 58.61%, respectively. Thus, the proposed pipeline could be valuable in examining diverse emotional states in normal and clinical conditions.
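The spectrogram step described here can be sketched with SciPy's short-time Fourier transform. The input below is a synthetic stand-in for a phasic segment (the study's actual inputs come from cvxEDA-decomposed CASE recordings), and the sampling rate and window parameters are assumptions, not the paper's settings.

```python
import numpy as np
from scipy.signal import stft

# Synthetic stand-in for a phasic EDA segment: a 0.5 Hz oscillation
# under a Gaussian envelope centered at t = 15 s
fs = 32                                   # assumed down-sampled rate
t = np.arange(0, 30, 1 / fs)
phasic = np.exp(-((t - 15) ** 2) / 2) * np.sin(2 * np.pi * 0.5 * t)

# STFT with an assumed 4 s window and 50% overlap
f, frames, Z = stft(phasic, fs=fs, nperseg=128, noverlap=64)
spectrogram = np.abs(Z)                   # magnitude image fed to the CNN
```

Each spectrogram is a frequency-by-time magnitude matrix, which is what allows a 2-D convolutional network to treat the signal as an image.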
Electrodermal activity (EDA) reflects sympathetic nervous system activity through sweating-related changes in skin conductance. Decomposition analysis is used to deconvolve EDA into slow-varying tonic and fast-varying phasic activity. In this study, we used machine learning models to compare the performance of two EDA decomposition algorithms in detecting emotions such as amusing, boring, relaxing, and scary. The EDA data considered in this study were obtained from the publicly available Continuously Annotated Signals of Emotion (CASE) dataset. Initially, we pre-processed and deconvolved the EDA data into tonic and phasic components using two decomposition methods, cvxEDA and BayesianEDA. Further, 12 time-domain features were extracted from the phasic component of the EDA data. Finally, we applied machine learning algorithms such as logistic regression (LR) and support vector machine (SVM) to evaluate the performance of the decomposition methods. Our results indicate that the BayesianEDA decomposition method outperforms cvxEDA. The mean of the first derivative feature discriminated all the considered emotional pairs with high statistical significance (p < 0.05). SVM detected emotions better than the LR classifier. We achieved 10-fold average classification accuracy, sensitivity, specificity, precision, and F1-score of 88.2%, 76.25%, 92.08%, 76.16%, and 76.15%, respectively, using the BayesianEDA and SVM pipeline. The proposed framework can be utilized to detect emotional states for the early diagnosis of psychological conditions.
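The evaluation metrics reported across these abstracts (accuracy, sensitivity, specificity, precision, F1-score) all derive from the confusion matrix. As a minimal sketch for the binary case, assuming labels coded 0/1 (the studies' actual tasks are multi-class, typically handled per emotion pair or one-vs-rest):

```python
import numpy as np

def binary_metrics(y_true, y_pred):
    """Confusion-matrix metrics for binary labels (1 = positive class).

    Sensitivity is recall of the positive class; specificity is recall
    of the negative class.
    """
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = int(np.sum((y_true == 1) & (y_pred == 1)))
    tn = int(np.sum((y_true == 0) & (y_pred == 0)))
    fp = int(np.sum((y_true == 0) & (y_pred == 1)))
    fn = int(np.sum((y_true == 1) & (y_pred == 0)))
    return {
        "accuracy": (tp + tn) / len(y_true),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "precision": tp / (tp + fp),
        "f1": 2 * tp / (2 * tp + fp + fn),
    }

# Toy example with hypothetical labels and predictions
m = binary_metrics([1, 1, 0, 0, 1, 0], [1, 0, 0, 0, 1, 1])
```

In a k-fold protocol like the 10-fold one described above, these metrics would be computed per fold and then averaged.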