Emotions expressed by humans can be identified from facial expressions, speech signals, or physiological signals. Among these, the use of physiological signals for emotion classification is a notable emerging area of research. Unlike facial and voice signals, a person's electrocardiogram (ECG) and galvanic skin response (GSR) signals cannot be deliberately manipulated. Moreover, wearables such as smartwatches and wristbands enable the detection of emotions in people's naturalistic environments. During the COVID-19 pandemic, detecting people's emotions was necessary to ensure that appropriate actions were taken according to the prevailing situation and to maintain societal balance. Experimentally, the duration of the emotion stimulus period and the social and non-social contexts of participants influence the emotion classification process. Hence, classifying emotions when participants undergo a longer elicitation process, while taking social context into account, needs to be explored. This work explores the classification of emotions using five pretrained convolutional neural network (CNN) models: MobileNet, NASNetMobile, DenseNet201, InceptionResNetV2, and EfficientNetB7. Continuous wavelet transform (CWT) coefficients were computed from suitably filtered ECG and GSR recordings taken from the AMIGOS database. Scalograms of the sum of frequency coefficients versus time were obtained and converted into images, and emotions were classified from these images using the pretrained CNN models. The valence and arousal classification accuracies obtained from the ECG and GSR data were, respectively, 91.27% and 91.45% with the InceptionResNetV2 classifier and 99.19% and 98.39% with the MobileNet classifier. Other studies have not explored the use of scalograms to represent ECG and GSR CWT features for emotion classification using deep learning models.
Additionally, this study provides a novel classification of emotions based on individual and group settings using ECG data. When participants watched long-duration emotion elicitation videos, both individually and in groups, the accuracy was around 99.8%; MobileNet achieved the highest accuracy and the shortest execution time. These subject-independent methods enable emotion classification that is robust to varying human behavior.
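The pipeline described above (filtered ECG/GSR signal → CWT coefficients → scalogram of summed coefficients versus time → image classification) can be illustrated with a minimal pure-Python sketch. The Morlet wavelet, scale choices, and toy sinusoidal "signal" here are illustrative assumptions rather than the study's actual parameters, and the pretrained-CNN classification stage on scalogram images is omitted:

```python
import math

def morlet(t, scale, w0=5.0):
    """Real part of a Morlet wavelet at integer offset t, dilated by scale."""
    x = t / scale
    return math.exp(-0.5 * x * x) * math.cos(w0 * x) / math.sqrt(scale)

def cwt_scalogram(signal, scales, halfwidth=4.0):
    """Direct-convolution CWT; returns |coefficients| summed over all
    scales at each time sample, i.e. a 1-D 'sum of frequency
    coefficients versus time' trace as described in the abstract."""
    n = len(signal)
    sums = [0.0] * n
    for s in scales:
        half = int(halfwidth * s)  # truncate the wavelet support
        for i in range(n):
            c = 0.0
            for k in range(-half, half + 1):
                j = i + k
                if 0 <= j < n:
                    c += signal[j] * morlet(k, s)
            sums[i] += abs(c)
    return sums

# Toy stand-in for a filtered physiological recording:
# a 5 Hz oscillation sampled at 100 Hz for 1 second.
fs = 100
sig = [math.sin(2 * math.pi * 5 * t / fs) for t in range(fs)]
scal = cwt_scalogram(sig, scales=[2, 4, 8, 16])
```

In the study itself, such traces were rendered as scalogram images and fed to pretrained CNNs; in practice one would use a library CWT (e.g. PyWavelets) and the Keras pretrained models rather than this direct-convolution sketch.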
This chapter analyzes 57 articles published since 2012 on emotion classification using biosignals such as ECG and GSR. This study should help future researchers gain insight into the emotion models, emotion elicitation and self-assessment techniques, physiological signals, pre-processing methods, feature extraction, and machine learning techniques utilized by different researchers. Most investigators have used openly available databases, while some have created their own datasets. The reviewed studies considered healthy participants of similar ages and cultural backgrounds. Fusion of ECG and GSR parameters can help improve classification accuracy, and handcrafted features fused with automatically extracted deep learning features can increase it further. Overall, deep learning and feature fusion techniques have improved classification accuracy.
In order to have a working bio-particle analysis system, a method of capturing particles from the air into a liquid is required. Here, we report a complete MEMS system that includes an air-to-liquid MEMS interface (made of glass and PDMS) for airborne bioparticle (<10 µm) analysis, and demonstrate its successful integration with our DEP (dielectrophoretic) particle transportation [1] and active filter membrane [2] technology. Two types of air-to-liquid interfaces were investigated: the first consisted of a stationary meniscus with moving particles; the second, of stationary particles with an oscillating liquid meniscus. Due to the large interfacial forces required to penetrate the liquid meniscus, the first design performed inadequately. However, these roadblocks were eliminated with the second technique, which was demonstrated as a working system.