Children with Autism Spectrum Disorder (ASD) have difficulties in socio-emotional interaction. Most of them lack the proper context for producing social responses through facial expression and speech. Since emotion is key to effective social interaction, it is vital for them to comprehend correct emotion expression and recognition. Emotion is a type of affective state and can be detected through physical reactions and physiological signals. In general, recognition of affective states from physical reactions such as facial expression and speech is often unreliable for autistic children. Hence, an alternative method of identifying the affective states through physiological signals is proposed. Though considered non-invasive, most current recognition methods require sensors to be attached to the skin to measure the signals. This would most likely cause discomfort to the children and mask their "true" affective states. The study proposed the use of the thermal imaging modality as a passive medium to analyze, nonobtrusively, the physiological signals associated with the affective states. The study hypothesized that cutaneous temperature changes, caused by pulsating blood flow in the blood vessels of the frontal face area and measured with this modality, are directly related to the different affective states of autistic children. A structured experimental setup was designed to capture thermal imaging data for different affective states induced using different sets of audio-video stimuli. A wavelet-based technique for pattern detection in time series was deployed to spot the changes measured from the region of interest. In the study, the affective state model for typically developing children aged between 5 and 9 years was used as the baseline to evaluate the performance of the affective state classifier for autistic children.
The results from the classifier showed the efficacy of the technique, achieving a classification accuracy of 88% in identifying the affective states of autistic children. The results were significant in distinguishing basic affective states, and the information could support a more effective response towards improving socio-emotional interaction amongst autistic children.

INDEX TERMS Autism, Affective States, Facial Skin Temperatures, Thermoregulation, Thermal Images, Wavelet

I. INTRODUCTION

FOR decades, numerous studies have been conducted in the field of affective state recognition, using numerous modalities and exploiting various features from the signals generated by the Autonomic Nervous System (ANS). The ANS, together with the hypothalamus, regulates blood pressure, breathing, pulse and arousal in response to different emotional states. Seeing the big potential of affective state
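As an illustrative sketch only (not the authors' implementation, whose details are not given in the abstract), a wavelet-based change detector for a region-of-interest temperature series can be built from a single-scale Haar wavelet response. Everything below — the sampling, the scale, and the synthetic temperature trace — is an assumption for demonstration:

```python
import numpy as np

def haar_change_score(x, scale):
    """Haar-wavelet response of a 1-D series at one scale.

    score[i] = mean(x[i+scale : i+2*scale]) - mean(x[i : i+scale]),
    i.e. the inner product with a (rescaled) Haar step wavelet centred
    at sample i + scale; a large |score| flags an abrupt level shift.
    """
    x = np.asarray(x, dtype=float)
    c = np.concatenate([[0.0], np.cumsum(x)])
    left = (c[scale:-scale] - c[:-2 * scale]) / scale
    right = (c[2 * scale:] - c[scale:-scale]) / scale
    return right - left

# Synthetic facial-ROI temperature trace (deg C): a 0.5-degree step at
# sample 100, mimicking a vasomotor-driven change in skin temperature.
rng = np.random.default_rng(0)
temp = np.concatenate([34.0 + 0.02 * rng.standard_normal(100),
                       34.5 + 0.02 * rng.standard_normal(100)])

score = haar_change_score(temp, scale=20)
change_at = int(np.argmax(np.abs(score))) + 20
print(change_at)  # near the induced change at sample 100
```

In practice a multi-scale (full wavelet-decomposition) version of the same idea trades localisation in time against robustness to noise; the single scale here keeps the sketch minimal.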
Abstract—Human Robot Interaction (HRI) is a multidisciplinary field that involves developing, perceiving and assessing robotic systems. In effective communication, the understanding of emotions and intentions is essential. A robotic system capable of recognizing emotional states and synthesizing proper responses would be beneficial for HRI. Human emotion recognition can be achieved through affective computing methods. Emotional state assessment of humans has traditionally been studied using various direct physiological measures and psychological self-reports. There are several measures for identifying human emotional states, such as gestures, facial images, physiological signals and neuro-imaging methods. However, some of these approaches require sizeable and expensive equipment which might hinder free movement. Recent findings show that facial cutaneous temperature and its topographic distribution exhibit specific features clearly correlated with emotional arousal and with associated measures of standard physiological signals of sympathetic activity. The results of this study indicate thermal imaging as an alternative, contactless and non-invasive method for assessing an individual's emotional arousal in psychophysiology.
The electromyography (EMG) signal is one of the most extensively utilised biological signals for predicting human motor intention, which is an essential element in human-robot collaboration platforms. Studies on motion intention prediction from EMG signals have often concentrated on either classification or regression models of muscle activity. In this study, we leverage the information from the EMG signals to detect the subject's intentions in generating motion commands for a robot-assisted upper limb rehabilitation platform. The EMG signals are recorded from ten healthy subjects' biceps muscles, and the upper limb movements evaluated are voluntary elbow flexion and extension along the sagittal plane. The signals are filtered through a fifth-order Butterworth filter. A number of features were extracted from the filtered signals, namely waveform length (WL), mean absolute value (MAV), root mean square (RMS), standard deviation (SD), minimum (MIN) and maximum (MAX). Several classifiers, viz. Linear Discriminant Analysis (LDA), Logistic Regression (LR), Decision Tree (DT), Support Vector Machine (SVM) and k-Nearest Neighbour (k-NN), were investigated for their efficacy in accurately classifying the pre-intention and intention classes based on the significant features identified (MIN and MAX) via the Extremely Randomised Tree feature selection technique. It was observed from the present investigation that the DT classifier yielded an excellent classification accuracy of 100%, 99% and 99% on the training, testing and validation datasets, respectively, based on the identified features. The findings of the present investigation are significant for facilitating the rehabilitation phase of patients based on their actual capability and hence would eventually yield more active participation from them.
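The pipeline described above — Butterworth filtering, window-level feature extraction, and a Decision Tree on the selected MIN/MAX features — can be sketched as follows. This is a hedged illustration, not the authors' code: the sampling rate, band edges, window length and the synthetic stand-in "EMG" traces are all assumptions, whereas the real study used recorded biceps EMG.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

FS = 1000  # assumed sampling rate (Hz)

def butter_bandpass(x, lo=20.0, hi=450.0, order=5, fs=FS):
    # fifth-order Butterworth band-pass, applied forwards and backwards
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def features(win):
    # the six features named in the abstract; MIN and MAX are the two
    # retained by the Extremely Randomised Tree selection step
    wl = np.sum(np.abs(np.diff(win)))   # waveform length
    mav = np.mean(np.abs(win))          # mean absolute value
    rms = np.sqrt(np.mean(win ** 2))    # root mean square
    return [wl, mav, rms, np.std(win), win.min(), win.max()]

# Synthetic stand-in for biceps EMG windows:
# low-amplitude rest (pre-intention) vs higher-amplitude burst (intention).
rng = np.random.default_rng(1)
def make_windows(amplitude, n=60, length=200):
    return [butter_bandpass(amplitude * rng.standard_normal(length))
            for _ in range(n)]

X = np.array([features(w) for w in make_windows(0.05) + make_windows(1.0)])
y = np.array([0] * 60 + [1] * 60)       # 0 = pre-intention, 1 = intention

# train on the MIN and MAX columns only, mirroring the selected features
X_tr, X_te, y_tr, y_te = train_test_split(X[:, 4:6], y, test_size=0.3,
                                          random_state=0, stratify=y)
clf = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
print(round(clf.score(X_te, y_te), 2))
```

On such cleanly separated synthetic amplitudes the tree separates the classes easily; real EMG onset detection is harder, which is why the feature selection and multi-classifier comparison in the study matter.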