Physiological signals may be used as objective markers to identify emotions, which play relevant roles in social and daily life. To measure these signals, contact-free techniques such as Infrared Thermal Imaging (IRTI) are indispensable for individuals with sensory sensitivity. The goal of this study is to propose an experimental design to analyze five emotions (disgust, fear, happiness, sadness and surprise) from facial thermal images of typically developing (TD) children aged 7–11 years using emissivity variation, as recorded by IRTI. For the emotion analysis, a dataset was built considering emotional dimensions (valence and arousal), bilateral facial sides and emotion classification accuracy. The results evidence the efficiency of the experimental design, with interesting findings such as the correlation between valence and the thermal decrement in the nose; disgust and happiness as potent triggers of facial emissivity variations; and significant emissivity variations in the nose, cheeks and periorbital regions associated with different emotions. Moreover, facial thermal asymmetry was revealed, with a distinct thermal tendency in the cheeks, and classification accuracy reached a mean value greater than 85%. From these results, emissivity variation proved an efficient marker for analyzing emotions in facial thermal images, and IRTI was confirmed to be an outstanding technique for studying emotions. This study contributes a robust dataset for analyzing the emotions of 7–11-year-old TD children, an age range for which there is a gap in the literature.
Child-Robot Interaction (CRI) has been increasingly addressed in research and applications. This work proposes a system for emotion recognition in children, recording facial images with both visual (RGB: red, green and blue) and Infrared Thermal Imaging (IRTI) cameras. For this purpose, the Viola-Jones algorithm is applied to the color images to detect facial regions of interest (ROIs), which are transferred to the thermal camera plane by multiplication with a homography matrix obtained through the calibration of the camera system. As a novelty, we propose computing the error probability of each ROI located over the thermal images, using a reference frame manually marked by a trained expert, in order to choose the ROI best placed according to the expert criteria. This selected ROI is then used to relocate the other ROIs, increasing the concordance with the manual reference annotations. Afterwards, feature extraction, dimensionality reduction through Principal Component Analysis (PCA) and pattern classification by Linear Discriminant Analysis (LDA) are applied to infer emotions. The results show that our approach to ROI location tracks facial landmarks with significantly lower errors than the traditional Viola-Jones algorithm. These ROIs proved relevant for the recognition of five emotions, specifically disgust, fear, happiness, sadness, and surprise, with our PCA- and LDA-based recognition system achieving mean accuracy (ACC) and Kappa values of 85.75% and 81.84%, respectively. As a second stage, the proposed recognition system was trained on a dataset of thermal images collected from 28 typically developing children, in order to infer one of the five basic emotions (disgust, fear, happiness, sadness, and surprise) during a child-robot interaction. The results show that our system can be integrated into a social robot to infer children's emotions during child-robot interaction.
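The RGB-to-thermal ROI transfer described above can be sketched with a standard planar homography mapping. This is a minimal illustration, not the paper's implementation: the function name `transfer_roi` is hypothetical, and the 3x3 matrix `H` stands in for the homography obtained from the camera calibration.

```python
import numpy as np

def transfer_roi(corners_rgb: np.ndarray, H: np.ndarray) -> np.ndarray:
    """Map ROI corner points from the RGB image plane to the thermal
    image plane using a 3x3 homography matrix H (from calibration)."""
    # Lift 2D points to homogeneous coordinates
    pts = np.hstack([corners_rgb, np.ones((len(corners_rgb), 1))])
    mapped = pts @ H.T                    # apply the homography
    return mapped[:, :2] / mapped[:, 2:]  # normalize by the third coordinate

# Sanity check: the identity homography leaves the ROI corners unchanged
corners = np.array([[10.0, 20.0], [110.0, 20.0], [110.0, 80.0], [10.0, 80.0]])
print(transfer_roi(corners, np.eye(3)))
```

With a calibrated `H`, each Viola-Jones bounding box detected in the color image would be transferred corner by corner into thermal-image coordinates before feature extraction.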
Introduction: Autism Spectrum Disorder (ASD) is a set of developmental disorders involving impaired social skills and a lack of interest in activities and in interaction with people. Treatments rely on teaching social skills, and robotics may aid such therapies. This pilot study aims to show the development and use of a ludic mobile robot for stimulating social skills in children with ASD. Methods: A mobile robot with a special costume and a monitor to display multimedia content was designed to interact with ASD children. A mediator controls the robot's movements in a room prepared for interactive sessions. Sessions are recorded to assess the following social skills: eye gazing, touching the robot and imitating the mediator. The interaction is evaluated using the Goal Attainment Scale and a Likert scale. Ten children were evaluated (50% with ASD), with the inclusion criteria of age 7-8 years, no use of medication, and no tendency toward aggression or stereotyped movements. Results: The ASD group touched the robot about twice as often, on average, as the control group (CG). They also looked away and imitated the mediator in much the same way as the CG, and showed additional social skills (verbal and non-verbal communication). These results are considered an advance in terms of improving social skills in ASD children. Conclusions: Our studies indicate that the robot stimulated social skills in 4/5 of the ASD children, which shows that its concept is useful for improving socialization and quality of life.
This paper presents the development of a smart walker that uses a formation controller for its displacements. The walker's sensors are encoders, a laser rangefinder and ultrasonic sensors. The control actions are based on the location of the user (human), who is the actual formation leader. There is neither a sensor attached to the user's body nor force sensors attached to the arm supports of the walker; thus, the control algorithm projects the measurements taken from the laser sensor into the user's reference frame and then calculates the walker's linear and angular velocities to keep the formation (distance and angle) with respect to the user. An algorithm was developed to detect the user's legs, whose distances from the laser sensor provide the information needed by the controller. The controller was theoretically analyzed regarding its stability, simulated and validated with real users, showing accurate performance in all experiments. In addition, safety rules check both the user's and the device's conditions, to guarantee that the user runs no risk when using the smart walker. The device is intended to help people with lower-limb mobility impairments.
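The idea of keeping a fixed distance and angle to the leader can be illustrated with a simple proportional control law. This is only a sketch under assumed gains and setpoints, not the stability-analyzed controller from the paper: `formation_control`, `d_des`, `k_v` and `k_w` are all hypothetical names chosen for the example.

```python
import math

def formation_control(d: float, theta: float,
                      d_des: float = 0.8, k_v: float = 1.0, k_w: float = 1.5):
    """Illustrative proportional formation controller.
    d:     measured distance from the walker to the user's legs [m]
    theta: bearing of the user in the walker's frame [rad]
    Returns (v, w): linear [m/s] and angular [rad/s] velocity commands."""
    v = k_v * (d - d_des) * math.cos(theta)  # reduce the distance error along the heading
    w = k_w * theta                          # rotate to keep the user centered
    return v, w

# User straight ahead at the desired distance: no motion is commanded
print(formation_control(0.8, 0.0))  # (0.0, 0.0)
```

In the actual device, `d` and `theta` would come from the leg-detection algorithm applied to the laser scan, and the safety rules would bound or override these commands.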
Recently, studies on cycling-based brain–computer interfaces (BCIs) have been standing out due to their potential for lower-limb recovery. In this scenario, the behavior of the sensorimotor rhythms and of brain connectivity presents itself as a source of information that can contribute to interpreting the cortical effect of these technologies. This study aims to analyze how sensorimotor rhythms and cortical connectivity behave when volunteers command a reactive motor imagery (MI) BCI that provides passive pedaling feedback. We studied 8 healthy subjects who performed pedaling MI to command an electroencephalography (EEG)-based BCI with a motorized pedal, receiving passive movements as feedback. The EEG data were analyzed under four conditions: resting, MI calibration, MI online, and passive pedaling (online phase). Most subjects produced significant event-related desynchronization (ERD) patterns over the foot area, around Cz, when performing MI and receiving passive pedaling. The sharpest decrease was found in the low beta band. The connectivity results revealed an exchange of information between the supplementary motor area (SMA) and parietal regions during MI and passive pedaling. Our findings point to primary motor cortex activation in most participants and to connectivity between the SMA and parietal regions during pedaling MI and passive pedaling.
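The ERD patterns mentioned above are conventionally quantified as the percent change in band power relative to a resting baseline (Pfurtscheller's classic definition). The sketch below shows that standard formula only; the function name `erd_percent` and the example power values are assumptions for illustration, not data from the study.

```python
import numpy as np

def erd_percent(task_power: np.ndarray, rest_power: np.ndarray) -> np.ndarray:
    """Event-related (de)synchronization as percent change in band power
    relative to a resting baseline. Negative values indicate
    desynchronization (ERD); positive values indicate synchronization (ERS)."""
    return (task_power - rest_power) / rest_power * 100.0

# Hypothetical low-beta power at Cz dropping from 4.0 to 3.0 uV^2 -> -25% (ERD)
print(erd_percent(np.array([3.0]), np.array([4.0])))  # [-25.]
```

In practice, `task_power` and `rest_power` would be band-passed (e.g., low beta) EEG power estimates from the MI or passive-pedaling epochs and from the resting condition, respectively.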