This paper describes a novel application of multimodal emotion recognition algorithms in software engineering. Several application scenarios are proposed, concerning program usability testing and software process improvement, and a set of emotional states relevant to that application area is identified. A multimodal emotion recognition method that integrates video and depth channels, physiological signals and input device usage patterns is proposed, and some preliminary results on learning set creation are described.
This paper concerns measurement procedures at an emotion monitoring stand designed for tracking human emotions in human-computer interaction via physiological characteristics. The paper addresses the key problem of physiological measurements being disturbed by motion typical of human-computer interaction, such as keyboard typing or mouse movements. An original experiment is described that aimed at the practical evaluation of measurement procedures performed at the emotion monitoring stand constructed at Gdansk University of Technology (GUT). Different sensor locations were considered and evaluated for suitability and measurement precision in human-computer interaction monitoring. Alternative locations (ear lobes and forearms) for skin conductance, blood volume pulse and temperature sensors were proposed and verified. The alternative locations showed correlation with the traditional ones, as well as lower sensitivity to movements such as typing or moving the mouse; they can therefore be a better solution for monitoring human-computer interaction.
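The placement comparison described above can be illustrated with a minimal sketch: given two simultaneously recorded skin conductance signals, one from the traditional finger placement and one from an alternative forearm placement, a Pearson correlation coefficient quantifies how closely the alternative placement tracks the traditional one. The signals below are synthetic placeholders, not the paper's recordings.

```python
import numpy as np

# Synthetic stand-ins for skin conductance recorded simultaneously at the
# traditional (finger) and alternative (forearm) locations: a slow tonic
# drift, with small sensor noise on the forearm channel.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 60.0, 600)                    # 60 s sampled at 10 Hz
fingers = 2.0 + 0.5 * np.sin(0.2 * t)
forearm = 1.8 + 0.45 * np.sin(0.2 * t) + 0.02 * rng.standard_normal(t.size)

# Pearson correlation between the two placements; values near 1.0 indicate
# the alternative placement reproduces the traditional signal well.
r = np.corrcoef(fingers, forearm)[0, 1]
```

In practice the same correlation would be computed on real recordings, once for a resting baseline and once during typing or mouse movement, to verify both agreement between placements and robustness to motion.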
The article presents a research study on recognizing therapy progress among children with autism spectrum disorder. The progress is recognized on the basis of behavioural data gathered via five specially designed tablet games. Over 180 distinct parameters are calculated on the basis of raw data delivered via the game flow and tablet sensors (i.e. touch screen, accelerometer and gyroscope). The results obtained confirm the possibility of recognizing progress in particular areas of development. The recognition accuracy exceeds 80%. Moreover, the study identifies a subset of parameters which appear to be better predictors of therapy progress than others. The proposed method, consisting of data recording, parameter calculation formulas and prediction models, might be implemented in a tool to support both therapists and parents of autistic children. Such a tool might be used to monitor the course of the therapy, modify it and report its results.

Autism is a complex developmental disorder that influences the ability to communicate and learn. Autism is nowadays a growing challenge, as the number of children diagnosed with autism is rising worldwide [1]. The disorder exhibits a spectrum of symptoms that might range from mild to severe in a particular case, varying from skill to skill and from child to child. This makes diagnosis and therapy progress evaluation difficult, and at the same time crucial for the effectiveness of the therapy. Early identification and proper therapy translate into a greater chance of preventing a person with autism from social exclusion. Therefore, any idea that might improve therapeutic practice is worth investigating. This paper presents one such idea, which incorporates the analysis of behavioural characteristics observed in autistic children while they play specially designed tablet games.
This interdisciplinary study combines computer science and special pedagogy by applying computer technologies and machine learning methods in the process of therapy for autistic children.

Computer technologies may support both the diagnosis of autism and the related therapy, although most solutions so far refer to the process of therapy. There are numerous applications designed for individuals with autism. These applications focus on particular issues by teaching specific skills, e.g. expressing needs, learning certain behaviours [2], improving verbal communication, answering questions, interacting with other people in typical situations [3], and recognizing and expressing emotions [4-6]. These solutions take advantage of the fact that children with autism are usually enthusiastic about tasks supported by computer technology, which offers a predictable framework without causing stress [2]. Another group of tools is designed for therapists. Among the most commonly used solutions are computer versions of standardized questionnaires that evaluate an individual's state [7]. Another popular way of supporting the therapy is to provide the experts with video recordings of the children's behaviour. However, there are also some...
Featured Application: (1) when you need emotions described in one representation model but an affect recognition system returns results in another form; (2) in a late fusion of hypotheses on affect from diverse algorithms (in multimodal emotion recognition); (3) in the evaluation of mappings between emotion representation models.

Abstract: There are several models for representing emotions in affect-aware applications, and available emotion recognition solutions provide results using diverse emotion models. As multimodal fusion is beneficial in terms of both accuracy and reliability of emotion recognition, one of the challenges is mapping between models of affect representation. This paper addresses this issue by proposing a procedure for elaborating new mappings, recommending a set of metrics for evaluating mapping accuracy, and delivering new mapping matrices for estimating the dimensions of the Pleasure-Arousal-Dominance (PAD) model from Ekman's six basic emotions. The results are based on an analysis using three datasets that were constructed from affect-annotated lexicons. The new mappings were obtained with linear regression learning methods and showed better results on the datasets than the state-of-the-art matrix. The procedure, as well as the proposed metrics, might be used not only in the evaluation of mappings between representation models, but also in the comparison of emotion recognition and annotation results. Moreover, the datasets are published along with the paper, so new mappings might be created and evaluated using the proposed methods. The study results might be interesting for both researchers and developers who aim to extend their software solutions with affect recognition techniques.
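The kind of mapping the abstract describes can be sketched as a linear transform: a 3×6 matrix (plus an offset) maps a vector of six basic-emotion intensities onto the three PAD dimensions. The coefficients below are illustrative placeholders, not the matrices published with the paper.

```python
import numpy as np

# Hypothetical 3x6 mapping matrix. Rows: Pleasure, Arousal, Dominance;
# columns: anger, disgust, fear, joy, sadness, surprise. The values are
# made up for illustration only.
M = np.array([
    [-0.5, -0.4, -0.4,  0.8, -0.6,  0.1],
    [ 0.6,  0.2,  0.7,  0.5, -0.3,  0.8],
    [ 0.3,  0.1, -0.6,  0.4, -0.4, -0.2],
])
bias = np.array([0.1, 0.0, 0.05])  # hypothetical intercept terms

def ekman_to_pad(intensities):
    """Map six basic-emotion intensities (0..1) to a PAD estimate."""
    e = np.asarray(intensities, dtype=float)
    return M @ e + bias

# Example: a recognizer reporting pure joy maps to high pleasure.
pad = ekman_to_pad([0.0, 0.0, 0.0, 1.0, 0.0, 0.0])
```

In the study, matrices of this shape were learned by linear regression on affect-annotated lexicon data; a new mapping can be evaluated by comparing its PAD estimates against ground-truth annotations with the proposed metrics.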
Abstract-This paper concerns the design and physical construction of an emotion monitor stand for tracking human emotions in Human-Computer Interaction using a multi-modal approach. The concept of the stand, using cameras, behavioural analysis tools and a set of physiological sensors (galvanic skin response, blood volume pulse, temperature, breath and electromyography), is presented, followed by details of the Emotion Monitor construction at Gdansk University of Technology. Some experiments already held at the stand are reported, providing observations on the reliability, accuracy and value the stand might offer in human-systems interaction evaluation. The lessons learned at this particular stand might be interesting for other researchers aiming at emotion monitoring in human-systems interaction.