In this work, we present methods for personalizing a system that continuously estimates pain intensity from bio-physiological channels. We investigate various ways to estimate the similarity between persons and to retrieve the most informative ones using meta information, personality traits, and machine learning techniques. Given this information, specialized classifiers can be created that are both more efficient, in terms of model complexity and training time, and more accurate than classifiers trained on the complete data. To capture as much information as possible from the different bio-physiological channels, we cover a broad spectrum of feature extraction algorithms. Furthermore, we show that the system is capable of running in real time and discuss issues that arise when dealing with incremental data processing. In extensive experiments we verify the validity of our approach.
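The person-similarity step described above can be illustrated with a minimal sketch: given one feature vector per person (built from meta information or personality traits), rank all other persons by their distance to a target person and keep the closest ones as training candidates for a specialized classifier. The trait vectors and function name below are hypothetical and are not taken from the paper.

```python
import numpy as np

def most_similar_persons(traits, target_idx, k=2):
    """Rank persons by Euclidean distance between their trait vectors
    and return the indices of the k most similar ones (excluding the
    target person itself)."""
    target = traits[target_idx]
    dists = np.linalg.norm(traits - target, axis=1)
    dists[target_idx] = np.inf  # never select the target itself
    return np.argsort(dists)[:k]

# Hypothetical trait vectors for four persons (one row per person)
traits = np.array([
    [0.90, 0.10, 0.50],
    [0.80, 0.20, 0.40],
    [0.10, 0.90, 0.90],
    [0.85, 0.15, 0.45],
])
print(most_similar_persons(traits, target_idx=0, k=2))  # → [3 1]
```

A specialized classifier would then be trained only on the data of the returned persons, which is what makes it cheaper to train than one fitted on the complete data set.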
Abstract. Research activities in the field of human-computer interaction have increasingly addressed the aspect of integrating some type of emotional intelligence. Human emotions are expressed through different modalities such as speech, facial expressions, and hand or body gestures; therefore, the classification of human emotions should be considered a multimodal pattern recognition problem. The aim of our paper is to investigate multiple classifier systems utilizing audio and visual features to classify human emotional states. For this purpose, a variety of features has been derived. From the audio signal, the fundamental frequency, LPC and MFCC coefficients, and RASTA-PLP features have been used. In addition, two types of visual features have been computed, namely form and motion features of intermediate complexity. The numerical evaluation has been performed on the four emotional labels Arousal, Expectancy, Power, and Valence as defined in the AVEC data set. As classifier architectures, multiple classifier systems are applied, which have been proven to be accurate and robust against missing and noisy data.
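Of the audio features listed above, the fundamental frequency is the simplest to illustrate. The sketch below estimates F0 for a single voiced frame with the classic autocorrelation method; this is a generic illustration of the feature, not the paper's actual extraction pipeline, and the frame length and search range are assumptions.

```python
import numpy as np

def fundamental_frequency(frame, sr, fmin=50.0, fmax=500.0):
    """Estimate the fundamental frequency (F0) of a voiced frame:
    compute the autocorrelation, find the strongest peak among lags
    corresponding to the plausible pitch range, and convert the
    winning lag back to Hz."""
    frame = frame - frame.mean()
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = int(sr / fmax), int(sr / fmin)  # lag bounds for [fmin, fmax]
    lag = lo + np.argmax(ac[lo:hi])
    return sr / lag

# Synthetic 220 Hz tone, 100 ms frame at 16 kHz
sr = 16000
t = np.arange(sr // 10) / sr
frame = np.sin(2 * np.pi * 220.0 * t)
print(fundamental_frequency(frame, sr))  # close to 220 Hz
```

MFCC, LPC, and RASTA-PLP features follow the same frame-wise pattern, producing one feature vector per analysis window that the classifier systems then consume.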