This paper presents a new Hidden Markov Model (HMM) based approach for fast, automatic detection and classification of head movements in dynamic videos in real time. The model was developed for human-computer interaction applications using only a laptop webcam. It can predict both single and combined head movements simultaneously with fast response. Whereas earlier models focused mainly on classifying head nods and shakes, our model also covers other head movements. The proposed model requires no user intervention or prior knowledge of its environment, tolerates illumination changes and occlusions, and places no restrictions on the range of head movements. The model achieved significant results and efficient performance when tested on unseen data, with accuracies of 94%, 99%, 83%, 87%, 93%, and 96% for the head gestures (rest, nod, turn, shake, tilt, and tilting), respectively. In addition, accuracy was 99% for combined cues and 88% for single cues. The aim of this model is to provide a fast application for inferring and predicting human emotions and affective states in real time from head gestures.
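As a rough illustration of the classification scheme described above, one common HMM-based recipe is to train one model per gesture and label an incoming observation sequence with the gesture whose model assigns it the highest likelihood. The sketch below implements the standard forward algorithm over toy, hand-picked parameters; the state counts, motion symbols, and probabilities are illustrative placeholders, not the paper's trained values.

```python
import math

def log_forward(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM
    with initial distribution pi, transition matrix A, emission matrix B."""
    n = len(pi)
    # Initialise forward variables with the first observation.
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    for o in obs[1:]:
        # Standard forward recursion: propagate through A, weight by emission.
        alpha = [sum(alpha[j] * A[j][i] for j in range(n)) * B[i][o]
                 for i in range(n)]
    return math.log(sum(alpha))

# Two toy 2-state gesture models over 3 quantised motion symbols
# (0 = still, 1 = mostly vertical motion, 2 = mostly horizontal motion).
# "nod" favours symbol 1, "shake" favours symbol 2.
models = {
    "nod":   ([0.9, 0.1],
              [[0.7, 0.3], [0.4, 0.6]],
              [[0.6, 0.3, 0.1], [0.1, 0.8, 0.1]]),
    "shake": ([0.9, 0.1],
              [[0.7, 0.3], [0.4, 0.6]],
              [[0.6, 0.1, 0.3], [0.1, 0.1, 0.8]]),
}

def classify(obs):
    """Pick the gesture whose HMM best explains the sequence."""
    return max(models, key=lambda g: log_forward(obs, *models[g]))
```

For example, `classify([0, 1, 1, 1, 0])` returns `"nod"` because the repeated vertical-motion symbols are far more probable under the nod model's emissions. A real system would quantise webcam head-pose trajectories into such symbols and learn the parameters (e.g. via Baum-Welch) rather than hard-coding them.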
Any viable algorithm for inferring the affective states of individuals with autism requires natural, reliable data gathered in real time in an uncontrolled environment. To this end, this study provides a new natural, spontaneous affective-cognitive dataset based on facial expressions, eye gaze, and head movements for adult students with and without Asperger syndrome (AS). Data gathering and collection in computer-based learning environments is a significant area that has attracted researchers' attention in affective computing applications. Because emotions strongly affect students' learning outcomes and performance, the dataset includes a range of affective-cognitive states that goes beyond basic emotions. This study reports the methodology used for data collection and annotation, summarizes and compares other available datasets, and presents the results in detail. It also discusses some challenges inherent to this study.
Recently, many studies have addressed the recognition and detection of emotions in people with autism. The main goal of this paper is to survey studies concerned with the emotional states of people with autism. The survey has two parts. The first focuses on studies that use facial expressions to recognize and detect emotions, since facial expressions are an effective and important means of expressing different patterns of emotion. The second part focuses on technical methods, such as machine learning, deep learning, and other algorithms, that are employed to analyze and determine the facial behaviors of people with autism. To identify the optimal solution, this paper compares current emotion-detection systems.