<p><b>Emotion categorization has become an increasingly important area of research due to the rising number of intelligent systems. Artificial classifiers have demonstrated limited competency in classifying different emotions, yet they have been widely used in recent years to facilitate the task of emotion categorization. Human classifiers, meanwhile, require time and often find it hard to agree with each other on facial expression categorization tasks. Hence, this thesis considers how the combination of human and artificial classifiers can lead to improvements in emotion classification. Further, as emotions are not only communicative tools reflected on the face, this thesis also investigates how emotions are reflected in the body and how they can affect the decision-making process.</b></p>
<p>Existing methods of emotion categorization from visual data using deep learning algorithms analyze emotion by representing knowledge in a homogeneous way. As a result, a small change to the input image, made by an adversary or caused by occlusion of part of the face, can result in a large decrease in the accuracy of these algorithms.</p>
<p>The proposed thesis is that an artificial system designed based on neuro-scientific theory or the natural way humans categorize emotions, e.g. by considering the mouth, eyes, jaw, etc., can obtain better accuracy in emotion categorization than existing state-of-the-art techniques, which rely on homogeneous knowledge representation, i.e. they consider the color of every pixel in an image as equally important. The comprehensive goal is to create different emotion categorization methods, inspired by neuro-scientific processes and the way humans categorize emotion, and to test them on different emotional facial expression datasets. In addition, this thesis investigates how emotions are reflected in the body, i.e. in skin conductance and heart rate, and analyzes what people want to feel and how that can affect their decision-making. Understanding what people want to feel, and how that affects their decision-making, can lead to the production of artificially intelligent systems for real-world situations.</p>
<p>The first academic contribution comprises novel methods that link and transfer knowledge learned from the strategies of the brain and the way humans categorize emotions. The second academic contribution comprises novel approaches that predict, and identify the patterns that contribute to predicting, heart rate and skin conductance in the context of emotion. The last academic contribution comprises the finding that emotionally-aware systems make better and more relevant decisions in shared workspaces than non-emotionally-aware ones.</p>
<p>This thesis proposes a landmark-based method for categorizing emotion from images. The novel landmark-based method extracts facial features that are resistant to attacks by identifying salient features related to emotion regardless of the size, shape, and proportions of the face exhibiting the emotion. Experimental results showed that the novel landmark-based method achieves better accuracy (> 97%) and lower computational cost (< 46 mins) than state-of-the-art methods (< 91% accuracy and > 119 hrs, respectively) for both in-distribution and out-of-distribution analysis.</p>
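<p>As a concrete illustration of how such landmark features can be made invariant to the size and proportions of a face, the following is a minimal Python sketch, assuming 68-point landmarks from a standard detector; the eye-corner indices and the pairwise-distance feature set are illustrative assumptions, not the thesis's exact method.</p>
<pre><code>import numpy as np

def landmark_features(landmarks):
    """Pairwise distances between facial landmarks, normalized by the
    inter-ocular distance so that the features depend on the expression
    rather than the scale or position of the face."""
    # Hypothetical outer eye-corner indices in a 68-point landmark scheme.
    LEFT_EYE, RIGHT_EYE = 36, 45
    inter_ocular = np.linalg.norm(landmarks[LEFT_EYE] - landmarks[RIGHT_EYE])
    n = len(landmarks)
    feats = [np.linalg.norm(landmarks[i] - landmarks[j]) / inter_ocular
             for i in range(n) for j in range(i + 1, n)]
    return np.asarray(feats)

# Random (x, y) points stand in for real detector output.
rng = np.random.default_rng(0)
print(landmark_features(rng.random((68, 2))).shape)  # (2278,)
</code></pre>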
<p>Furthermore, this thesis created a lateralized system by adapting the lateralized framework for computer vision, considering both constituent (e.g. mouth, eyes, jaw) and holistic (whole face) features to categorize emotion from images. The novel system successfully exhibited robustness against adversarial attacks by applying lateralization. The ability to simultaneously consider the parts of the face (constituent level) and the whole face (holistic level) empowers the lateralized system to correctly classify emotions and show stronger resistance to changes (10.86–47.72% accuracy decrease) in comparison to the state-of-the-art methods (25.15–83.43% decrease).</p>
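<p>The following minimal sketch shows one way constituent-level and holistic-level predictions could be reconciled; the confidence-based fusion rule is an illustrative assumption rather than the thesis's exact lateralization mechanism.</p>
<pre><code>import numpy as np

def lateralized_predict(constituent_probs, holistic_probs):
    """Combine part-based (mouth, eyes, jaw) and whole-face predictions:
    trust whichever level is more confident about the current image."""
    # constituent_probs: class probabilities pooled over the part models
    # holistic_probs:    class probabilities from a whole-face model
    if constituent_probs.max() >= holistic_probs.max():
        return int(constituent_probs.argmax())
    return int(holistic_probs.argmax())

# Toy case: an adversarial change leaves the holistic model unsure, but the
# constituent level remains confident, so its prediction wins.
print(lateralized_predict(np.array([0.1, 0.8, 0.1]),
                          np.array([0.4, 0.35, 0.25])))  # 1
</code></pre>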
<p>This thesis also proposes a novel lateralized landmark-based method, mirroring the lateralized systems of the human brain, to categorize emotion from images. The novel system successfully exhibited robustness against attacks by using emotion-relevant features from the face exhibiting the emotion (instead of raw pixel colors), as well as by simultaneously considering both constituent-level and holistic-level predictions. The novel hybrid method was shown to suffer a significantly smaller accuracy decrease (< 26%) than various state-of-the-art methods (> 67% decrease) when tested on datasets with sufficient data to cover common situations.</p>
<p>Both the landmark-based and lateralized systems categorize emotion from full facial images. However, the use of partial face coverings such as sunglasses and face masks, which are becoming very common nowadays, unintentionally obscures facial expressions, causing a loss of accuracy when humans and computer systems attempt to categorize emotion. With the rise of soft computing techniques interacting with humans, it is important to know not just their accuracy, but also the confusion errors being made: do humans make fewer random or damaging errors than soft computing? Therefore, this thesis compared the accuracy of humans and computer systems in categorizing emotion from faces partially covered with sunglasses and face masks. The results suggest that although the accuracy of both humans and computer systems decreases when the face is partially covered with sunglasses and face masks, machine learning classifiers are far more greatly impacted (> 74% decrease) than humans (< 26%).</p>
<p>This thesis further proposes the first attention-based method to improve the classification accuracy of benchmark machine learning classifiers when categorizing emotion from images of faces partially covered with sunglasses and face masks, by paying more attention to the uncovered regions of the face. The ability to detect the occluded regions of the face based on the covering type and to pay more attention to the uncovered regions empowers the novel attention-based method to correctly classify emotions from partially covered faces. Experimental results showed that the novel attention-based method outperforms the benchmark approaches by a significant margin (up to a 50.26% increase in accuracy).</p>
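<p>A minimal sketch of the attention idea follows: the occluded part of the face is down-weighted before features are extracted, so the classifier relies on the visible regions. The fixed 50/50 split and the weight values are illustrative assumptions; the thesis's method detects the occluded region per covering type.</p>
<pre><code>import numpy as np

def attention_weighted(image, covering):
    """Down-weight the occluded region of a face image so downstream
    features come mostly from the uncovered region."""
    h = image.shape[0]
    weights = np.ones_like(image, dtype=float)
    if covering == "sunglasses":    # upper face occluded
        weights[: h // 2] = 0.1
    elif covering == "face_mask":   # lower face occluded
        weights[h // 2 :] = 0.1
    return image * weights

# A 4x4 toy "face": the masked rows contribute little after weighting.
print(attention_weighted(np.ones((4, 4)), "face_mask"))
</code></pre>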
<p>Moving to broader human expression, this thesis determines how emotions are reflected in the body by analyzing moment-by-moment brain activity to predict the emotional arousal-related autonomic nervous responses of participants as they watched emotion-provoking videos. The results suggest that predicting continuous autonomic responses such as heart rate and galvanic skin response requires an approach capable of learning dependencies or performing sequential feature selection to improve prediction performance. The results also suggest that specific brain regions and peripheral measures support differential processing of heart rate and galvanic skin response, as the prediction error was significantly reduced by using only a small subset of features rather than all features in the dataset.</p>
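<p>The finding that a small feature subset outperforms the full feature set can be reproduced in spirit with greedy sequential feature selection; the sketch below uses synthetic data and scikit-learn's SequentialFeatureSelector as a stand-in for the thesis's dataset and models.</p>
<pre><code>from sklearn.datasets import make_regression
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import Ridge

# Synthetic stand-in for moment-by-moment brain and peripheral features
# predicting a continuous autonomic response such as heart rate.
X, y = make_regression(n_samples=200, n_features=30, n_informative=5,
                       noise=5.0, random_state=0)

# Greedy forward selection keeps only the few features that actually
# reduce cross-validated prediction error.
selector = SequentialFeatureSelector(Ridge(), n_features_to_select=5,
                                     direction="forward", cv=5)
selector.fit(X, y)
print(selector.get_support().nonzero()[0])  # indices of selected features
</code></pre>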
<p>Lastly, this thesis investigates what people want to feel, and determines the impact of emotion on socio-economic decision-making. The results suggest that participants predominantly preferred experiencing happiness over anger, even when they expected anger to be beneficial for goal pursuit; however, this preference for happiness weakened when they were told anger would be beneficial. The results demonstrate that experiencing a specific emotion (either anger or happiness) does not in itself promote successful confrontation performance. Nevertheless, further findings from the study suggest that emotion-awareness is an important factor in determining participants' performance, as participants higher in emotion-awareness performed significantly better than participants lower in emotion-awareness.</p>
<p>In conclusion, this thesis develops a range of heterogeneous and attention-based methods mimicking human cognitive processing, analyzes how emotion is reflected in the body, and examines what people want to feel and how that can affect their decision-making. The overall results showed that heterogeneous and attention-based methods can achieve better accuracy in emotion categorization tasks than existing methods that treat all pixels equally. The results also showed that specific moment-by-moment brain activity captured in the context of emotion supports differential processing of heart rate and galvanic skin response. Further, the results suggest that emotionally-aware artificially intelligent systems, if produced, can make better and more relevant decisions in shared workspaces. This will lead to the development of artificial affective decision-making techniques, especially suited to dynamic and uncertain domains that elicit emotions in humans.</p>