We consider the problem of detecting anomalies in the directional distribution of fibre materials observed in 3D images. We divide the image into a set of scanning windows and classify them into two clusters: homogeneous material and anomaly. Based on a sample of estimated local fibre directions, for each scanning window we compute several classification attributes, namely the coordinate-wise means of local fibre directions, the entropy of the directional distribution, and a combination of the two. We also propose a new spatial modification of the Stochastic Approximation Expectation-Maximization (SAEM) algorithm. Besides clustering, we also consider testing the significance of anomalies. To this end, we apply a change-point technique for random fields and derive exact inequalities for the tail probabilities of the test statistic. The proposed methodology is first validated on simulated images. Finally, it is applied to a 3D image of a fibre-reinforced polymer.
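The per-window attributes named above (coordinate-wise means of local fibre directions and the entropy of the directional distribution) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the binning of spherical angles and the plug-in entropy estimator are assumptions made here for concreteness.

```python
import numpy as np

def window_attributes(directions, n_bins=10):
    """Classification attributes for one scanning window.

    directions: (n, 3) array of unit vectors (estimated local fibre
    directions). Returns the coordinate-wise means and a plug-in
    Shannon-entropy estimate of the directional distribution,
    computed from a 2D histogram of spherical angles (the binning
    scheme is an illustrative assumption).
    """
    directions = np.asarray(directions, dtype=float)
    means = directions.mean(axis=0)  # coordinate-wise means

    # Spherical coordinates of each direction vector
    theta = np.arccos(np.clip(directions[:, 2], -1.0, 1.0))  # polar angle
    phi = np.arctan2(directions[:, 1], directions[:, 0])     # azimuth

    # 2D histogram as a plug-in estimate of the directional distribution
    hist, _, _ = np.histogram2d(theta, phi, bins=n_bins,
                                range=[[0.0, np.pi], [-np.pi, np.pi]])
    p = hist.ravel() / hist.sum()
    p = p[p > 0]
    entropy = -np.sum(p * np.log(p))  # Shannon entropy of bin frequencies

    return means, entropy
```

A window of nearly parallel fibres yields low entropy, while isotropically oriented fibres yield high entropy, which is what makes the attribute useful for separating homogeneous material from anomalies.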
As emotions play a central role in human communication, automatic emotion recognition has attracted increasing attention over the last two decades. While multimodal systems achieve high performance on lab-controlled data, they are still far from ecological validity on non-lab-controlled, i.e. "in-the-wild", data. This work investigates audiovisual deep learning approaches to the in-the-wild emotion recognition problem. Inspired by the strong performance of end-to-end and transfer learning techniques, we explored architectures in which a modality-specific Convolutional Neural Network (CNN) is followed by a Long Short-Term Memory Recurrent Neural Network (LSTM-RNN), using the AffWild2 dataset under the Affective Behavior Analysis in-the-Wild (ABAW) challenge protocol. We deployed unimodal end-to-end and transfer learning approaches within a multimodal fusion system, which generated final predictions using a weighted score fusion scheme. With the proposed deep-learning-based multimodal system, we reached a test-set challenge performance measure of 48.1% on the ABAW 2020 Facial Expressions challenge, surpassing the first runner-up's performance.
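The weighted score fusion step described above can be sketched as follows. This is a minimal, hedged illustration: the two-modality setup and the specific weights are assumptions for the example; in practice the per-modality weights would be tuned on a validation set.

```python
import numpy as np

def weighted_score_fusion(modality_scores, weights):
    """Late fusion of per-modality class scores.

    modality_scores: list of (n_samples, n_classes) score arrays, one
    per modality (e.g. an audio CNN-LSTM and a visual CNN-LSTM).
    weights: one weight per modality (illustrative; normally selected
    on held-out data). Returns predicted class indices per sample.
    """
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()  # normalise weights to sum to 1
    fused = sum(w * np.asarray(s, dtype=float)
                for w, s in zip(weights, modality_scores))
    return fused.argmax(axis=1)  # final class decision per sample
```

The design choice here is late (decision-level) fusion: each unimodal network produces class scores independently, and only the scores are combined, which keeps the modality-specific models decoupled.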