Emotions associated with different textures during touch

Haptics plays an important role in emotion perception. However, most studies of the affective aspects of haptics have investigated emotional valence rather than discrete emotional categories. In the present study, we explored the associations of different textures with six basic emotions: fear, anger, happiness, disgust, sadness and surprise. Participants touched twenty-one different textures and evaluated them on six emotional scales. Additionally, we explored whether individual differences in participants' levels of alexithymia are related to the intensity of the emotions associated with touching the textures. Alexithymia is a trait characterized by difficulties in identifying, describing and communicating emotions to others. The findings show that people associate touching different textures with distinct emotions, and textures associated with each of the basic emotions were identified. The study also revealed that a higher alexithymia level corresponds to a higher intensity of associations between textures and the emotions of disgust, anger and sadness.
Recent studies suggest that video recordings of human facial expressions are perceived differently from linear morphs between the first and last frames of these recordings. Also, observers can differentiate dynamic expressions presented in normal versus time-reversed frame order. To date, the simultaneous influence of dynamics (natural or linear) and timeline (normal or reversed) has not been tested on a wide range of dynamic emotional expressions and the transitions between them. We compared the perception of dynamic transitions between basic emotions in realistic (human-posed) and artificial (linearly morphed) stimuli, presented in reversed or non-reversed order. The nonlinearity of the realistic stimuli was demonstrated by automated facial structure analysis. The results of the behavioral study revealed that the recognition of emotions in time-reversed stimuli differed significantly from recognition of the normally presented ones, and this difference was substantially larger for videos of a dynamic human face than for linear morphs. Emotions displayed at the end of the transitions were recognized better than the first-frame emotions in all types of stimuli except the time-reversed videos, which showed a similar recognition rate for both the starting and ending emotions. Our findings suggest that nonlinearity, which is present in a realistic facial display but absent in linear morphing, is an important cue for emotion perception, and that unnatural perceptual conditions (inversion in time) make the recognition of emotions more difficult. These results confirm the ability of the human visual system to use subtle dynamic cues on an interlocutor's face, and reveal its sensitivity to the timeline organization of displayed emotions.
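The linear morphs referred to above are, in essence, frame-by-frame interpolations between the first and last images of a recording. Below is a minimal sketch of that idea, assuming simple pixel-wise interpolation in NumPy; real morphing pipelines typically also interpolate facial landmark positions, and this is not the authors' actual stimulus-generation code:

```python
import numpy as np

def linear_morph(first_frame: np.ndarray, last_frame: np.ndarray,
                 n_frames: int) -> np.ndarray:
    """Cross-dissolve between two frames: every intermediate frame is a
    pixel-wise weighted average, so change unfolds at a constant rate."""
    weights = np.linspace(0.0, 1.0, n_frames)
    return np.stack([(1.0 - w) * first_frame + w * last_frame
                     for w in weights])

# Toy example: two 2x2 grayscale "frames", morphed in five steps.
start, end = np.zeros((2, 2)), np.ones((2, 2))
sequence = linear_morph(start, end, n_frames=5)
print(sequence[:, 0, 0])  # [0. 0.25 0.5 0.75 1.] -- a strictly linear trajectory
```

The constant-rate trajectory is exactly the nonlinearity that natural facial motion lacks, which is what the automated facial structure analysis in the study was designed to demonstrate.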
This study investigates systematic links between haptic perception and multimodal cinema perception. It differs from previous research on cross-modal associations in that it focuses on a complex intermodal stimulus close to what people experience in reality: cinema. Participants chose the materials most/least consistent with three-minute samples of films containing elements of beauty and ugliness. We found that specific materials were associated with certain films at rates significantly above chance. Silk was associated with films including elements of beauty, while sandpaper was associated with films including elements of ugliness. To investigate the nature of this phenomenon, we tested the mediation effect of emotional/semantic representations on cinema–haptic associations. We found that affective representations at least partly explain the cross-modal associations between films and materials.
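The mediation test mentioned above can be sketched with the classic product-of-coefficients logic: the indirect effect of the film on the haptic judgment via the affective rating is the product of the film-to-rating and rating-to-judgment paths. The following illustration uses synthetic data and hypothetical variable codings, not the authors' actual measures or analysis:

```python
import numpy as np

# Synthetic data (illustration only): X = film type (0 = elements of
# ugliness, 1 = elements of beauty), M = affective rating of the film,
# Y = rated consistency of a material (e.g., silk) with the film.
rng = np.random.default_rng(0)
n = 200
X = rng.integers(0, 2, n).astype(float)
M = 0.8 * X + rng.normal(0.0, 1.0, n)            # film drives the affective rating
Y = 0.6 * M + 0.1 * X + rng.normal(0.0, 1.0, n)  # rating carries most of the effect

def ols(y, *predictors):
    """Least-squares coefficients for y ~ intercept + predictors."""
    design = np.column_stack([np.ones(len(y)), *predictors])
    return np.linalg.lstsq(design, y, rcond=None)[0]

c = ols(Y, X)[1]                # total effect of film on the haptic judgment
a = ols(M, X)[1]                # path a: film -> affective representation
b, c_prime = ols(Y, M, X)[1:3]  # path b and the direct effect c'

print(f"total={c:.2f}  direct={c_prime:.2f}  indirect={a * b:.2f}")
# A direct effect much smaller than the total effect indicates partial mediation.
```

A finding of "at least partial" mediation, as reported above, corresponds to a substantial indirect effect alongside a reduced but nonzero direct effect.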
We present three experiments investigating perceptual adaptation to dynamic facial emotional expressions. Dynamic expressions of six basic emotions were obtained by video recording a poser's face. In Experiment 1, participants (n=20) evaluated the intensity of the six emotions and the neutral state, as well as the genuineness and naturalness of the dynamic expressions. The validated stimuli were then used as adaptors in Experiments 2 and 3, which aimed to explore the structure of the perceptual space of facial expressions via adaptation effects. In Experiment 2, participants (n=16) categorized neutral/emotion morphs after adaptation to dynamic expressions. In Experiment 3 (n=26), the first-stage task was to categorize static frames derived from the video recordings of the poser. Next, individual psychometric functions were fitted for each participant and each emotion to find the frame at which the emotion was recognized correctly in 50% of trials. These images were then presented at the second stage, in the adaptation experiment, with the dynamic video recordings as adaptors. Across the three experiments, we found that facial expressions of happiness and sadness are perceived as opponent emotions and mutually facilitate the recognition of each other, whereas disgust and anger, and fear and surprise, are perceptually similar and reduce the recognition accuracy of each other. We describe the categorical fields of dynamic facial expressions and of static images of the initial phases of expression development. The obtained results suggest that dimensional and categorical approaches to the perception of emotions are not mutually exclusive and probably describe different stages of face information processing. The study was supported by the Russian Foundation for Basic Research, project № 15-36-01281 “Structure of dynamic facial expressions perception”.
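The threshold-finding step described above amounts to fitting a psychometric function per participant and emotion, then reading off its 50% point. Here is a minimal sketch using a logistic function and hypothetical response data; the abstract does not specify the functional form the authors actually fitted:

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(frame, threshold, slope):
    """Psychometric function: probability of correct recognition as a
    function of frame index (i.e., how far the expression has developed)."""
    return 1.0 / (1.0 + np.exp(-slope * (frame - threshold)))

# Hypothetical responses of one participant for one emotion: proportion of
# trials in which each static frame was categorized correctly.
frames = np.arange(1, 11)
p_correct = np.array([0.05, 0.08, 0.15, 0.30, 0.45,
                      0.60, 0.75, 0.85, 0.92, 0.97])

(threshold, slope), _ = curve_fit(logistic, frames, p_correct, p0=[5.0, 1.0])
print(f"50%-recognition frame: {threshold:.1f}")  # frame selected as test stimulus
```

By construction, the logistic passes through 0.5 at its threshold parameter, so the fitted threshold directly identifies the frame used as the ambiguous test image in the adaptation stage.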