This paper deals with haptic affective social interaction during a greeting handshake between a human and a humanoid robot. The goal of this work is to study how haptic interaction conveys emotions and, more precisely, how it influences the perception of the emotional dimensions expressed through the robot's facial expressions. Moreover, we examine the benefits of multimodality (i.e., visuo-haptic) over monomodality (i.e., visual-only and haptic-only). The experimental results with the Meka robot show that the multimodal conditions combining high values of grasping force and joint stiffness are rated higher on the arousal and dominance dimensions than the visual-only condition. Furthermore, the results for the monomodal haptic condition showed that participants discriminated well between the dominance and arousal dimensions of haptic behaviours with low versus high values of grasping force and joint stiffness.
In the context of designing multimodal social interactions for Human–Computer Interaction and Computer-Mediated Communication, we conducted an experimental study to investigate how participants combine voice expressions with tactile stimulation when evaluating emotional valence (EV). In this study, audio and tactile stimuli were first presented separately and then together. The audio stimuli comprised positive and negative voice expressions, and the tactile stimuli consisted of different levels of air-jet stimulation applied to the participants' arms. Participants were asked to rate the communicated EV on a continuous scale. Information Integration Theory was used to model the multimodal valence perception process. Analyses showed that participants generally integrated both sources of information when evaluating EV. The main integration rule was an averaging rule, and the predominance of one modality over the other was specific to each individual.
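The averaging rule from Information Integration Theory can be illustrated with a minimal sketch. The function below is hypothetical (not from the study): it combines two unimodal valence ratings into a weighted average, where the per-modality weights stand in for the individual-specific predominance of one modality over the other. The weight values and the valence scale are illustrative assumptions.

```python
def averaged_valence(audio_valence, tactile_valence,
                     w_audio=0.6, w_tactile=0.4):
    """Combine two unimodal valence ratings (e.g. on a -1..1 scale)
    using an averaging integration rule with per-modality weights.

    Hypothetical sketch: weights model a participant's individual
    predominance of one modality; values here are illustrative.
    """
    total = w_audio + w_tactile
    return (w_audio * audio_valence + w_tactile * tactile_valence) / total

# Example: a positive voice expression (0.8) paired with a mildly
# negative tactile stimulus (-0.2) yields a moderately positive
# multimodal valence estimate.
v = averaged_valence(0.8, -0.2)  # → 0.4
```

An averaging rule (rather than an additive one) keeps the combined estimate within the range of the unimodal inputs, which matches the bounded continuous scale participants used.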