This paper deals with a multimodal annotation scheme dedicated to the study of gestures in interpersonal communication, with particular regard to the role played by multimodal expressions for feedback, turn management and sequencing. The scheme has been developed under the framework of the MUMIN network and tested on the analysis of multimodal behaviour in short video clips in Swedish, Finnish and Danish. The preliminary results obtained in these studies show that the reliability of the categories defined in the scheme is acceptable, and that the scheme as a whole constitutes a versatile analysis tool for the study of multimodal communication behaviour.
The results of a comparison between three speech types uttered by subjects in a collaborative interlingual task mediated by an automatic speech-to-speech translation system are reported here: On-Talk (speaking to a computer), Off-Talk Self (speaking to oneself), and Off-Talk Other (speaking to another person). The three speech types differ significantly in speech rate (F(2, 2719) = 101.7; p < 2e-16), and a detection method was therefore implemented to test whether they can also be identified with good accuracy on the basis of their acoustic and biological characteristics. Acoustic and biological measures perform well in distinguishing On-Talk from Off-Talk, but have difficulty separating the two Off-Talk sub-categories, Self and Other.
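The reported statistic F(2, 2719) = 101.7 is the form produced by a one-way ANOVA comparing speech rate across the three speech types. As a minimal sketch of how such a statistic is computed, the following uses only the standard ANOVA decomposition on synthetic data; the sample values and group sizes below are illustrative assumptions, not the paper's measurements.

```python
# Hedged sketch: one-way ANOVA F statistic, of the kind used to compare
# speech rate across On-Talk, Off-Talk Self, and Off-Talk Other.
# All sample data here are synthetic illustrations.

def f_oneway(*groups):
    """Return the F statistic and its degrees of freedom for k groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: spread of group means around the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    # Within-group sum of squares: spread of samples around their group mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in groups)
    df_between, df_within = k - 1, n - k
    f = (ss_between / df_between) / (ss_within / df_within)
    return f, df_between, df_within

# Synthetic speech-rate samples (e.g. syllables/second) for three speech types
on_talk   = [4.1, 4.3, 4.0, 4.2, 4.4]
off_self  = [3.2, 3.1, 3.4, 3.0, 3.3]
off_other = [3.8, 3.9, 3.7, 4.0, 3.6]

f, df1, df2 = f_oneway(on_talk, off_self, off_other)
print(f"F({df1}, {df2}) = {f:.1f}")
```

With real data, the degrees of freedom (2, 2719) would follow from the three speech types and the 2722 total utterances; a p-value would then be read off the F distribution with those degrees of freedom.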