Abstract: Despite its importance in social interactions, laughter remains little studied in affective computing. Intelligent virtual agents are often blind to users' laughter and unable to produce convincing laughter themselves. Respiratory, auditory, and facial laughter signals have been investigated, but laughter-related body movements have received less attention. The aim of this study is threefold. First, to probe human laughter perception by analyzing patterns of categorisations of natural laughter animated on a min…
“…Employing computers in the analysis introduces the opportunity of automatic detection of emotions, a research topic to which the literature is devoting increasing attention, proposing and evaluating classifiers based on physiological signals (Mandryk & Atkins, 2007; Fleureau et al., 2012; Wu et al., 2010; Chittaro & Sioni, 2014; Gruebler & Suzuki, 2014) or the recording of user's behavior based on cameras, e.g. facial expressions (Eleftheriadis et al., 2015; Soleymani et al., 2016), body movement (Castellano et al., 2007) and laughter from body motion (Griffin et al., 2015); microphones, e.g. speech (Deng et al., 2014) and non-verbal sound analysis (Gupta et al., 2016); or written textual communication with other users (Li & Xu, 2014) and conversational agents (Benyon et al., 2013).…”
This is a PDF file of an unedited manuscript that has been accepted for publication. As a service to our customers we are providing this early version of the manuscript. The manuscript will undergo copyediting, typesetting, and review of the resulting proof before it is published in its final form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.

Highlights
- We explore bodily sensation maps (BSMs) as a novel way to detect emotions
- We propose EmoPaint, a mobile app to collect BSMs and detect emotions from them
- A user study reveals that the app is easy to use and able to detect emotions
- The app improves accuracy over a traditional method: Affect Grid with Circumplex model
“…Within ILHAIRE, another strong focus hence lay in the investigation of laughter in the body and the perception of such cues. Griffin and colleagues [39] analyzed participants' perception of laughter from body movements. The participants' task was to categorize animations of natural laughter from motion capture data, replayed using faceless stick figures (characters whose trunk, limbs and heads are simply represented by edges).…”
Section: Perception Of Bodily Portrayals
confidence: 99%
“…Although some previous studies described the morphological attributes of laughter, it was still necessary within ILHAIRE to gather more detailed statistics on the specific motion patterns appearing during laughter [39]. These studies used recordings made within the project, including motion capture with high-end optical hardware.…”
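Gathering such statistics from optical motion capture typically starts from a sequence of 3D joint positions. The following is a minimal sketch of that idea, not the ILHAIRE pipeline itself: `motion_statistics` is a hypothetical helper that computes two simple summaries (per-joint mean speed and range of motion) of the kind one might tabulate for laughter episodes.

```python
import numpy as np

def motion_statistics(joints, fps=120.0):
    """Summarize motion patterns in a mocap sequence.

    joints: array of shape (frames, n_joints, 3), 3D positions in metres.
    Returns per-joint mean speed (m/s) and range-of-motion magnitude (m).
    This is an illustrative feature set, not the one used in [39].
    """
    velocities = np.diff(joints, axis=0) * fps               # finite differences
    speeds = np.linalg.norm(velocities, axis=2)              # per-frame joint speeds
    mean_speed = speeds.mean(axis=0)                         # (n_joints,)
    extent = joints.max(axis=0) - joints.min(axis=0)         # bounding box per joint
    return mean_speed, np.linalg.norm(extent, axis=1)

# Toy example: joint 0 is stationary, joint 1 bounces vertically at 5 Hz,
# a crude stand-in for the rhythmic shoulder motion seen in laughter.
t = np.linspace(0.0, 1.0, 121)
joints = np.zeros((121, 2, 3))
joints[:, 1, 2] = 0.05 * np.sin(2 * np.pi * 5 * t)
mean_speed, rom = motion_statistics(joints)
```

Statistics like these can then be aggregated over many annotated laughter episodes to characterize which joints move, how much, and how fast.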
Section: Body Movement and Gestures
confidence: 99%
“…An earlier proposal within the project [40,39] investigated body motion for laughter type recognition among five categories (hilarious, social, awkward, fake, and non-laughter). Features characterizing hand gestures, shoulder movement, neck/spine bending, as well as the kinetic energy of several upper-body articulations were extracted.…”
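The pipeline described above, extracting movement features and assigning one of five laughter types, can be sketched as follows. This is a hedged illustration, not the classifier of [40,39]: the segment masses, the two-feature vector, the centroid values, and the nearest-centroid rule are all assumptions chosen for brevity (in practice the category statistics would be learned from labelled recordings).

```python
import numpy as np

LAUGHTER_TYPES = ["hilarious", "social", "awkward", "fake", "non-laughter"]

def kinetic_energy(positions, masses, fps=120.0):
    """Total kinetic energy per frame for a set of articulations.

    positions: (frames, n_joints, 3) joint positions in metres.
    masses: (n_joints,) assumed segment masses in kg.
    """
    v = np.diff(positions, axis=0) * fps        # per-joint velocities
    speed_sq = (v ** 2).sum(axis=2)             # squared speeds
    return 0.5 * (masses * speed_sq).sum(axis=1)

def classify(features, centroids):
    """Nearest-centroid assignment to one of the five laughter types."""
    d = np.linalg.norm(centroids - features, axis=1)
    return LAUGHTER_TYPES[int(np.argmin(d))]

# Hypothetical per-category centroids over two features:
# [mean kinetic energy (J), shoulder displacement amplitude (m)].
centroids = np.array([
    [5.0, 0.08],   # hilarious: energetic, large shoulder motion
    [2.0, 0.04],   # social
    [1.0, 0.02],   # awkward
    [3.0, 0.01],   # fake: energetic but little shoulder involvement
    [0.1, 0.00],   # non-laughter
])
label = classify(np.array([4.5, 0.07]), centroids)
```

A nearest-centroid rule is used here only to keep the sketch dependency-free; any off-the-shelf classifier over the same feature vectors would fit the description in the snippet.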
“…However, it has taken many years for other academic domains interested in human social interaction to pay sufficient attention to this important social signal. Within the domain of Affective Computing, laughter is a particularly important social signal: it signals positive affect [3] and social affiliation [4], and it has conversational functions that are likely to be crucial in creating more human-oriented interactions between computers and humans [5]. It serves as an important regulator of many functional features of human social interaction, regulating topics and turn-taking within conversation, and it aids in the repair of conversations [6], [7].…”
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations, citations that display the context of the citation and describe whether the article provides supporting or contrasting evidence. scite is used by students and researchers from around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.