2015
DOI: 10.1109/TAFFC.2015.2390627
Perception and Automatic Recognition of Laughter from Whole-Body Motion: Continuous and Categorical Perspectives

Abstract: Despite its importance in social interactions, laughter remains little studied in affective computing. Intelligent virtual agents are often blind to users' laughter and unable to produce convincing laughter themselves. Respiratory, auditory, and facial laughter signals have been investigated but laughter-related body movements have received less attention. The aim of this study is threefold. First, to probe human laughter perception by analyzing patterns of categorisations of natural laughter animated on a min…

Cited by 29 publications (19 citation statements)
References 51 publications
“…Employing computers in the analysis introduces the opportunity of automatic detection of emotions, a research topic to which the literature is devoting increasing attention, proposing and evaluating classifiers based on physiological signals (Mandryk & Atkins, 2007; Fleureau et al., 2012; Wu et al., 2010; Chittaro & Sioni, 2014; Gruebler & Suzuki, 2014) or the recording of user's behavior based on cameras, e.g. facial expressions (Eleftheriadis et al., 2015; Soleymani et al., 2016), body movement (Castellano et al., 2007) and laughter from body motion (Griffin et al., 2015); microphones, e.g. speech (Deng et al., 2014) and non-verbal sound analysis (Gupta et al., 2016); or written textual communication with other users (Li & Xu, 2014) and conversational agents (Benyon et al., 2013).…”
Section: Introduction (mentioning)
confidence: 99%
“…Within ILHAIRE, another strong focus hence lay in the investigation of laughter in the body and the perception of such cues. Griffin and colleagues [39] analyzed participants' perception of laughter from body movements. The participants' task was to categorise animations of natural laughter from motion capture data, replayed using faceless stick figures (characters whose trunk, limbs and head are represented simply by edges).…”
Section: Perception Of Bodily Portrayals (mentioning)
confidence: 99%
“…Although some previous studies described the morphological attributes of laughter, it was still necessary within ILHAIRE to gather more detailed statistics on the specific motion patterns that appear during laughter [39]. These studies used recordings made within the project, including motion capture with high-end optical hardware.…”
Section: Body Movement and Gestures (mentioning)
confidence: 99%
“…However, it has taken many years for other academic domains interested in human social interaction to pay sufficient attention to this important social signal. Within the domain of Affective Computing, laughter is a particularly important social signal: it signals positive affect [3] and social affiliation [4], and it has conversational functions that are likely to be crucial in creating more human-oriented interactions between computers and humans [5]. It serves as an important regulator of many functional features of human social interaction, governing topics and turn-taking within conversation, and it aids in the repair of conversations [6], [7].…”
Section: Introduction (mentioning)
confidence: 99%