Interspeech 2017
DOI: 10.21437/interspeech.2017-1395
Analysis of Engagement and User Experience with a Laughter Responsive Social Robot

Abstract: We explore the effect of laughter perception and response on engagement in human-robot interaction. We designed two distinct experiments in which the robot operates in one of two modes: laughter responsive and laughter non-responsive. In responsive mode, the robot detects laughter using a multimodal real-time laughter detection module and produces laughter as a backchannel to users accordingly. In non-responsive mode, the robot does not use laughter detection and thus provides no feedback. In the experimental design, we use…

Cited by 20 publications (12 citation statements)
References 19 publications
“…facial movement, expression, head pose) [34,36], conversational behaviors (e.g. voice activity, adjacency pair, backchannel, turn length) [18,35,37], laughing [38], and posture [39]. Engagement recognition modules based on the multi-modal features were implemented in agent systems and empirically tested with real users [36].…”
Section: B) Engagement Recognition
confidence: 99%
“…We take the engagement level of WoZ setup experiments as a gold standard in JOKER dataset. We evaluate the autonomous setup experiments by comparing with the gold standard engagement measurements as in [17].…”
Section: B Engagement Measures
confidence: 99%
“…summarizes the issues regarding engagement in human-agent interactions, emphasizing its importance and indicating the growing interest of researchers in the field [1]. Backchannels such as non-verbal gestures (nods and smiles), non-verbal vocalizations (mm, uh-huh, laughs), and verbal expressions (yes, right) are an important aspect of engagement and have been shown to promote user engagement and interest [2,3]. Researchers have mainly focused on rule-based backchannel generation [4,5] or data-driven unsupervised methods [6].…”
Section: Introduction
confidence: 99%