2014
DOI: 10.1587/transinf.e97.d.2008

Analyzing Perceived Empathy Based on Reaction Time in Behavioral Mimicry

Abstract: This study analyzes emotions established between people interacting in face-to-face conversation. By focusing on empathy and antipathy, especially the process by which they are perceived by external observers, the paper aims to elucidate the tendencies of their perception and, from these, develop a computational model that automatically infers perceived empathy/antipathy. The paper makes two main contributions. First, an experiment demonstrates that an observer's perception of an interac…

Cited by 10 publications (3 citation statements)
References 47 publications
“…Computational emotion recognition using facial cues has been incorporated into the study of empathic behavior in group settings. For instance, Kumano et al. (2011, 2014, 2015) conducted a series of experiments to see if empathic interactions could be predicted based on facial data from video recordings of four-person meetings. Their Naive Bayes Network Model was able to predict empathy state given facial expression information across time and improved when parameters such as reaction time in mirrored expressions between interlocutors and head gesture annotations were added.…”
Section: The Frontier: Behavioral Cues
confidence: 99%
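
The citing text describes the model only at a high level. As a point of reference, the following is a minimal sketch, under assumed feature definitions, of how a categorical naive Bayes classifier could map dyadic features of this kind (facial-expression congruence, a binned reaction time, a head-gesture flag) to a perceived empathy/neutral/antipathy label. The feature names, values, and toy data are hypothetical illustrations, not the authors' implementation or dataset.

# Minimal sketch (assumed, not the authors' implementation): a categorical naive Bayes
# classifier over discretized dyadic features (facial-expression congruence, a binned
# reaction time for the mimicked expression, a head-gesture flag) predicting a
# perceived empathy / neutral / antipathy label. Feature names and data are hypothetical.
import math
from collections import Counter, defaultdict

def train_naive_bayes(samples, labels, alpha=1.0):
    # samples: list of {feature_name: categorical_value}; labels: parallel list of classes.
    class_counts = Counter(labels)
    feature_counts = defaultdict(lambda: defaultdict(Counter))  # class -> feature -> value -> count
    feature_values = defaultdict(set)                           # feature -> observed values
    for x, y in zip(samples, labels):
        for f, v in x.items():
            feature_counts[y][f][v] += 1
            feature_values[f].add(v)
    return class_counts, feature_counts, feature_values, alpha

def predict(model, x):
    class_counts, feature_counts, feature_values, alpha = model
    total = sum(class_counts.values())
    best_cls, best_score = None, float("-inf")
    for cls, n_cls in class_counts.items():
        score = math.log(n_cls / total)  # log prior
        for f, v in x.items():
            count = feature_counts[cls][f][v]
            denom = n_cls + alpha * len(feature_values[f])
            score += math.log((count + alpha) / denom)  # Laplace-smoothed likelihood
        if score > best_score:
            best_cls, best_score = cls, score
    return best_cls

# Toy training data: one summarized dyadic exchange per row.
train_x = [
    {"congruent_expression": True,  "reaction_time": "fast", "head_gesture": "nod"},
    {"congruent_expression": True,  "reaction_time": "slow", "head_gesture": "none"},
    {"congruent_expression": False, "reaction_time": "slow", "head_gesture": "none"},
]
train_y = ["empathy", "neutral", "antipathy"]
model = train_naive_bayes(train_x, train_y)
print(predict(model, {"congruent_expression": True, "reaction_time": "fast", "head_gesture": "nod"}))
# -> "empathy"

In a model of this form, adding reaction-time or head-gesture features simply contributes extra conditionally independent likelihood terms, which is consistent with the reported observation that prediction improved when those parameters were added.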
“…Kumano et al. extended this framework by investigating reaction timing and facial expression congruence information [36]. They demonstrated that these two aspects were related to the annotated empathy labels (e.g., a congruent but delayed reaction in facial expression is less likely to have an empathy label).…”
Section: Empathy Analysis
confidence: 99%
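
To make the notion of reaction timing in mimicry concrete, the sketch below shows one way such a latency could be extracted from time-stamped expression annotations: each expression onset of one speaker is paired with the most recent matching onset of the other speaker, and the latency is binned as fast or slow. The pairing window and the fast/slow threshold are hypothetical choices for illustration, not values taken from [36].

# Minimal sketch (assumptions noted): compute reaction times for mimicked facial
# expressions from two time-stamped onset streams and bin them as a categorical feature.
# The 3-second pairing window and 1-second "fast" threshold are illustrative values only.

def mimicry_reaction_times(events_a, events_b, max_lag=3.0):
    # events_*: list of (time_in_seconds, expression_label), sorted by time.
    # Returns (latency, expression) pairs where speaker B shows the same expression
    # as speaker A within max_lag seconds after A's onset.
    pairs = []
    for t_b, expr_b in events_b:
        candidates = [t_a for t_a, expr_a in events_a
                      if expr_a == expr_b and 0.0 <= t_b - t_a <= max_lag]
        if candidates:
            pairs.append((t_b - max(candidates), expr_b))  # latency to the nearest matching onset
    return pairs

def bin_latency(latency, fast_threshold=1.0):
    return "fast" if latency <= fast_threshold else "slow"

# Toy example: B smiles 0.8 s after A (fast) and frowns 2.5 s after A (slow).
a = [(10.0, "smile"), (15.0, "frown")]
b = [(10.8, "smile"), (17.5, "frown")]
print([(round(lat, 2), expr, bin_latency(lat)) for lat, expr in mimicry_reaction_times(a, b)])

Used this way, the binned latency could serve as the reaction_time feature in the classifier sketched earlier.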
“…Existing works have pulled audio recordings from a few large-scale psychotherapy studies totaling thousands of sessions [4,55,39,40,67,72]; however, only a small fraction was finely annotated, both in terms of psychological assessments of mental and behavioral states and in having time-marked transcripts to train and validate automatic speech and language processing systems. The work by Kumano et al. employed a small dataset totaling a few hours [37,36,35,38].…”
Section: Challenges and Future Directions
confidence: 99%