2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN)
DOI: 10.1109/ro-man46459.2019.8956353

Study of Empathy on Robot Expression Based on Emotion Estimated from Facial Expression and Biological Signals

Cited by 4 publications (2 citation statements) | References 22 publications
“…Facial expression and voice intonation can be controlled and may thus conceal the real emotion. Therefore, Kurono et al. [16, 17] compared the impression of the robot based on the emotion estimated from controllable and uncontrollable expressions. The controllable expressions were derived from facial expression recognition, whereas the uncontrollable expressions were analyzed from brain waves and heart rate measurements.…”
Section: Introduction (mentioning)
confidence: 99%
“…In studies by Kurono et al. [16, 17], the authors compared the impression ratings of robot facial expressions, which were synchronized with the emotion of the participant. In their work, the emotion was estimated from facial expression and biological signals (heart rate and brain waves).…”
Section: Introduction (mentioning)
confidence: 99%