2020
DOI: 10.1109/access.2020.3007109
Affective Robot Story-Telling Human-Robot Interaction: Exploratory Real-Time Emotion Estimation Analysis Using Facial Expressions and Physiological Signals

Abstract: Affective human-robot interaction is still an active area of research, in part due to the great advances in artificial intelligence. The design of autonomous devices that work in real therapeutic environments has now become a plausible reality. Affective human-robot interaction requires a robot to analyze the emotional state of the human interlocutor and interpret emotional responses that can be used not merely in the interaction but also, for example, to provoke desired therapeutic responses. It is, therefore, ne…


Cited by 32 publications (9 citation statements)
References 44 publications (34 reference statements)
“…Observations and analytical sensors can be used to measure the emotions of an individual using audio [115], physiological [116], and visual signals [3,117]. A literature review on the use of physiological signals in HRI to detect emotions points out that a baseline of the physiological signals should be gathered before recognizing the influence of the emotions [118,119]. [120] indicates that the first encounter with a robot should take place before the experiment starts.…”
Section: Evaluation of Responsible HRI
confidence: 99%
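The baseline recommendation quoted above is straightforward to make concrete. Below is a minimal sketch in Python (assuming NumPy) of how a resting-phase recording can be used to normalize a physiological signal before assessing emotional influence; the signal names, durations, and placeholder data are hypothetical illustrations, not the cited review's procedure.

```python
# Minimal sketch (assumes NumPy): baseline-correcting a physiological
# signal before emotion analysis, as the quoted review recommends.
# All names and data here are hypothetical placeholders.
import numpy as np

def baseline_correct(task_signal: np.ndarray, rest_signal: np.ndarray) -> np.ndarray:
    """Express a task-phase signal relative to a resting baseline.

    Subtracting the resting mean and scaling by the resting standard
    deviation removes person-specific offsets, so that changes during
    the interaction reflect emotional influence rather than individual
    physiology.
    """
    mu = rest_signal.mean()
    sigma = rest_signal.std()
    return (task_signal - mu) / (sigma + 1e-12)  # guard against flat baselines

# Hypothetical usage: EDA sampled at 4 Hz, 60 s of rest, then 120 s of interaction.
rest_eda = np.random.default_rng(0).normal(2.0, 0.1, 240)  # placeholder data
task_eda = np.random.default_rng(1).normal(2.4, 0.3, 480)  # placeholder data
z_eda = baseline_correct(task_eda, rest_eda)
print(z_eda.mean())  # positive values indicate arousal above the resting baseline
```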
“…Owing to the benefits of flexibility and practicality, studies are actively conducted in areas that require real-time emotion recognition in various situations. Val-Calvo et al. [26] proposed self-designed methodologies to estimate users' emotional state in real time. They used EDA, blood volume pulse (BVP), and EEG to analyze the statistical correlation between experienced emotions and the properties of the set of features.…”
Section: Related Work
confidence: 99%
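As a rough illustration of the kind of statistical correlation analysis this quote describes, the sketch below (assuming NumPy and SciPy) rank-correlates hypothetical per-trial physiological features with self-reported arousal ratings; the feature names and data are placeholders, not the authors' actual pipeline.

```python
# Minimal sketch (assumes NumPy/SciPy): correlating per-trial physiological
# features with self-reported emotion ratings. Features and data are
# hypothetical stand-ins for a real feature-extraction pipeline.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
n_trials = 30
features = {
    "eda_mean": rng.normal(size=n_trials),    # mean electrodermal level
    "bvp_rate": rng.normal(size=n_trials),    # heart rate derived from BVP
    "eeg_alpha": rng.normal(size=n_trials),   # frontal alpha band power
}
arousal_rating = rng.integers(1, 10, size=n_trials).astype(float)  # 1-9 scale

# Rank correlation is robust to the ordinal nature of self-report scales.
for name, values in features.items():
    rho, p = spearmanr(values, arousal_rating)
    print(f"{name}: rho={rho:+.2f}, p={p:.3f}")
```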
“…The most commonly used physiological signals are the electroencephalogram (EEG), electrocardiogram (ECG), photoplethysmography (PPG), and electrodermal activity (EDA). Moreover, some studies using both facial expressions and physiological signals achieved high accuracy in emotion recognition [24, 25] and performed well in recognizing various emotion classes [26, 27]. All these studies are based on deep learning algorithms.…”
Section: Introduction
confidence: 99%
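To make the multimodal setup mentioned above concrete, here is a minimal late-fusion sketch in PyTorch that concatenates a facial-expression embedding with physiological features before classification; the dimensions, layer sizes, and inputs are illustrative assumptions rather than any cited study's architecture.

```python
# Minimal sketch (assumes PyTorch): late fusion of a facial-expression
# embedding with physiological features for emotion classification.
# All sizes are illustrative assumptions.
import torch
import torch.nn as nn

class FusionClassifier(nn.Module):
    def __init__(self, face_dim=128, physio_dim=16, n_classes=4):
        super().__init__()
        self.face_branch = nn.Sequential(nn.Linear(face_dim, 64), nn.ReLU())
        self.physio_branch = nn.Sequential(nn.Linear(physio_dim, 16), nn.ReLU())
        # Concatenate branch outputs and map to emotion-class logits.
        self.head = nn.Linear(64 + 16, n_classes)

    def forward(self, face_feat, physio_feat):
        fused = torch.cat([self.face_branch(face_feat),
                           self.physio_branch(physio_feat)], dim=-1)
        return self.head(fused)

# Hypothetical usage with random stand-ins for real embeddings/features.
model = FusionClassifier()
logits = model(torch.randn(8, 128), torch.randn(8, 16))
print(logits.shape)  # torch.Size([8, 4]) -- one score per emotion class
```

Late fusion is only one design choice; early fusion (concatenating raw features before any branch) or attention-based fusion are common alternatives in this literature.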
“…In Val-Calvo et al. (2020), an interesting analysis of the possibilities of emotion recognition (ER) in HRI was performed using facial images, EEG, GSR, and blood pressure. In a realistic HRI scenario, a Pepper robot dynamically drives subjects' emotional responses through story-telling and multimedia stimuli.…”
Section: State of the Art
confidence: 99%