2019
DOI: 10.1007/978-3-030-19591-5_18
Real-Time Emotional Recognition for Sociable Robotics Based on Deep Neural Networks Ensemble

Cited by 12 publications (10 citation statements)
References 14 publications
“…HRI was at the core of [83], which introduces a real-time emotion recognition system using a YOLO-based facial detection system and an ensemble convolutional neural network. Another work benefited the field of HRI by broadening the understanding of brain emotional encoding in order to improve the ability of robots to fully engage with the user's emotional reactions [84].…”
Section: Human-machine Interaction and Health-care Applications (mentioning)
confidence: 99%
“…The papers have detected affect from facial cues captured by robots working in real-world scenarios [83] and from speech, in order to elicit stress through a set of online interviews as well as to establish a standard speech corpus for assessing emotions across multiple languages [85]. In addition, a number of papers introduced solutions based on the acquisition and processing of physiological signals.…”
Section: Human-machine Interaction and Health-care Applications (mentioning)
confidence: 99%
“…Additionally, Benamara et al [ 31 ] performed an algorithm for a sociable robot to recognize emotions when interacting with a person in real time. The algorithm detects the face by employing a YOLO framework; then, it converts the resulting image in grayscale, normalized in the range [0, 1].…”
Section: Algorithms Used For Face Recognition and Trackingmentioning
confidence: 99%
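
The detection and normalization step summarized in the citation above can be illustrated with a short Python sketch. This is a minimal illustration under stated assumptions, not the authors' implementation: the YOLO face detector is abstracted behind a hypothetical detect_face() stub, and OpenCV/NumPy are assumed for the image handling.

```python
# Minimal sketch of the first stage: detect the face, crop it,
# convert to grayscale, and normalize pixel values into [0, 1].
# detect_face() is a hypothetical placeholder for the YOLO-based detector.
import cv2
import numpy as np

def detect_face(frame_bgr: np.ndarray) -> tuple[int, int, int, int]:
    """Hypothetical YOLO-style face detector returning (x, y, w, h)."""
    raise NotImplementedError("plug in the actual YOLO face detector here")

def preprocess_face(frame_bgr: np.ndarray) -> np.ndarray:
    """Crop the detected face, convert BGR -> grayscale, scale to [0, 1]."""
    x, y, w, h = detect_face(frame_bgr)
    face = frame_bgr[y:y + h, x:x + w]
    gray = cv2.cvtColor(face, cv2.COLOR_BGR2GRAY)
    return gray.astype(np.float32) / 255.0
```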
“…The detected face is then preprocessed: the image is cropped to extract the region of interest, converted from RGB to grayscale, resized to a resolution of 48 × 48 pixels, and finally normalized into the [0, 1] range. In the second stage, the preprocessed image is fed into a low-level feature extraction layer and an ensemble of deep convolutional neural networks to obtain the emotion classification [42], [43]. This ensemble model was trained on the FER-2013 database [44], achieving 72.47% accuracy on the test set.…”
Section: Facial Expression Recognition (mentioning)
confidence: 99%
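
The second stage described above (resizing to the 48 × 48 FER-2013 input resolution and classifying with a CNN ensemble) might look roughly as follows. The Keras-style predict() interface, the ensemble averaging scheme, and the FER-2013 label order are assumptions for illustration only; the reported 72.47% test accuracy comes from the cited work, not from this sketch.

```python
# Minimal sketch of the second stage: resize the grayscale face crop to
# 48x48, normalize to [0, 1], and average the softmax outputs of an
# ensemble of CNNs. Assumes Keras-style models; the label order follows
# the usual FER-2013 convention and may differ from the authors' setup.
import cv2
import numpy as np

EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def prepare_input(gray_face_u8: np.ndarray) -> np.ndarray:
    """Resize an 8-bit grayscale face crop to 48x48 and scale into [0, 1]."""
    resized = cv2.resize(gray_face_u8, (48, 48))
    return resized.astype(np.float32)[np.newaxis, :, :, np.newaxis] / 255.0

def classify_emotion(gray_face_u8: np.ndarray, models) -> tuple[str, np.ndarray]:
    """Average per-class probabilities over the ensemble and pick the argmax."""
    x = prepare_input(gray_face_u8)
    probs = np.mean([m.predict(x, verbose=0)[0] for m in models], axis=0)
    return EMOTIONS[int(np.argmax(probs))], probs
```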