2017 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2017.7965911
Teaching emotion expressions to a human companion robot using deep neural architectures

Cited by 27 publications (25 citation statements)
References 17 publications
“…These two significant works are thus related to the work described here (projecting into a latent space), albeit our core affect state-space is a continuous one and builds upon a more complex, nonlinear implementation model (deep GP). In other cases [77], [79], much as in machine learning approaches, a direct mapping via deep neural networks is pursued, from the latent space of affective expressions learnt in a bottom-up, feed-forward sweep to facial gestures, synthetic speech, etc. Indeed, the use of deep architectures can lead to efficient implementation models capable of handling the multimodal nature of emotion [80].…”
Section: Discussion (mentioning)
confidence: 99%
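The direct-mapping approach mentioned in this excerpt can be illustrated with a minimal sketch: a small feed-forward network that maps a low-dimensional latent affect code to a vector of expression parameters (e.g., facial-gesture or face-display controls). The layer sizes, dimensionality, and the arousal/valence interpretation of the latent code below are illustrative assumptions, not the architecture used in the cited works.

```python
# Minimal sketch (assumed dimensions): map a latent affect code to expression parameters.
import torch
import torch.nn as nn

class AffectToExpression(nn.Module):
    """Feed-forward mapping from a latent affect space to expression parameters.

    latent_dim:     size of the latent affect code (e.g., 2 for arousal/valence)
    expression_dim: number of expression control values (e.g., facial-gesture parameters)
    """
    def __init__(self, latent_dim: int = 2, expression_dim: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 64),
            nn.ReLU(),
            nn.Linear(64, 64),
            nn.ReLU(),
            nn.Linear(64, expression_dim),
            nn.Tanh(),  # bound outputs to [-1, 1] for normalized actuator commands
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        return self.net(z)

# Example: one assumed (arousal, valence) code mapped to (untrained) expression parameters.
model = AffectToExpression()
z = torch.tensor([[0.8, 0.6]])
expression_params = model(z)
print(expression_params.shape)  # torch.Size([1, 16])
```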
“…The NICO robot is a humanoid robot which serves as a fully controllable platform for multimodal human-robot interaction scenarios. NICO is an abbreviation for 'Neurally-Inspired Companion', reflecting its use in assistive tasks such as object grasping [11] and as a reliable platform for HRI studies, including emotion recognition [6] and human-robot dialogue understanding [28], as well as for the assessment of critical design factors for robots, e.g., 'likeability' or safety issues [13], which are crucially important aspects of social robot research. The robot architecture and API are configured in a modular fashion to facilitate the integration, extension and combination of task-specific algorithms.…”
Section: Data Recording With the NICO Robot (mentioning)
confidence: 99%
“…It has a symmetrical and abstracted child-like appearance that aims to enable intuitive human-robot interaction while avoiding the uncanny-valley effect. Behind the surface of the head, in the eyebrow and mouth area, a programmable LED display is placed that can display basic emotions in the form of stylized facial expressions [14], [15]. The head features two 2-megapixel sensors with a 70-degree field of view.…”
Section: Affective Association Modelling (mentioning)
confidence: 99%
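As a rough illustration of how such an LED face display could be driven, the sketch below maps basic emotion labels to stylized mouth patterns encoded as small 0/1 matrices (the eyebrow regions would be handled analogously). The display resolution, the pattern shapes, and the send_to_display() hook are hypothetical and are not the actual NICO interface.

```python
# Hypothetical sketch: stylized facial expressions as LED on/off patterns.
# Display geometry and send_to_display() are illustrative assumptions,
# not the NICO robot's real API.
from typing import Dict, List

Pattern = List[List[int]]  # rows of 0/1 values for a small LED matrix

# Assumed 3x8 LED matrix for the mouth region.
EXPRESSIONS: Dict[str, Dict[str, Pattern]] = {
    "happiness": {
        "mouth": [
            [1, 0, 0, 0, 0, 0, 0, 1],
            [0, 1, 0, 0, 0, 0, 1, 0],
            [0, 0, 1, 1, 1, 1, 0, 0],
        ],
    },
    "sadness": {
        "mouth": [
            [0, 0, 1, 1, 1, 1, 0, 0],
            [0, 1, 0, 0, 0, 0, 1, 0],
            [1, 0, 0, 0, 0, 0, 0, 1],
        ],
    },
}

def send_to_display(region: str, pattern: Pattern) -> None:
    """Placeholder for the robot-specific display call (hypothetical)."""
    print(region, pattern)

def show_emotion(label: str) -> None:
    """Push the stylized pattern for a basic emotion to the face display."""
    for region, pattern in EXPRESSIONS[label].items():
        send_to_display(region, pattern)

show_emotion("happiness")
```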