2017
DOI: 10.1007/978-3-319-67401-8_32

The Expression of Mental States in a Humanoid Robot

Cited by 6 publications (4 citation statements)
References 4 publications
“…In a different context, Edvardsen et al (2020) incorporated classes of neurons from hippocampus to design a navigation strategy in cluttered environments. Some other works have focused on the development of prosthetics to assist the locomotion of animals (von Zitzewitz et al, 2016) and even on the improvement of human-robot interaction by designing systems that control the levels of awareness in humanoid robots (Lindberg et al, 2017;Balkenius et al, 2018).…”
Section: Neurorobotics and Neural Disorders
mentioning, confidence: 99%
“…Another project investigated how the Epi head could animate facial expression and expressions of internal states. 30,32 These expressions combined the LEDs in the eyes with movements of the head.…”
Section: Epi As a Student Platform
mentioning, confidence: 99%
“…We also tested to what extent the robot face could be used to express different internal and emotional states. 30 We combined head movements with eye colour changes to try to convey thinking, angry, happy, confused and sad. The majority of participants were able to identify thinking, happy and confused.…”
Section: Evaluating the Design
mentioning, confidence: 99%
“…Examples might be memory systems 8 or models of visual attention. The robot was used with Ikaros to study emotional displays 54 and interaction with children in a tutoring situation. 55…”
Section: Robots Using Ikaros
mentioning, confidence: 99%