2020
DOI: 10.1177/1729881420911498

Epi: An open humanoid platform for developmental robotics

Abstract: Epi is a humanoid robot developed by the Lund University Cognitive Science Robotics Group. It was designed for experiments in developmental robotics, and its proportions give a childlike impression while remaining decidedly robotic. The robot head has two degrees of freedom in the neck, and each eye can move laterally and independently. A camera in each eye makes stereo vision possible. The arms are designed to resemble those of a human: each arm has five degrees of freedom, three in the shoulder…

Cited by 12 publications (5 citation statements)
References 27 publications

“…On the other hand, multitask robots navigate dynamic systems and can perform multiple tasks simultaneously. Both these types of robots and systems are prevalent in different real-world applications, such as medical robots (Davies et al., 2000), prosthetic arms (Fazeli et al., 2019), humanoid platforms (Johansson et al., 2020), and automated driving (Spielberg et al., 2019). These applications offer critical opportunities to advance the design of robot systems further.…”
Section: Advances in Neuro-Robotics and Robotic Failures Go Hand in Hand
mentioning, confidence: 99%
“…Towards this end we have designed the robot Epi (Figure 1), which is an attempt at an honest humanoid design in the sense that it is clearly a robot while still mimicking the details of human-human interaction and reproducing a number of subtle non-verbal signals (Johansson et al., 2020). The overall design of the robot is in no way unique.…”
Section: Toward an Honest Robot Design
mentioning, confidence: 99%
“…The physically animated pupils of Epi can be used to communicate with humans in a way that influences them unconsciously (Johansson et al., 2020). The control system uses a model of the brain systems involved in pupil control to let the pupil size reflect a number of inner processes, including emotional and cognitive functions (Johansson and Balkenius, 2016; Balkenius et al., 2019).…”
Section: Toward an Honest Robot Design
mentioning, confidence: 99%
“…The robot was able to successfully interact in real time with a diverse group of people, accurately detecting a person's facial expression and computing the appropriate emotional response: happy for smiling, distressed for frowning, and neutral for a neutral expression.
Figure 13 | Examples of neurorobotics increasing AI understandability through interaction with people. (A) Epi, a humanoid child-like robot with the ability to change iris color and pupil size (Johansson et al., 2020). (B) Berenson, a humanoid robot capable of making facial expressions in line with its emotions regarding different art pieces (Pereira, 2016).…”
Section: Neuromorphic Robots
mentioning, confidence: 99%
“…The physical display of arousal works as a social cue, affecting interactions with other agents. Through a special eye design involving circles of overlapping blades as irises (Johansson et al., 2020), they showed that a robot can display mental state through stages of alertness and arousal during problem solving and decision making (Figure 13A). An image of the robot is seen in Figure 13B.…”
Section: Affective Cognition
mentioning, confidence: 99%