Abstract. This paper addresses the development of a robotic head able to move and display different emotions. The movement and emotion generation system has been designed following the human facial musculature. Starting from the Facial Action Coding System (FACS), we have built a 26-action-unit model able to produce the most relevant movements and emotions of a real human head. The work has been carried out in two steps. In the first step, a mechanical skeleton holding the different actuators has been designed and built. In the second step, a two-layered silicone skin has been manufactured, to which the actuators have been attached following the real muscle-insertion points in order to perform the different movements and gestures. The developed head has been integrated into a high-level behavioural architecture, and pilot experiments on emotion recognition and mimicking have been carried out with 10 users.
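To make the FACS-based approach concrete, the following minimal Python sketch shows how emotions can be expressed as sets of action-unit (AU) intensities and mapped to actuator commands. The AU numbers for "happiness" (AU6 cheek raiser, AU12 lip corner puller) follow standard FACS coding, but the emotion table, the one-actuator-per-AU assumption, and the linear servo range are illustrative assumptions, not the paper's actual control scheme.

```python
from typing import Dict

# Emotion -> action units with intensity in [0, 1] (illustrative values;
# the happiness AUs follow standard FACS, the intensities are invented).
EMOTIONS: Dict[str, Dict[int, float]] = {
    "happiness": {6: 0.8, 12: 1.0},
    "surprise": {1: 1.0, 2: 1.0, 5: 0.7, 26: 0.9},
}

def au_to_servo_angles(aus: Dict[int, float]) -> Dict[int, float]:
    """Convert AU intensities to servo angles in degrees, assuming one
    actuator per action unit and a linear 0-90 degree travel."""
    return {au: round(90.0 * level, 1) for au, level in aus.items()}

# Example: command set for displaying happiness.
angles = au_to_servo_angles(EMOTIONS["happiness"])
```

In a real head, several AUs may share an actuator or one AU may drive several skin-insertion points, so the mapping would be many-to-many rather than this simple dictionary.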