Children diagnosed with autism, a condition that affects roughly one in every 165 children, are thought to lack, or to have impairments in, a set of representational abilities. As a result, they have difficulty operating in our highly complex social environment and are, for the most part, unable to understand other people's emotions. People express their emotion states all the time, even when interacting with machines. These emotion states shape the decisions we make, govern how we communicate with others, and affect our performance. The ability to attribute emotion states to others from their behaviour, and to use that knowledge to guide one's own actions and predict those of others, is known as emotion-recognition.

To help children with autism read and respond to the emotions of the people they interact with, we propose a computer-based device called Cognitive Assistive Computational-based Emotional State Recognition. The system operates in real time, so computation time is of vital importance: it must enable interactive training in which the system learns and analyses human emotions for autistic children.

The principal contribution of this thesis is the real-time inference of a wide range of emotion states from head and facial displays in a video stream, whether pre-recorded (the Mind Reading DVD) or captured from a live camera. In particular, the focus is on the inference of complex emotion states (agreeing, disagreeing, encouraging, discouraging and unsure): affective and cognitive states of mind that are not part of the set of basic emotions (in our case, neutral, joy, sadness and surprise). The automated emotion state inference system is inspired by, and draws on, the fundamental role of emotion-recognition in communication and decision-making. The thesis describes the design, implementation and validation of a computational model of emotion-recognition.
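To make the real-time inference task concrete, the following is a minimal sketch of a per-frame inference loop over the five complex emotion states. The feature names (nod, shake, smile, frown, eyebrow_raise, head_tilt), the hand-written scorer, and the sliding-window smoothing are all illustrative assumptions, not the thesis's actual model, which would use classifiers trained on head and facial display dynamics.

```python
from collections import Counter, deque

# Label sets named in the thesis: four basic emotions and the five
# complex emotion states targeted by the system.
BASIC = ["neutral", "joy", "sadness", "surprise"]
COMPLEX = ["agreeing", "disagreeing", "encouraging", "discouraging", "unsure"]

def classify_frame(features):
    """Hypothetical per-frame classifier: maps head/facial display
    intensities (0..1) to the highest-scoring complex emotion state.
    A real system would replace these hand-written scores with a
    trained model."""
    scores = {
        "agreeing": features.get("nod", 0.0) + features.get("smile", 0.0),
        "disagreeing": features.get("shake", 0.0),
        "encouraging": features.get("smile", 0.0) + features.get("eyebrow_raise", 0.0),
        "discouraging": features.get("frown", 0.0),
        "unsure": features.get("head_tilt", 0.0),
    }
    return max(scores, key=scores.get)

def infer_stream(frames, window=5):
    """Real-time-style inference: smooth noisy per-frame labels with a
    majority vote over a short sliding window, yielding one label per
    frame as it arrives."""
    recent = deque(maxlen=window)
    for frame in frames:
        recent.append(classify_frame(frame))
        yield Counter(recent).most_common(1)[0][0]

# Simulated feature frames; a live camera or the Mind Reading DVD
# would supply these via a face tracker.
frames = [{"nod": 0.9, "smile": 0.4}] * 4 + [{"shake": 0.8}] * 2
labels = list(infer_stream(frames))
```

The sliding-window vote illustrates why computation time matters: each frame must be classified and smoothed before the next one arrives for the interaction to feel responsive.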
The design is based on the results of a number of experiments that we have undertaken to analyse the facial signals and dynamics of complex emotion states. In this research, a device will be developed with