The design of an affect recognition system for socially perceptive robots relies on representative data: human-robot interaction in naturalistic settings requires an affect recognition system trained and validated with contextualised affective expressions, that is, expressions that emerge in the same interaction scenario as the target application. In this paper, we propose an initial computational model that automatically analyses human postures and body motion to detect the engagement of children playing chess with an iCat robot acting as a game companion. Our approach is based on the vision-based automatic extraction of expressive postural features from videos capturing the children's behaviour from a lateral view. An initial evaluation, conducted by training several recognition models with contextualised affective postural expressions, suggests that patterns of postural behaviour can be used to accurately predict the children's engagement with the robot, making our approach suitable for integration into an affect recognition system for a game companion in a real-world scenario.
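To make the recognition stage concrete, the following is a minimal sketch of how postural features extracted from the lateral-view videos could feed a binary engagement classifier. The feature names (lean angle, slouch, quantity of motion, contraction) and the choice of an SVM are illustrative assumptions, not the paper's exact pipeline; synthetic arrays stand in for the real annotated recordings.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder data: one row per video segment, columns are hypothetical
# postural features extracted from the lateral view (values here are random).
X = rng.normal(size=(200, 4))     # [lean_angle, slouch, qty_motion, contraction]
y = rng.integers(0, 2, size=200)  # engagement label: 0 = disengaged, 1 = engaged

# Standardise features, then train a binary engagement classifier (assumed SVM).
model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))

# Cross-validated accuracy as a rough proxy for recognition performance.
scores = cross_val_score(model, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

With real contextualised data in place of the random arrays, the cross-validation score would correspond to the kind of evaluation the abstract describes, namely training several recognition models on affective postural expressions gathered in the target interaction scenario.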