EyesWeb XMI (for eXtended Multimodal Interaction) is the new version of the well-known EyesWeb platform. Its main focus is multimodality: the primary design goal of this release has been to improve the ability to process and correlate multiple streams of data. The platform has been used extensively to build a set of interactive systems for performing arts applications at Festival della Scienza 2006, Genoa, Italy. The purpose of this paper is to describe the developed installations as well as the new EyesWeb features that supported their development.
Keywords: EyesWeb, multimodal interactive systems, performing arts.
This paper addresses the use of a remote interactive platform to support home-based rehabilitation for children with motor and cognitive impairment. The interaction between user and platform takes place through customizable full-body interactive serious games (exergames). These exergames perform real-time analysis of multimodal signals to quantify movement qualities and postural attitudes. Interactive sonification of movement is then applied to provide real-time feedback based on "aesthetic resonance" and the engagement of the children. The games also provide log-file recordings that therapists can use to assess the children's performance and the effectiveness of the games, and the platform allows the games to be customized to each child's needs. The platform is based on the EyesWeb XMI software, and the games are designed for home usage, based on Kinect for Xbox One and simple sensors, including the 3-axis accelerometers available in low-cost Android smartphones.
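The abstract above mentions sonifying movement measured with a smartphone's 3-axis accelerometer but does not specify the mapping. A minimal illustrative sketch of one common approach, a linear map from dynamic acceleration magnitude to pitch, might look as follows; the function names, the gravity-subtraction step, and the 220–880 Hz range are all assumptions for illustration, not the paper's actual design:

```python
import math

def acceleration_magnitude(ax, ay, az, g=9.81):
    """Magnitude of the dynamic (gravity-compensated) acceleration in m/s^2.

    Crude gravity handling for illustration: subtract the magnitude of g
    from the total magnitude; a real system would filter per-axis.
    """
    return abs(math.sqrt(ax * ax + ay * ay + az * az) - g)

def magnitude_to_frequency(mag, f_min=220.0, f_max=880.0, mag_max=20.0):
    """Linearly map a clamped acceleration magnitude to a pitch in Hz.

    mag_max is a hypothetical saturation point: any magnitude at or
    above it maps to f_max, so vigorous movement pins the pitch high.
    """
    t = min(max(mag / mag_max, 0.0), 1.0)
    return f_min + t * (f_max - f_min)

# A phone at rest reads roughly (0, 0, 9.81), yielding near-zero
# dynamic magnitude and therefore the lowest pitch.
rest_freq = magnitude_to_frequency(acceleration_magnitude(0.0, 0.0, 9.81))
```

In a real exergame the resulting frequency would drive a synthesizer in real time; here it is only computed, to keep the sketch self-contained.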
We ran the first Affective Movement Recognition (AffectMove) challenge, which brings together datasets of affective bodily behaviour across different real-life applications to foster work in this area. Research on automatic detection of naturalistic affective body expressions still lags behind detection based on other modalities, even though movement behaviour modelling is a highly relevant research problem for the affective computing community. The AffectMove challenge aimed to take advantage of existing body movement datasets to address key research problems in the automatic recognition of naturalistic and complex affective behaviour from this type of data. Participating teams competed to solve at least one of three tasks based on datasets drawn from different sensor types and real-life problems: the multimodal EmoPain dataset for the chronic pain physical rehabilitation context, the weDraw-1 Movement dataset for maths problem-solving settings, and the multimodal Unige-Maastricht Dance dataset. To foster work across datasets, we also challenged participants to leverage data across datasets to improve performance and to test the generalization of their approaches across different applications.