This paper presents an adaptive physical environment that allows children with severe autism to interact successfully with multimodal stimuli, giving them a sense of control over the interaction and, hence, a sense of agency. This effort is important for two main reasons: 1) this user group cannot be typified, which makes designing an interactive system that fits the whole spectrum of individuals a very complex task; 2) each individual PAS (Person on the Autistic Spectrum) user must be able to develop within the environment according to their own capacities and potential. Qualitative evaluation by psychologists shows very good results and sketches an encouraging future for research on these environments.
Embodied agents are often designed with the ability to simulate human emotion. This paper investigates the psychological impact of simulated emotional expressions on computer users, with a particular emphasis on how mismatched facial and audio expressions are perceived (e.g., a happy face with a concerned voice). In a within-subjects repeated-measures experiment (N = 68), mismatched animations were perceived as more engaging, warm, concerned, and happy when a happy or warm face was in the animation (as opposed to a neutral or concerned face) and when a happy or warm voice was in the animation (as opposed to a neutral or concerned voice). The results appear to follow cognitive dissonance theory: subjects attempted to make mismatched expressions consistent across both the visual and audio dimensions of the animations, resulting in confused perceptions of the emotional expressions. Design implications for affective embodied agents are discussed and future research areas are identified.
Augmented Reality (AR) is approaching real-world use cases, driving the creation of innovative applications and unprecedented growth in the consumer availability of Head-Mounted Display (HMD) devices. However, at present there is a lack of guidelines, common form factors, and standard interaction paradigms across devices, which has resulted in each HMD manufacturer creating its own specifications. This paper presents the first experimental comparison of the interaction paradigms of two AR HMDs: the HoloLens v1 (metaphoric interaction) and the Meta2 (isomorphic interaction). We report precision, interactivity, and usability metrics from an object-manipulation task-based user study. Twenty participants took part, and significant differences were found between interaction paradigms for translation tasks, where the isomorphic mapped interaction outperformed the metaphoric mapped interaction in both time to completion and accuracy, while the contrary was found for the resize task. From an interaction perspective, the isomorphic mapped interaction (using the Meta2) was perceived as more natural and usable, with a significantly higher usability score and a significantly lower task-load index. However, when task accuracy and time to completion are key, mixed interaction paradigms need to be considered.
The influence of interaction fidelity and rendering quality on perceived user experience has been largely explored in Virtual Reality (VR). However, differences in interaction choices triggered by these rendering cues have not yet been explored. We present a study analysing the effect of visual thermal cues and contextual information on 50 participants' approach to grasping and moving a virtual mug. The study comprises 3 temperature cues (baseline empty, hot, and cold) and 4 contextual representations, all embedded in a VR scenario. We evaluate 2 hand representations (abstract and human) to assess grasp metrics. Results show that temperature cues influenced grasp location: the mug handle was predominantly grasped, with a smaller grasp aperture, in the hot condition, while the body and top were preferred in the baseline and cold conditions.
We present a new application ("Sakura") that enables people with physical impairments to produce creative visual design work using a multimodal gaze approach. The system integrates multiple features tailored for gaze interaction including the selection of design artefacts via a novel grid approach, control methods for manipulating canvas objects, creative typography, a new color selection approach, and a customizable guide technique facilitating the alignment of design elements. A user evaluation (N=24) found that nondisabled users were able to utilize the application to complete common design activities and that they rated the system positively in terms of usability. A follow-up study with physically impaired participants (N=6) demonstrated they were able to control the system when working towards a website design, rating the application as having a good level of usability. Our research highlights new directions in making creative activities more accessible for people with physical impairments.