This paper reports on the lessons learnt while applying a methodology for developing intelligent environments. An important feature of the methodology is that it is strongly user-centred, and the authors report on how interaction with users took place and how it continuously shaped the project's aspirations and outcomes. The methodology was applied to a project that aimed at helping people with Down's Syndrome, and those with similar conditions and needs, to be more included in society. The project was developed by a consortium of commercial, academic, and end-user-supporting organizations. The paper elaborates on which types of stakeholder-engagement activities were considered, how these were distributed across the lifetime of the project, and what impact they had.
Accurate distance perception and natural interaction are mandatory conditions when training precision aiming tasks in VR. However, many factors specific to virtual environments (VEs) lead to differences in the way users execute a motor task in VR versus the real world. To investigate these differences, the authors performed a study on basketball beginners' free-throw performance in VEs under different visual conditions. Although the success rates were not statistically different, some adaptations occurred in the way the users performed the task, depending on the visual conditions. In the third-person-perspective visual condition, the release parameters indicate that the users estimated the distance to the target more accurately. Adding visual guidance information (gradual depth information showing the ideal ball trajectory) also led to more natural motor behavior. The final aim of this study was to develop a reliable basketball free-throw training system in VEs, so the authors compared beginners' performances in VR with expert models of performance. Their results show that most of the performance variables tended to evolve closer to the experts' performance during training in the VE.
Serious games are becoming an alternative educational method in a variety of fields because of their potential to improve the quality of learning experiences and to facilitate knowledge acquisition and content understanding. Moreover, entertainment-driven learners are more easily motivated to benefit from the learning process through meaningful activities defined in a game context. Interfacing educational computer games with multisensorial interfaces allows for a seamless integration between virtual and physical environments. Multisensorial cues can improve memory and attention and increase cognitive and sensory-motor performance. Despite the increasing knowledge of sensory processes, multisensory experiences and interactions in computer-based instruction remain insufficiently explored and understood. In this paper, we present a multisensory educational game, Fragrance Channel, and we investigate how enabling olfaction can contribute to users' learning performance, engagement, and quality of experience. We compare results obtained after experiencing Fragrance Channel in the presence and absence of olfactory feedback on both a mobile device and a PC. A knowledge test administered before and immediately after showed that our proposed educational game led to an improvement in performance in all the explored conditions. Subjective measurements carried out after the olfactory experience showed that students enjoyed the scenario and appreciated it as being relevant.
Mulsemedia—multiple sensorial media—makes possible the inclusion of layered sensory stimulation and interaction through multiple sensory channels. The recent upsurge in technology and wearables provides mulsemedia researchers with a vehicle for potentially boundless choice. However, in order to build systems that integrate various senses, some issues still need to be addressed. This review deals with mulsemedia topics that remain insufficiently explored by previous work, with a focus on the multi-multi (multiple media, multiple senses) perspective, where multiple types of media engage multiple senses. Moreover, it addresses the evolution of previously identified challenges in this area and formulates new directions for exploration.
In recent years, emerging immersive technologies (e.g. Virtual/Augmented Reality, multisensorial media) have brought brand-new multi-dimensional effects such as 3D vision, immersion, vibration, smell, airflow, etc. to gaming, video entertainment, and other aspects of human life. This paper reports results from a European Horizon 2020 research project on the impact of multisensorial media (mulsemedia) on educational learner experience. A mulsemedia-enhanced test-bed was developed to deliver video content enhanced with haptic, olfactory, and airflow effects. The results of the quality ratings and questionnaires show significant improvements in terms of mulsemedia-enhanced teaching.
Abstract: The ability to commute and travel alone is an important skill that enables people to be more independent and integrated into society. People with Down's Syndrome often experience low social integration and a low degree of independence. As part of the European Commission-funded POSEIDON project, we want to explore how context-aware and assistive technology can enable users with Down's Syndrome to be more independent, including the ability to commute alone to a place of interest. In this paper, we report on our current progress in developing navigational services within the context of the POSEIDON project. We carried out a semi-structured qualitative evaluation of an early version of our navigational services with 6 individuals with Down's Syndrome, and report on our findings.
Multisensory experiences have been increasingly applied in Human-Computer Interaction (HCI). In recent years, it has become commonplace to see the development of haptic, olfactory, and even gustatory displays to create more immersive experiences. Companies are proposing new additions to the multisensory world and are unveiling new products that promise to offer remarkable experiences exploiting mulsemedia—multiple sensorial media—where users can perceive odors, tastes, and the sensation of wind blowing against their face. Whilst researchers, practitioners, and users alike are faced with a wide range of such new devices, relatively little work has been undertaken to summarize efforts and initiatives in this area. The current article addresses this shortcoming in two ways: first, by presenting a survey of devices targeting senses beyond sight and hearing and, second, by describing an approach to guide newcomers and experienced practitioners alike in building their own mulsemedia environment, both in a desktop setting and in an immersive 360° environment.