Mixed prototyping is an industrial practice that combines virtual and real components to build a prototype of a product, which is then used to evaluate and assess design choices. Recently, mixed prototypes have also been used to assess the usability of product interfaces. This application raises several problems related to the devices and interaction techniques that best allow a natural interaction with the mixed prototype. This paper presents a mixed reality environment for usability evaluation that addresses two specific problems of this kind of application: the occlusion between real and virtual objects, and the interpretation of the user's gestures while he/she is interacting with the elements of the product interface. In particular, we propose a technique that manages both problems using only commodity hardware and video processing algorithms, thus avoiding expensive data gloves and tracking devices. The proposed approach has been validated through a user study designed to establish whether, and to what extent, the augmented reality devices and the proposed techniques may distort the usability assessment of the product. The user study also compares the mixed reality environment adopted in this study with a classical virtual reality set-up.
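The abstract does not detail the occlusion-handling algorithm. As a minimal sketch of one common commodity-hardware approach (an assumption, not the paper's actual method), the user's real hand can be segmented from the camera image with a rough skin-colour heuristic and composited over the rendered scene, so that the hand correctly occludes virtual objects; the RGB thresholds below are illustrative and would need calibration in practice:

```python
import numpy as np

def composite_hand_over_virtual(camera_frame, rendered_frame):
    """Return a frame in which the user's (skin-coloured) hand from the
    camera image occludes the rendered virtual scene.

    camera_frame, rendered_frame: HxWx3 uint8 RGB images of equal size.
    """
    cam = camera_frame.astype(np.int16)
    r, g, b = cam[..., 0], cam[..., 1], cam[..., 2]
    # Very rough RGB skin heuristic (illustrative only; a real system
    # would use a calibrated colour model plus morphological clean-up).
    mask = (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b) & (r - g > 15)
    # Hand pixels come from the camera, everything else from the render.
    return np.where(mask[..., None], camera_frame, rendered_frame)
```

The same binary mask can also feed the gesture-interpretation step, since the hand silhouette is already isolated from the background.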
The importance of participatory design (PD) is steadily increasing thanks to its capacity to explore a wide variety of concepts, which increases the chances of creating a successful product. The design process should not be a solo activity: designers often need input and other points of view, especially from end-users. According to the core idea of PD, end-users are actively involved in the various activities of product development to ensure that their needs and desires are satisfied. This paper presents a novel approach to the participatory design of product interfaces in a user-centered design (UCD) process. The approach is based on an interactive tool that allows end-users to design custom user interfaces for household appliances, drawing on their own needs and experiences. The tool incorporates the analytical and more abstract knowledge of the designers, codified in the form of aesthetic, technological and manufacturing constraints (i.e., limits on the number and geometry of interface components, a limited number of colors, and a discretization of the area where interface widgets are placed). This solution allows end-users to design their favorite interface directly, without the interference of any other party. Through a careful analysis of the choices made by the users, the designers can access the deepest level of the users' expression and capture their latent needs and tacit knowledge. The tool has been designed so that usability tests can be performed immediately on the designed interface using a Mixed Reality prototype. The paper describes the development of the tool and proposes a methodology specifically devised to include this tool in a design process based on UCD principles. Both the tool and the methodology are presented through a case study on the redesign of a washing machine dashboard.
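The constraint scheme described above (a limited number of components, a restricted palette, and a discretized placement grid) can be sketched as a small validation layer; all names and values below are illustrative assumptions, not the constraints from the paper's washing machine case study:

```python
from dataclasses import dataclass

# Illustrative designer-defined constraints (assumed values).
GRID_STEP = 10                        # mm: widgets snap to a placement grid
MAX_WIDGETS = 8                       # cap on interface components
PALETTE = {"white", "grey", "blue"}   # restricted color set

@dataclass
class Widget:
    kind: str    # e.g. "knob", "button", "display"
    x: int       # position on the dashboard, in mm
    y: int
    color: str

def snap_to_grid(value):
    """Discretize a coordinate chosen by the end-user."""
    return round(value / GRID_STEP) * GRID_STEP

def validate_layout(widgets):
    """Check a user-designed layout against the designers' constraints."""
    if len(widgets) > MAX_WIDGETS:
        return False, "too many components"
    for w in widgets:
        if w.color not in PALETTE:
            return False, f"color {w.color!r} not in palette"
        if w.x % GRID_STEP or w.y % GRID_STEP:
            return False, f"{w.kind} is off the placement grid"
    return True, "ok"
```

Encoding the designers' knowledge as checks of this kind is what lets end-users explore freely while every resulting layout remains aesthetically and technologically feasible.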
Experimental results show that the proposed tool can effectively support the design of product interfaces during PD sessions.
The validation of a product interface is often a critical issue in the design process. Virtual reality and mixed reality (MR) can enhance the interactive simulation of the product human-machine interface (HMI), as these technologies allow engineers to involve end users directly in the usability assessment. This paper describes a MR environment specifically aimed at the usability evaluation of a product interface, which allows the simulation of the HMI behaviour using the same models and the same software employed by engineers during the design phase. Our approach is based on the run-time connection between the visualisation software and the simulators used for product design and analysis. In particular, we use Matlab/Simulink to model and simulate the product behaviour, and Virtools to create the interactive MR environment in which the end user can test the product. Thanks to this architecture, any modification made to the behaviour models is immediately testable in MR.
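The run-time connection between the behaviour simulator and the MR visualisation can be sketched as a simple message bridge. The sketch below uses UDP with JSON messages purely as an assumption; the port number, the message fields and the encoding are illustrative, and the paper's actual Simulink/Virtools link may work quite differently:

```python
import json
import socket

# Illustrative endpoint of the behaviour simulator (assumed address).
SIM_ADDR = ("127.0.0.1", 9100)

def send_user_event(sock, widget_id, action):
    """Forward an interaction event (e.g. a pressed virtual button)
    from the MR environment to the behaviour model."""
    msg = json.dumps({"widget": widget_id, "action": action})
    sock.sendto(msg.encode(), SIM_ADDR)

def receive_hmi_state(sock):
    """Read back the simulated HMI state (lamps, display text, ...)
    so the visualisation can update the mixed prototype."""
    data, _ = sock.recvfrom(4096)
    return json.loads(data.decode())
```

A loose coupling of this kind is what makes the architecture attractive: because the visualisation only exchanges state messages at run time, edits to the behaviour model need no re-export or re-integration step before testing.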
The use of haptic devices in Virtual Reality applications makes interaction with digital objects easier by involving the sense of touch in the simulation. The most widespread devices are stylus-based, so the user interacts with the virtual world via either a tool or a stylus. These devices have been used effectively in several virtual prototyping applications, allowing users to interact easily with the digital model of a product. Among the open issues in these applications is the choice of the set-up and of the techniques adopted to combine the visual and the haptic stimuli. This paper compares three solutions specifically devised for virtual prototyping applications, and in particular for usability assessment. The first is a simple desktop configuration in which the user looks at a screen, with the visual and haptic stimuli presented in a non-collocated manner. The second is an HMD-based set-up in which the user has a more natural, first-person immersive interaction. The third requires a video see-through HMD, which augments the virtual scene with a view of the user's real hand. The user tests on these three set-ups were designed to study the effect of two factors that are crucial for the effectiveness and user-friendliness of the interaction: the collocated perception of the visual and haptic stimuli, and the visualization of one's own hand during interaction with the virtual product.