Kinaesthetic interaction using force-feedback devices is promising for virtual reality. However, such devices are currently unsuitable for interaction within large virtual spaces because of their limited workspace. We developed a novel gaze-based kinaesthetic interface that employs the user’s gaze to relocate the device workspace. The workspace switches to a new location when the user pulls the mechanical arm of the device to its reset position and gazes at the new target. This design enables robust relocation of the device workspace, achieving an effectively infinite interaction space while maintaining flexible hand-based kinaesthetic exploration. We compared the new interface with a traditional scaling-based interface in an experiment involving softness and smoothness discrimination. Our results showed that the gaze-based interface performed better than the traditional interface in terms of efficiency and kinaesthetic perception. It improves the user experience of kinaesthetic interaction in virtual reality without increasing eye strain.
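As a rough illustration of the relocation mechanism described above, the sketch below centres the device workspace on the gaze target whenever the arm returns to its reset position. The names (arm_position, fixation_point, the reset radius and workspace extent) are hypothetical stand-ins, not an API or parameters from the paper.

```python
import numpy as np

# Sketch of the gaze-based relocation mechanism. The device and gaze
# objects are hypothetical stand-ins, not an API from the paper.

RESET_RADIUS = 0.01  # metres; how close the arm must be to its reset position

class GazeRelocatedWorkspace:
    def __init__(self, device, gaze_tracker, half_extent=0.1):
        self.device = device          # reports arm position; reset pose at origin
        self.gaze = gaze_tracker      # reports the gazed-at point in world space
        self.half_extent = half_extent
        self.origin = np.zeros(3)     # world-space centre of the current workspace

    def interaction_point(self):
        arm = np.asarray(self.device.arm_position())
        if np.linalg.norm(arm) < RESET_RADIUS:
            target = self.gaze.fixation_point()
            if target is not None:
                # Arm is at its reset position and the user fixates a new
                # target: relocate the workspace there.
                self.origin = np.asarray(target)
        # Within the workspace, hand motion maps 1:1; the relocation step,
        # not scaling, provides the large-scale reach.
        arm = np.clip(arm, -self.half_extent, self.half_extent)
        return self.origin + arm
```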
Kinesthetic interaction typically employs force-feedback devices to provide kinesthetic input and feedback. However, the length of the mechanical arm limits the space with which users can interact. To overcome this challenge, a large control-display (CD) gain (>1) is often used to translate a small movement of the arm into a large movement of the onscreen interaction point. Although a large gain is commonly used, its effects on task performance (e.g., task completion time and accuracy) and user experience in kinesthetic interaction remain unclear. In this study, we compared a large CD gain with the unit CD gain as the baseline in a task involving kinesthetic search. Our results showed that the large gain reduced task completion time at the cost of task accuracy. The two gains did not differ in their effects on perceived hand fatigue, naturalness, and pleasantness, but the large gain negatively influenced user confidence in successful task completion.
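The CD gain itself is just a multiplicative mapping from device motion to display motion. A minimal sketch, with illustrative numbers that are not taken from the study:

```python
import numpy as np

def apply_cd_gain(device_delta, gain=1.0):
    """Map a device-space displacement to display space.

    gain == 1.0 is the unit-gain baseline (arm motion reproduced 1:1);
    gain > 1.0 amplifies motion, trading positional precision for reach.
    """
    return gain * np.asarray(device_delta, dtype=float)

# Illustrative numbers only: a 5 cm arm movement under a CD gain of 4
# moves the onscreen interaction point 20 cm.
onscreen = apply_cd_gain([0.05, 0.0, 0.0], gain=4.0)  # -> [0.2, 0.0, 0.0]
```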
As virtual and mixed reality hardware systems become more mainstream, users are spending substantial amounts of time in simulated environments. Unlike the transition from desktop to mobile devices, VR/XR utilizes a 360-degree wrap-around space that can be challenging to master even for experienced users. Tasks and tools commonly used in 2D environments on mobile and personal computing devices may not always be intuitive in VR space. For that reason, it is important to study and evaluate which common graphical user interface (GUI) techniques can be extended to VR/XR and how the efficiency of common 2D tools needs to be improved within a 360-degree space. In this study, the authors explore six commonly used GUI tools and evaluate them in a VR environment. The research examines how participants deconstruct 360-degree GUI tasks by identifying the location of the controls, navigating through the VR space to the relevant area, and finally adjusting the GUI controls as instructed. The study augments the interaction by providing vibrotactile navigation cues along with kinaesthetic and temperature-based feedback to complete the GUI tasks. Compared with the conventional visual-only techniques currently used in VR environments, vibrotactile, kinaesthetic, and temperature feedback provided faster task completion times and a more pleasant user experience. Participants also rated the additional feedback channels as more informative and less distracting within the virtual environment. Overall, the results show that participants preferred the novel use of haptic feedback for most of the GUI controls assessed in the study. Moreover, the results also show that some more complex GUI controls (i.e., dials, menus, and lists) may not be well suited to 360-degree VR interaction using visual-only information channels, especially with non-robust inside-out hand tracking. Additional research is needed to validate these results across different VR/XR hardware and simulated environments; however, the current results point towards utilizing multi-modal and multi-technology interaction tools to create more immersive and intuitive 360-degree virtual spaces across a wide range of VR/XR devices.
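The abstract does not specify how the vibrotactile navigation cues were generated. One plausible scheme, sketched below purely for illustration, scales vibration intensity with the angular offset between the user's current view direction and the target GUI control, so a stronger vibration signals a larger turn:

```python
import numpy as np

# Illustrative sketch only; this mapping is an assumption, not the
# mechanism reported in the study.

def navigation_cue_intensity(view_dir, target_dir, max_intensity=1.0):
    """Return a vibration intensity in [0, max_intensity].

    0 when the user faces the target control directly; max_intensity
    when the target is directly behind them.
    """
    view = np.asarray(view_dir, dtype=float)
    target = np.asarray(target_dir, dtype=float)
    view /= np.linalg.norm(view)
    target /= np.linalg.norm(target)
    angle = np.arccos(np.clip(np.dot(view, target), -1.0, 1.0))  # 0..pi radians
    return max_intensity * angle / np.pi

# Example: target 90 degrees to the left of the current view direction.
print(navigation_cue_intensity([0, 0, -1], [-1, 0, 0]))  # 0.5
```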