Figure 1. Applications enabled by SonicSpray: (a) SonicSpray visualises a graphical clock in mid-air by laterally oscillating its narrow mist, (b) a game of whack-a-mole, (c) projection of a butterfly moving in 3D space, (d) a visual of a person in a video call application.

ABSTRACT
Permeable mid-air displays, such as those using fog or water mist, are limited by our ability to shape and control the aerosol, and face two major issues: (1) the size and complexity of the system, and (2) the creation of laminar flow to retain display quality. Here we present SonicSpray, a technique using ultrasonic Bessel beams to create reconfigurable mid-air displays. We build a prototype from low-cost, off-the-shelf parts. We explore the potential and limitations of SonicSpray for creating and redirecting laminar flows of fog. We demonstrate a working prototype that precisely controls laminar aerosols through only a 6x6 array of ultrasound transducers. We describe the implementation steps to build the device, verify the control and projection algorithm for the display, and evaluate its performance. We finally report our exploration of several useful applications in learning, entertainment, and the arts.
3D selection in dense VR environments (e.g., point clouds) is extremely challenging due to occlusion and imprecise mid-air input modalities (e.g., 3D controllers and hand gestures). In this paper, we propose "Slicing-Volume", a hybrid selection technique that enables simultaneous 3D interaction in mid-air and a 2D pen-and-tablet metaphor in VR. Inspired by well-known slicing-plane techniques in data visualization, our technique consists of a 3D volume that encloses target objects in mid-air, which are then projected to a 2D tablet view for precise selection on a tangible physical surface. While slicing techniques and tablets-in-VR have been previously explored, in this paper we evaluate the potential of this hybrid approach to improve accuracy in highly occluded selection tasks, comparing different multimodal interactions (e.g., Mid-air, Virtual Tablet and Real Tablet). Our results showed that our hybrid technique significantly improved overall selection accuracy compared to Mid-air selection alone, thanks to the added haptic feedback given by the physical tablet surface, rather than the added visualization given by the tablet view.
TESTING & TUNING OUR CORRECTION TECHNIQUES
We conducted paired comparisons with Bonferroni corrections to compare the Drift effect with both correction techniques (Derivative and Angular) in each configuration of α and ε (49 configurations in total). Table S1 shows the means and p values of this comparison. Results suggest a poor performance of the Derivative correction when compared with the uncorrected Drift. Averages with SD of all the configurations are shown in Figure S1.

Table S1 (first row; last p value truncated in source):
α      ε     Uncorrected  Derivative  Angular  Unc. vs Der.  Unc. vs Ang.  Der. vs Ang.
0.125  0.35  0.566 m      0.665 m     0.374 m  p=0.037       p<0.001       p<0.
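The comparison procedure above can be sketched as follows. This is a minimal illustration, not the authors' analysis code: the 7x7 grid of α and ε values, the sample sizes, and the drift measurements are all fabricated assumptions; only the structure (paired t statistics over 49 configurations, with a Bonferroni-adjusted significance threshold) follows the description.

```python
import math
from statistics import mean, stdev

def paired_t(x, y):
    """Paired t statistic for two matched samples of equal length."""
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    return mean(d) / (stdev(d) / math.sqrt(n))

n_configs = 49          # assumed 7 alpha x 7 epsilon grid
tests_per_config = 3    # Uncorrected/Derivative, Uncorrected/Angular, Derivative/Angular
m = n_configs * tests_per_config
bonferroni_alpha = 0.05 / m   # per-test threshold after Bonferroni correction

# Toy matched drift errors in metres for one configuration (fabricated values,
# loosely echoing the magnitudes in Table S1).
uncorrected = [0.55, 0.60, 0.58, 0.57, 0.54, 0.59]
angular     = [0.36, 0.40, 0.38, 0.35, 0.37, 0.39]
t = paired_t(uncorrected, angular)  # positive t: Angular yields smaller drift
```

Each raw p value would then be declared significant only if it falls below `bonferroni_alpha`, which is equivalent to the Bonferroni-corrected p values reported in Table S1.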
Commercial Virtual Reality (VR) controllers with realistic force feedback are becoming available to increase the realism and immersion of first-person shooting (FPS) games in VR. These controllers attempt to mimic not only the shape and weight of real guns but also their recoil effects (linear force feedback parallel to the barrel when the gun is shot). As these controllers become more popular and affordable, this paper investigates the actual effects that these properties (shape, weight, and especially directional force feedback) have on performance for general VR users (e.g. users with no marksmanship experience), drawing conclusions for both consumers and device manufacturers. We created a prototype replicating the properties exploited by commercial VR controllers (i.e. shape, weight and adjustable force feedback) and used it to assess the effect of these parameters on user performance across a series of user studies. We first analysed the benefits to user performance of adding weight and shape vs a conventional controller (e.g. Vive controller). We then explored the implications of adding linear force feedback (LFF), as well as replicating the shape and weight. Our studies show negligible effects on immediate shooting performance, with some improvements in subjective appreciation, which are already present at low levels of LFF. While higher levels of LFF do not increase subjective appreciation any further, they lead users to reach their maximum distance skillset more quickly. This indicates that while adding low levels of LFF can be enough to influence users' immersion/engagement in gaming contexts, controllers with higher levels of LFF might be better suited for training environments and/or particularly demanding aiming tasks.