Fig. 1. The objective of this paper is to evaluate various virtual locomotion conditions by comparing reference trajectories with the virtual trajectories formed during goal-directed locomotion tasks. Reference trajectories (left) can be recorded through motion capture or generated through a numerical model of human locomotion. The paper demonstrates the framework (center) over a set of experimental trajectories (right). For the purpose of demonstration, this paper compares frequently used virtual locomotion conditions.

Abstract: Virtual walking, a fundamental task in Virtual Reality (VR), is greatly influenced by the locomotion interface being used, by the specificities of input and output devices, and by the way the virtual environment is represented. No matter how virtual walking is controlled, the generation of realistic virtual trajectories is essential for some applications, especially those dedicated to the study of walking behaviors in VR, navigation through virtual places for architecture, rehabilitation, and training. Previous studies evaluating the realism of locomotion trajectories have mostly considered the outcome of the locomotion task (efficiency, accuracy) and its subjective perception (presence, cybersickness). Few have focused on the locomotion trajectory itself, and then only in situations of geometrically constrained tasks. In this paper, we study the realism of unconstrained trajectories produced during virtual walking by addressing the following question: did the user reach his destination by virtually walking along a trajectory he would have followed in similar real conditions? To this end, we propose a comprehensive evaluation framework consisting of a set of trajectographical criteria and a locomotion model to generate reference trajectories. We consider a simple locomotion task where users walk between two oriented points in space. The travel path is analyzed both geometrically and temporally in comparison to simulated reference trajectories. In addition, we demonstrate the framework through a user study covering an initial set of common virtual walking conditions, namely different input devices, output display devices, control laws, and visualization modalities. The study provides insight into the relative contributions of each condition to the overall realism of the resulting virtual trajectories.
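As a rough illustration of what such trajectographical criteria can look like, the sketch below compares a virtual trajectory with a reference trajectory using two simple measures: the discrete Fréchet distance (geometric similarity) and a path-length ratio. This is a minimal sketch, not the paper's framework; the specific criteria, thresholds, and trajectory format are assumptions.

```python
# Minimal sketch (not the authors' framework) of two simple trajectographical
# criteria for comparing a virtual trajectory against a reference trajectory:
# discrete Frechet distance (geometric similarity) and path-length ratio.
import numpy as np

def discrete_frechet(p, q):
    """Discrete Frechet distance between polylines p (N, 2) and q (M, 2)."""
    n, m = len(p), len(q)
    d = np.linalg.norm(p[:, None, :] - q[None, :, :], axis=2)  # pairwise distances
    ca = np.full((n, m), np.inf)
    ca[0, 0] = d[0, 0]
    for i in range(1, n):
        ca[i, 0] = max(ca[i - 1, 0], d[i, 0])
    for j in range(1, m):
        ca[0, j] = max(ca[0, j - 1], d[0, j])
    for i in range(1, n):
        for j in range(1, m):
            ca[i, j] = max(min(ca[i - 1, j], ca[i - 1, j - 1], ca[i, j - 1]), d[i, j])
    return ca[-1, -1]

def path_length_ratio(p, q):
    """Ratio of travelled path lengths (virtual over reference)."""
    length = lambda t: np.sum(np.linalg.norm(np.diff(t, axis=0), axis=1))
    return length(p) / length(q)

# Example: a slightly curved virtual path versus a straight simulated reference.
virtual = np.array([[0.0, 0.0], [1.0, 0.2], [2.0, 0.1], [3.0, 0.0]])
reference = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0], [3.0, 0.0]])
print(discrete_frechet(virtual, reference), path_length_ratio(virtual, reference))
```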
In this paper, we propose a novel interface called Joyman, designed for immersive locomotion in virtual environments. Whereas many previous interfaces preserve or stimulate the user's proprioception, the Joyman aims at preserving equilibrioception in order to improve the feeling of immersion during virtual locomotion tasks. The proposed interface is based on the metaphor of a human-scale joystick. The device has a simple mechanical design that allows a user to indicate his virtual navigation intentions by leaning accordingly. We also propose a control law inspired by the biomechanics of human locomotion to transform the measured leaning angle into a walking direction and speed, i.e., a virtual velocity vector. A preliminary evaluation was conducted to assess the advantages and drawbacks of the proposed interface and to better identify future expectations for such a device.
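To make the idea of such a control law concrete, here is a minimal sketch of one possible mapping from a measured lean to a virtual velocity vector. It is not the Joyman control law itself; the dead zone, linear gain, and speed cap are illustrative assumptions.

```python
# Minimal sketch (not the Joyman control law) of mapping a measured leaning
# angle to a virtual velocity vector: a dead zone removes postural sway, then
# the remaining tilt is mapped linearly to a capped speed along the lean
# direction.
import numpy as np

MAX_SPEED = 1.5    # m/s, assumed comfortable virtual walking speed
MAX_LEAN = 0.30    # rad, lean angle producing maximum speed
DEAD_ZONE = 0.03   # rad, ignore small postural sway around upright

def lean_to_velocity(lean_forward, lean_lateral):
    """Map a 2D lean (rad) to a planar virtual velocity vector (m/s)."""
    lean = np.array([lean_forward, lean_lateral])
    angle = np.linalg.norm(lean)
    if angle < DEAD_ZONE:
        return np.zeros(2)
    direction = lean / angle
    speed = MAX_SPEED * min((angle - DEAD_ZONE) / (MAX_LEAN - DEAD_ZONE), 1.0)
    return speed * direction

print(lean_to_velocity(0.15, 0.05))  # moderate forward-right lean
```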
Real-time needle detection and tracking using a visually servoed 3D ultrasound probe. Pierre Chatelain, Alexandre Krupa, and Maud Marchal. IEEE Int. Conf. on Robotics and Automation (ICRA'13), Karlsruhe, Germany, May 2013, pp. 1668-1673.

Abstract: In this paper, we present a method to localize and track manually inserted needles in real time using a three-dimensional ultrasound probe mounted on a robotized arm. The system tracks the needle using online image processing. We first propose a new algorithm capable of robustly detecting a needle from the moment it is inserted, without any a priori information on the insertion direction. By combining the random sample consensus (RANSAC) algorithm with Kalman filtering in closed loop, we achieve robust real-time tracking of the needle. In addition, we propose a control scheme to automatically guide the ultrasound probe in order to keep the needle within the field of view while aligning its axis with the ultrasound beam. This method will ease the insertion of the needle by the operator and allow the development of autonomous needle insertion by medical robots.
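The detection step named above, RANSAC, fits a line model to candidate needle voxels. The sketch below shows a generic 3D RANSAC line fit under the assumption that bright voxels have already been thresholded into a point cloud; it is not the paper's implementation and omits the Kalman filtering loop.

```python
# Minimal sketch of RANSAC line fitting in 3D, the kind of detection step the
# paper combines with closed-loop Kalman filtering. Assumes candidate needle
# voxels are already extracted as a point cloud (N, 3).
import numpy as np

def ransac_line_3d(points, n_iter=200, inlier_tol=1.0):
    """Fit a 3D line (point, unit direction) to `points` with RANSAC."""
    best_inliers, best_model = None, None
    rng = np.random.default_rng(0)
    for _ in range(n_iter):
        i, j = rng.choice(len(points), size=2, replace=False)
        p0, p1 = points[i], points[j]
        d = p1 - p0
        norm = np.linalg.norm(d)
        if norm < 1e-9:
            continue
        d = d / norm
        # Distance of every point to the candidate line.
        diff = points - p0
        dist = np.linalg.norm(diff - np.outer(diff @ d, d), axis=1)
        inliers = dist < inlier_tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (p0, d)
    return best_model, best_inliers

# Synthetic example: noisy points along a line plus random outliers.
t = np.linspace(0, 50, 100)
line_pts = np.c_[t, 0.5 * t, 0.2 * t] + np.random.default_rng(1).normal(0, 0.3, (100, 3))
outliers = np.random.default_rng(2).uniform(0, 50, (30, 3))
model, inliers = ransac_line_3d(np.vstack([line_pts, outliers]))
print("direction:", model[1], "inlier count:", inliers.sum())
```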
Liver motion estimation and prediction during free breathing from 2D ultrasound images can substantially reduce in-plane motion uncertainty and hence treatment margins. Employing an accurate tracking method while avoiding non-linear temporal prediction would be favorable. This approach has the potential to shorten treatment time compared to breath-hold and gated approaches, and to increase treatment efficiency and safety.
Coil embolization offers a new approach to treat aneurysms. This medical procedure is notably less invasive than open surgery, as it relies on the deployment of very thin platinum-based wires within the aneurysm through the arteries. When performed intracranially, this procedure must be particularly accurate and therefore carefully planned and performed by experienced radiologists. A simulator of coil deployment represents an interesting and helpful tool for the physician by providing information on the coil behavior. In this paper, an original model is proposed to obtain interactive and accurate simulations of coil deployment. The model takes into account geometric nonlinearities and uses a shape memory formulation to describe the coil's complex geometry. An experimental validation is performed in a contact-free environment to identify the mechanical properties of the coil and to quantitatively compare the simulation with real data. Computational performances are also measured to ensure an interactive simulation.
Abstract. This paper presents a new modeling method for the insertion of needles and, more generally, thin and flexible medical devices into soft tissues. Several medical procedures rely on the insertion of slender medical devices, such as biopsies, brachytherapy, and deep-brain stimulation. In this paper, the interactions between soft tissues and flexible instruments are reproduced using a set of dedicated complementarity constraints. Each constraint is positioned and applied to the deformable models without requiring any remeshing. Our method allows for the 3D simulation of different physical phenomena such as puncture, cutting, and static and dynamic friction at interactive frame rates. To obtain realistic simulations, the model can be parametrized using experimental data. Our method is validated through a series of typical simulation examples and new, more complex scenarios.
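Complementarity constraints of this kind are typically gathered into a linear complementarity problem and solved iteratively. The sketch below shows a generic projected Gauss-Seidel solver for such a problem; it is only a loose illustration of the constraint-based idea, not the paper's solver, and the toy matrix and constraint values are invented.

```python
# Minimal sketch (not the paper's solver) of solving complementarity constraints
# 0 <= lam  complementary to  (A @ lam + b) >= 0 with projected Gauss-Seidel.
# In constraint-based needle-tissue coupling, lam would hold contact/friction
# forces and A a Delassus-like compliance matrix.
import numpy as np

def projected_gauss_seidel(A, b, n_iter=100):
    """Find lam >= 0 with A @ lam + b >= 0 and lam^T (A @ lam + b) = 0."""
    lam = np.zeros(len(b))
    for _ in range(n_iter):
        for i in range(len(b)):
            r = b[i] + A[i] @ lam - A[i, i] * lam[i]  # residual excluding own term
            lam[i] = max(0.0, -r / A[i, i])           # projection keeps force unilateral
    return lam

# Toy example: two coupled unilateral constraints (e.g., two contact points).
A = np.array([[2.0, 0.5], [0.5, 1.5]])  # assumed compliance matrix
b = np.array([-1.0, 0.3])               # assumed free-motion constraint violations
lam = projected_gauss_seidel(A, b)
print(lam, A @ lam + b)                 # lam >= 0 and complementarity holds
```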
The fingertips are one of the most important and sensitive parts of our body. They are the first stimulated areas of the hand when we interact with our environment. Providing haptic feedback to the fingertips in virtual reality could, thus, drastically improve perception and interaction with virtual environments. In this paper, we present a modular approach called HapTip to display such haptic sensations at the level of the fingertips. This approach relies on a wearable and compact haptic device able to simulate 2-degree-of-freedom (DoF) shear forces on the fingertip with a displacement range of ±2 mm. Several modules can be added and used jointly in order to address multi-finger and/or bimanual scenarios in virtual environments. For that purpose, we introduce several haptic rendering techniques to cover different cases of 3D interaction, such as touching a rough virtual surface or feeling the inertia or weight of a virtual object. In order to illustrate the possibilities offered by HapTip, we provide four use cases focused on touching or grasping virtual objects. To validate the efficiency of our approach, we also conducted experiments to assess the tactile perception obtained with HapTip. Our results show that participants can successfully discriminate the directions of the 2-DoF stimulation of our haptic device. We also found that participants could well perceive different weights of virtual objects simulated using two HapTip devices. We believe that HapTip could be used in numerous applications in virtual reality for which 3D manipulation and tactile sensations are often crucial, such as in virtual prototyping or virtual training.
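As a hypothetical illustration of a weight rendering technique for such a 2-DoF shear device, the sketch below maps the gravity force of a grasped virtual object to a downward tactor displacement, split across the fingers holding it and clamped to the ±2 mm range reported for the device. The stiffness gain and function names are assumptions, not the HapTip rendering code.

```python
# Hypothetical sketch (not the HapTip implementation) of weight rendering on a
# 2-DoF fingertip shear actuator: object weight -> downward tactor displacement.
import numpy as np

MAX_DISPLACEMENT_MM = 2.0   # actuator range reported for the device
STIFFNESS_N_PER_MM = 0.5    # assumed force-to-displacement mapping gain

def weight_to_shear(mass_kg, n_fingers=2, gravity=9.81):
    """Return the (x, y) tactor displacement in mm for one finger, y pointing up."""
    force_per_finger = mass_kg * gravity / n_fingers
    dy = -min(force_per_finger / STIFFNESS_N_PER_MM, MAX_DISPLACEMENT_MM)
    return np.array([0.0, dy])

print(weight_to_shear(0.2))  # a 200 g virtual object held with two fingers
```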
In this paper we present novel sensory feedbacks named "King Kong Effects" to enhance the sensation of walking in virtual environments. King Kong Effects are inspired by special effects in movies in which the approach of a gigantic creature is suggested by adding visual vibrations/pulses to the camera at each of its steps. In this paper, we propose to add artificial visual or tactile vibrations (King Kong Effects or KKE) at each footstep detected (or simulated) during the virtual walk of the user. The user can be seated, and our system proposes to use vibrotactile tiles located under his/her feet for tactile rendering, in addition to the visual display. We have designed different kinds of KKE based on vertical or lateral oscillations, physical or metaphorical patterns, and one or two peaks for heel-toe contact simulation. We have conducted different experiments to evaluate the preferences of users navigating with or without the various KKE. Taken together, our results identify the best choices for future uses of visual and tactile KKE, and they suggest a preference for multisensory combinations. Our King Kong Effects could be used in a variety of VR applications targeting the immersion of a user walking in a 3D virtual scene.
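A minimal sketch of what such a footstep-triggered effect could look like is given below: each detected footstep triggers a short decaying oscillation that can drive the camera offset (visual KKE) or a vibrotactile tile (tactile KKE), and two offset pulses give a heel-toe pattern. The frequencies, decay, and delays are illustrative assumptions, not the authors' parameters.

```python
# Minimal sketch (not the authors' implementation) of a King Kong Effect pulse
# generator: a decaying oscillation triggered at each footstep, with an optional
# second, attenuated pulse for the toe contact.
import numpy as np

def kke_pulse(t, amplitude=1.0, freq_hz=12.0, decay=8.0):
    """Decaying oscillation starting at t = 0 (t in seconds; t < 0 gives 0)."""
    t = np.asarray(t, dtype=float)
    pulse = amplitude * np.exp(-decay * np.maximum(t, 0.0)) * np.sin(2 * np.pi * freq_hz * t)
    return np.where(t >= 0.0, pulse, 0.0)

def heel_toe_pattern(t, toe_delay=0.12, toe_gain=0.6):
    """Two-peak pattern: heel strike followed by an attenuated toe contact."""
    return kke_pulse(t) + toe_gain * kke_pulse(t - toe_delay)

# Sample the vertical camera (or tile) offset over 0.5 s after a footstep event.
t = np.linspace(0.0, 0.5, 250)
offset = heel_toe_pattern(t)
```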