The fingertips are among the most important and sensitive parts of our body. They are the first areas of the hand to be stimulated when we interact with our environment. Providing haptic feedback to the fingertips in virtual reality could thus drastically improve perception of, and interaction with, virtual environments. In this paper, we present a modular approach called HapTip to display such haptic sensations at the fingertips. This approach relies on a wearable and compact haptic device able to simulate 2-Degree-of-Freedom (DoF) shear forces on the fingertip with a displacement range of ±2 mm. Several modules can be added and used jointly to address multi-finger and/or bimanual scenarios in virtual environments. For that purpose, we introduce several haptic rendering techniques covering different cases of 3D interaction, such as touching a rough virtual surface or feeling the inertia or weight of a virtual object. To illustrate the possibilities offered by HapTip, we provide four use cases focused on touching or grasping virtual objects. To validate the efficiency of our approach, we also conducted experiments to assess the tactile perception obtained with HapTip. Our results show that participants can successfully discriminate the directions of the 2-DoF stimulation of our haptic device. We also found that participants could readily perceive different weights of virtual objects simulated using two HapTip devices. We believe that HapTip could be used in numerous virtual reality applications for which 3D manipulation and tactile sensations are crucial, such as virtual prototyping or virtual training.
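The abstract does not detail the rendering law used to convey weight through the fingertip modules. As a rough, purely illustrative sketch of how a weight sensation might be mapped onto the ±2 mm, 2-DoF shear displacement mentioned above, one could split the gravity force of the grasped object across the grasping fingers, project it onto each fingertip's tangent plane, and scale the result into a clamped shear command. The gain, the even force split, and the function names below are assumptions, not the authors' method.

```python
import numpy as np

MAX_SHEAR_MM = 2.0    # displacement range stated in the abstract (+/-2 mm)
GAIN_MM_PER_N = 0.5   # hypothetical gain mapping tangential force to shear (mm/N)

def weight_to_shear(mass_kg: float, finger_normal: np.ndarray,
                    n_fingers: int = 2, g: float = 9.81) -> np.ndarray:
    """Hypothetical 2-DoF shear displacement (mm) for one fingertip module.

    The object's weight is split evenly across the grasping fingers and its
    component tangential to the fingertip (perpendicular to `finger_normal`)
    is scaled into a shear command, clamped to the device's +/-2 mm range.
    """
    gravity = np.array([0.0, 0.0, -mass_kg * g]) / n_fingers
    n = finger_normal / np.linalg.norm(finger_normal)
    # Build an orthonormal tangent basis (t1, t2) on the fingertip plane.
    helper = np.array([0.0, 0.0, 1.0]) if abs(n[2]) < 0.9 else np.array([1.0, 0.0, 0.0])
    t1 = np.cross(n, helper)
    t1 /= np.linalg.norm(t1)
    t2 = np.cross(n, t1)
    shear_mm = GAIN_MM_PER_N * np.array([gravity @ t1, gravity @ t2])
    return np.clip(shear_mm, -MAX_SHEAR_MM, MAX_SHEAR_MM)

# Example: a 0.4 kg virtual object grasped with two fingers, fingertip facing +x.
print(weight_to_shear(0.4, np.array([1.0, 0.0, 0.0])))
```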
Good performance in tele-surgery procedures is achieved when the surgeon acts and feels as if he were directly holding in his hands the surgical instruments interacting with the patient. To reach this goal, a high-fidelity haptic device was recently developed at CEA LIST. As the development of such an input device calls for a precise understanding of the application requirements, we first introduce in this paper the challenges of Minimally Invasive Surgery and the associated design guidelines. We then focus on the dimensioning and optimisation of the input device's performance to allow high-quality remote handling. Finally, we present the first and second generation master arm prototypes and their performance.
In this article, we present a new wearable haptic interface developed at CEA-LIST for precise finger interactions within virtual reality applications in large environments. The hand movements are tracked using a stereoscopic visual tracking system, allowing large movements in free space. Moreover, the device integrates two three-degree-of-freedom force-feedback robots, associated with the index finger and thumb, allowing fine manipulation of virtual objects. Finally, a two-degree-of-freedom tactile actuator is integrated under the pulp of each fingertip in order to improve the high-frequency response of the haptic interface and to provide information on the texture and shape of the manipulated virtual objects.
Exoskeletons are progressively reaching homes and workplaces, allowing interaction with virtual environments, remote control of robots, or assisting human operators in carrying heavy loads. Their design is, however, still a challenge: as these robots are mechanically linked to the operators who wear them, they have to meet ergonomic constraints in addition to the usual robotic requirements in terms of workspace, speed, or efforts. In particular, they have to fit the anthropometry and mobility of their users. This traditionally results in numerous prototypes which are progressively fitted to each individual person. In this paper, we propose instead to validate the design of a hand exoskeleton in a fully digital environment, without the need for a physical prototype. The purpose of this study is thus to examine whether finger kinematics are altered when using a given hand exoskeleton. To this end, user-specific musculoskeletal models were created and driven by a motion capture system to evaluate the fingers' joint kinematics when performing two industry-related tasks. The kinematic chain of the exoskeleton was added to the musculoskeletal models and its compliance with the hand movements was evaluated. Our results show that the proposed exoskeleton design does not influence the fingers' joint angles, the coefficient of determination between the models with and without the exoskeleton being consistently high (R¯ = 0.93) and the nRMSE consistently low (nRMSE¯ = 5.42°). These results are promising, and this approach combining musculoskeletal and robotic modeling driven by motion capture data could be a key factor in the ergonomic validation of the design of orthotic devices and exoskeletons prior to manufacturing.
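The abstract reports a mean coefficient of determination and a mean nRMSE between the joint-angle trajectories obtained with and without the exoskeleton, but not the exact formulas. The sketch below shows one common way to compute such metrics between two angle time series; the range-of-motion normalisation of the RMSE and the synthetic trajectories are assumptions for illustration only.

```python
import numpy as np

def coefficient_of_determination(ref: np.ndarray, test: np.ndarray) -> float:
    """Coefficient of determination of `test` against the reference trajectory."""
    ss_res = np.sum((ref - test) ** 2)
    ss_tot = np.sum((ref - ref.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def rmse_deg(ref: np.ndarray, test: np.ndarray) -> float:
    """Root-mean-square error between the two trajectories, in degrees."""
    return float(np.sqrt(np.mean((ref - test) ** 2)))

def nrmse(ref: np.ndarray, test: np.ndarray) -> float:
    """RMSE normalised by the reference range of motion (assumed definition)."""
    return rmse_deg(ref, test) / (ref.max() - ref.min())

# Example with two hypothetical index-finger flexion trajectories (degrees).
t = np.linspace(0.0, 1.0, 200)
angles_without_exo = 45.0 * np.sin(np.pi * t)                       # free-hand motion
angles_with_exo = angles_without_exo + np.random.normal(0, 2, t.size)  # with exoskeleton model
print(coefficient_of_determination(angles_without_exo, angles_with_exo),
      rmse_deg(angles_without_exo, angles_with_exo))
```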
Manual human-computer interfaces for virtual reality are designed to allow an operator to interact with a computer simulation as naturally as possible. Dexterous haptic interfaces are the best suited for this goal: they give intuitive and efficient control over the environment with haptic and tactile feedback. This paper aims to help in the choice of the interaction areas to be taken into account in the design of such interfaces. The literature dealing with hand interactions is first reviewed in order to identify the contact areas involved in exploration and manipulation tasks. Their frequencies of use are then extracted from existing recordings. The results are gathered in an original graphical interaction map allowing for a simple visualization of the way the hand is used, and compared with a map of mechanoreceptor densities. Then an interaction tree, mapping the relative amount of actions made available through the use of a given contact area, is built and correlated with the losses of hand function induced by amputations. Finally, a rating of some existing haptic interfaces and guidelines for their design are provided to illustrate a possible use of the developed graphical tools.