In the last decade, we have witnessed a drastic change in the form factor of audio and vision technologies, from heavy, grounded machines to lightweight devices that naturally fit our bodies. Only recently, however, have haptic systems started to be designed with wearability in mind. The wearability of haptic systems enables novel forms of communication, cooperation, and integration between humans and machines. Wearable haptic interfaces can communicate with their human wearers during interaction with a shared environment, in a natural yet private way. This paper presents a taxonomy and review of wearable haptic systems for the fingertip and the hand, focusing on systems that directly address wearability challenges. The paper also discusses the main technological and design challenges in developing wearable haptic interfaces, and it reports on the future perspectives of the field. Finally, the paper includes two tables summarizing the characteristics and features of the most representative wearable haptic systems for the fingertip and the hand.
Wearability will significantly increase the use of haptics in everyday life, as has already happened for audio and video technologies. The literature on wearable haptics is mainly focused on vibrotactile stimulation, and only recently have wearable devices conveying richer stimuli, such as force vectors, been proposed. This paper introduces design guidelines for wearable haptics and presents a novel 3-DoF wearable haptic interface able to apply force vectors directly to the fingertip. It consists of two platforms: a static one, placed on the back of the finger, and a mobile one, responsible for applying forces at the finger pad. The structure of the device resembles that of parallel robots, with the fingertip placed between the static and the moving platforms. This work presents the design of the wearable display, along with the quasi-static modeling of the relationship between the applied forces and the platform's orientation and displacement. The device can exert up to 1.5 N, with a maximum platform inclination of 30 degrees. To validate the device and verify its effectiveness, a curvature discrimination experiment was carried out: employing the wearable device together with a popular haptic interface improved performance with respect to employing the haptic interface alone.
A novel sensory substitution technique is presented. Kinesthetic and cutaneous force feedback are substituted by cutaneous feedback (CF) only, provided by two wearable devices able to apply forces to the index finger and the thumb while holding a handle during a teleoperation task. The force pattern fed back to the user through the cutaneous devices is similar, in terms of intensity and area of application, to the cutaneous force pattern applied to the finger pad while interacting with a haptic device providing both cutaneous and kinesthetic force feedback. The pattern generated using the cutaneous devices can be thought of as the complete haptic feedback (HF) minus its kinesthetic part. For this reason, we refer to this approach as sensory subtraction rather than sensory substitution. A needle insertion scenario is considered to validate the approach. The haptic device is connected to a virtual environment simulating a needle insertion task. Experiments show that the perception of inserting a needle using cutaneous-only force feedback is nearly indistinguishable from that felt by the user with both cutaneous and kinesthetic feedback. Like most sensory substitution approaches, the proposed sensory subtraction technique also has the advantage of not suffering from the stability issues of teleoperation systems due, for instance, to communication delays. Moreover, experiments show that the sensory subtraction technique outperforms sensory substitution with more conventional visual feedback (VF).
Despite the expected clinical benefits of haptic feedback, current teleoperated surgical robots do not provide it to the surgeon, largely because grounded forces can destabilize the system's closed-loop controller. This paper presents an alternative approach that enables the surgeon to feel fingertip contact deformations and vibrations while guaranteeing the teleoperator's stability. We implemented our cutaneous feedback solution on an Intuitive Surgical da Vinci Standard robot by mounting a SynTouch BioTac tactile sensor to the distal end of a surgical instrument and a custom cutaneous display to the corresponding master controller. As the user probes the remote environment, the contact deformations, DC pressure, and AC pressure (vibrations) sensed by the BioTac are directly mapped to input commands for the cutaneous device's motors using a model-free algorithm based on look-up tables. The cutaneous display continually moves, tilts, and vibrates a flat plate at the operator's fingertip to optimally reproduce the tactile sensations experienced by the BioTac. We tested the proposed approach by having eighteen subjects use the augmented da Vinci robot to palpate a heart model with no haptic feedback, only deformation feedback, and deformation plus vibration feedback. Fingertip deformation feedback significantly improved palpation performance by reducing the task completion time, the pressure exerted on the heart model, and the subjects' absolute error in detecting the orientation of the embedded plastic stick. Vibration feedback significantly improved palpation performance only for the seven subjects who dragged the BioTac across the model rather than pressing straight into it.
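The core of such a model-free mapping is a calibrated look-up table: each sensed quantity is interpolated against measured break points to produce a motor command, with no model of the contact dynamics in between. The sketch below illustrates the idea in minimal form; the table values, units, and function name are hypothetical and are not taken from the actual BioTac-to-motor calibration.

```python
from bisect import bisect_right

# Hypothetical calibration table: sensed DC pressure (raw sensor units)
# paired with motor position commands (degrees). A real system would
# populate these from measurements on the hardware.
SENSOR_VALUES = [0.0, 50.0, 100.0, 200.0, 400.0]
MOTOR_COMMANDS = [0.0, 5.0, 12.0, 25.0, 40.0]

def sensor_to_motor(pressure):
    """Map a sensed pressure to a motor command by piecewise-linear
    interpolation in the look-up table (model-free: no contact
    dynamics are modeled)."""
    # Clamp readings outside the calibrated range to the table ends.
    if pressure <= SENSOR_VALUES[0]:
        return MOTOR_COMMANDS[0]
    if pressure >= SENSOR_VALUES[-1]:
        return MOTOR_COMMANDS[-1]
    # Locate the bracketing break points and interpolate linearly.
    i = bisect_right(SENSOR_VALUES, pressure)
    x0, x1 = SENSOR_VALUES[i - 1], SENSOR_VALUES[i]
    y0, y1 = MOTOR_COMMANDS[i - 1], MOTOR_COMMANDS[i]
    t = (pressure - x0) / (x1 - x0)
    return y0 + t * (y1 - y0)
```

In practice one such table would be kept per sensed channel (deformation, DC pressure, AC vibration), each driving its own actuator command.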
Cutaneous haptic feedback can be used to enhance the performance of robotic teleoperation systems while guaranteeing their safety. Delivering ungrounded cutaneous cues to the human operator in fact conveys information about the forces exerted at the slave side without affecting the stability of the control loop. In this work we analyze the feasibility, effectiveness, and implications of providing solely cutaneous feedback in robotic teleoperation. We carried out two peg-in-hole experiments, one in a virtual environment and one in a real (teleoperated) environment. Two novel 3-degree-of-freedom fingertip cutaneous displays delivered a suitable amount of cutaneous feedback at the thumb and index finger. Results confirmed the feasibility and effectiveness of the proposed approach. Cutaneous feedback was outperformed by full haptic feedback provided by grounded haptic interfaces, but it outperformed conditions providing no force feedback at all. Moreover, cutaneous feedback always kept the system stable, even in the presence of destabilizing factors such as communication delays and hard contacts.
Although Augmented Reality (AR) has been around for almost five decades, only recently have we witnessed AR systems and applications entering our everyday lives. Representative examples of this technological revolution are the smartphone games "Pokémon GO" and "Ingress" and the Google Translate real-time sign interpretation app. Even though AR applications are already quite compelling and widespread, users are still not able to physically interact with the computer-generated reality. In this respect, wearable haptics can provide the compelling illusion of touching the superimposed virtual objects without constraining the motion or the workspace of the user. In this paper, we present the experimental evaluation of two wearable haptic interfaces for the fingers in three AR scenarios, enrolling 38 participants. In the first experiment, subjects were asked to write on a virtual board using a real chalk. The haptic devices provided the interaction forces between the chalk and the board. In the second experiment, subjects were asked to pick and place virtual and real objects. The haptic devices provided the interaction forces due to the weight of the virtual objects. In the third experiment, subjects were asked to balance a virtual sphere on a real cardboard. The haptic devices provided the interaction forces due to the weight of the virtual sphere rolling on the cardboard. Providing haptic feedback through the considered wearable devices significantly improved performance in all the considered tasks. Moreover, subjects significantly preferred the conditions providing wearable haptic feedback.
A study on the role of cutaneous and kinesthetic force feedback in teleoperation is presented. Cutaneous cues provide less transparency than kinesthetic force feedback, but they do not affect the stability of the teleoperation system. On the other hand, kinesthesia provides a compelling illusion of telepresence but affects the stability of the haptic loop. However, when employing common grounded haptic interfaces, it is not possible to independently control the cutaneous and kinesthetic components of the interaction. For this reason, many control techniques ensure a stable interaction by scaling down both kinesthetic and cutaneous force feedback, even though acting on the cutaneous channel is not necessary.

We discuss here the feasibility of a novel approach. It aims at improving the realism of the haptic rendering, while preserving its stability, by modulating cutaneous force to compensate for a lack of kinesthesia. We carried out two teleoperation experiments, evaluating (1) the role of cutaneous stimuli when reducing kinesthesia and (2) the extent to which an overactuation of the cutaneous channel can fully compensate for a lack of kinesthetic force feedback. Results showed that, to some extent, it is possible to compensate for a lack of kinesthesia with the aforementioned technique, without significant performance degradation. Moreover, users reported a high level of comfort in using the proposed system.
Wearable technologies have gained great popularity in recent years. The demand for lightweight and compact devices challenges researchers to pursue innovative solutions that make existing technologies more portable and wearable. In this paper we present a novel wearable cutaneous fingertip device with 3 degrees of freedom. It is composed of two parallel platforms: the upper body is fixed on the back of the finger and houses three small servo motors, while the mobile end-effector is in contact with the volar surface of the fingertip. The two platforms are connected by three articulated legs, actuated by the motors to move the mobile platform toward the user's fingertip and re-angle it to simulate contacts with arbitrarily oriented surfaces. Each leg is composed of two rigid links, connected to each other and then to the platforms according to an RRS (Revolute-Revolute-Spherical) kinematic chain. With respect to similar cable-driven devices presented in the literature, this device resolves the indeterminacy due to the underactuation of the platform. This work presents the main design steps for the development of the wearable display, along with its kinematics, quasi-static modeling, and control. In particular, we analyzed the relationship between device performance and its main geometrical parameters. A perceptual experiment shows that the cutaneous device can effectively render different platform configurations.
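To give a flavor of what a quasi-static model of such a device relates, one can consider the simplest possible stand-in: treat the fingertip pad as a linear spring, so that pressing the mobile platform into the pad by a given displacement, at a given tilt, yields a force along the platform normal. This is only an illustrative sketch, not the authors' model (real pad compliance is nonlinear and the actual model involves the full RRS leg kinematics); the stiffness constant and function name below are hypothetical.

```python
import math

# Hypothetical fingertip pad stiffness; a real device would be
# calibrated against measured (nonlinear) pad compliance.
PAD_STIFFNESS_N_PER_MM = 0.6

def quasi_static_force(displacement_mm, tilt_deg):
    """Return (vertical, horizontal) force components, in newtons,
    for a platform pressed displacement_mm into the pad and tilted
    by tilt_deg. Linear-spring sketch: F = k * x along the platform
    normal, then projected onto the finger frame."""
    # Negative displacement means the platform is not in contact.
    f = PAD_STIFFNESS_N_PER_MM * max(displacement_mm, 0.0)
    tilt = math.radians(tilt_deg)
    return f * math.cos(tilt), f * math.sin(tilt)
```

Inverting such a relationship, from a desired force to a platform displacement and tilt, is what lets the controller translate force commands into motor angles.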