Humans rely on distributed tactile sensing in their hands to achieve robust and dexterous manipulation of delicate objects. Soft robotic hands have received increased attention in recent years due to their adaptability to unknown objects and safe interaction with the environment. However, the integration of distributed sensing in soft robotic hands is still lacking, largely due to the complexity of integrating soft sensing solutions into the hands. This paper proposes a novel soft robotic hand that incorporates an active palm and distributed pneumatic tactile sensing in both the fingers and the palm. Multi-material 3D printing allows the tactile sensors to be printed directly on the hand, whereas conventional tactile approaches require the sensors to be attached in separate fabrication steps. Active degrees of freedom are introduced in the palm to increase dexterity. The proposed hand successfully performed 32 of the 33 Feix taxonomy grasps and all 11 Kapandji thumb opposition poses.
This article addresses the problem of recognizing human hand touch with a robot covered by large-area tactile sensors. The problem is relevant in physical human–robot interaction for discriminating between human and non-human contacts, for triggering and driving cooperative tasks or robot motions, and for ensuring safe interaction. The underlying assumption is that voluntary physical interaction tasks involve hand touch over the robot body, so the capability to recognize hand contacts is a key element in discriminating a purposive human touch from other types of interaction. The proposed approach geometrically transforms the tactile data, formed by pressure measurements associated with a non-uniform cloud of 3D points (taxels) spread over a non-linear manifold corresponding to the robot body, into tactile images representing the contact pressure distribution in two dimensions. Tactile images can be processed with deep learning algorithms to recognize human hands and to compute the pressure distribution applied by the individual hand segments: the palm and single fingers. Experimental results on a real robot covered with robot skin show the effectiveness of the proposed methodology. To evaluate its robustness, various types of failures have been simulated. A further analysis of the system's transferability has been performed, considering contacts occurring on a different sensorized robot part.
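The core transformation the abstract describes, flattening a 3D taxel cloud with pressures into a 2D image, can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the PCA projection, grid size, and peak-pooling rule are all assumptions.

```python
import numpy as np

def tactile_image(points, pressures, grid=16):
    """Toy flattening of a 3D taxel cloud into a 2D pressure image:
    project taxels onto the cloud's two principal axes, then rasterize
    the pressures onto a grid x grid image (assumed, not the paper's method)."""
    pts = points - points.mean(axis=0)
    # principal axes of the cloud: top-2 right singular vectors
    _, _, vt = np.linalg.svd(pts, full_matrices=False)
    uv = pts @ vt[:2].T                              # 2D taxel coordinates
    lo, hi = uv.min(axis=0), uv.max(axis=0)
    idx = ((uv - lo) / (hi - lo + 1e-9) * (grid - 1)).astype(int)
    img = np.zeros((grid, grid))
    for (i, j), p in zip(idx, pressures):
        img[j, i] = max(img[j, i], p)                # keep peak pressure per cell
    return img

# example: 200 taxels on a gently curved patch with a central pressure blob
rng = np.random.default_rng(0)
xy = rng.uniform(-1, 1, size=(200, 2))
points = np.column_stack([xy, 0.1 * xy[:, 0] ** 2])  # slight curvature
pressures = np.exp(-np.sum(xy ** 2, axis=1) / 0.1)
img = tactile_image(points, pressures)
print(img.shape)   # (16, 16)
```

An image like `img` can then be fed to a standard 2D convolutional network, which is the point of the transformation: it makes ordinary image-based deep learning applicable to skin data.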
Tactile sensing enables robots to perceive certain physical properties of objects in contact; robots with tactile perception can classify textures by touch. Interestingly, textures with fine micro-geometry beyond the nominal resolution of the tactile sensor can also be identified through exploratory robotic movements such as sliding. To study the problem of fine texture classification, we design a robotic sliding experiment using a finger-shaped multi-channel capacitive tactile sensor. A feature extraction process is presented that encodes the acquired tactile signals (time series) into a low-dimensional (≤7D) feature vector. The feature vector captures the frequency signature of a fabric texture so that fabrics can be classified directly. The experiment covers multiple combinations of sliding parameters, i.e., speed and pressure, to investigate the correlation between the sliding parameters and the resulting feature space. Results show that changing the contact pressure can greatly affect the significance of the extracted feature vectors, whereas varying the sliding speed shows no apparent effect. In summary, this paper presents a study of texture classification on fabrics by training a simple k-NN classifier, using only one modality and one type of exploratory motion (sliding). The classification accuracy reaches up to 96%. The analysis of the feature space also suggests a potential parametric representation of textures for tactile perception, which could be used to adapt the motion for better classification performance.
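The pipeline the abstract outlines, a ≤7-D frequency-signature feature plus a k-NN classifier, can be illustrated with a toy sketch. The band-energy feature below and the synthetic sinusoidal "fabrics" are our assumptions for illustration; the paper's actual feature extraction may differ.

```python
import numpy as np

def freq_features(signal, n_bands=7):
    """Encode a tactile time series as normalised spectral energy in
    n_bands equal-width frequency bands -- a toy stand-in for the
    paper's <=7-D frequency-signature feature vector."""
    spec = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
    energy = np.array([b.sum() for b in np.array_split(spec, n_bands)])
    return energy / energy.sum()

def knn_predict(query, train_X, train_y, k=3):
    """Plain k-NN with Euclidean distance between feature vectors."""
    d = np.linalg.norm(train_X - query, axis=1)
    votes = train_y[np.argsort(d)[:k]]
    return int(np.bincount(votes).argmax())

# toy 'fabrics': sinusoidal micro-textures at three spatial frequencies
rng = np.random.default_rng(1)
t = np.arange(512)
def slide(f):  # noisy sensor reading while sliding over texture frequency f
    return np.sin(2 * np.pi * f * t) + 0.2 * rng.standard_normal(len(t))

train_X = np.array([freq_features(slide(f))
                    for f in (0.02, 0.08, 0.2) for _ in range(5)])
train_y = np.repeat([0, 1, 2], 5)
pred = knn_predict(freq_features(slide(0.08)), train_X, train_y)
print(pred)   # → 1
```

The sketch also hints at why sliding speed matters less than pressure: speed rescales the frequency axis uniformly, while pressure changes how strongly the texture signature stands out of the noise floor.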
A multi-material 3D-printed soft actuator is presented that uses symmetrical, parallel chambers to achieve bi-directional variable stiffness. Many recent soft robotic solutions involve multi-stage fabrication, provide variable stiffness in only one direction, or lack a means of reliably controlling the actuator stiffness. The use of multi-material 3D printing means complex monolithic designs can be produced without further fabrication steps. We demonstrate that this allows a high degree of repeatability between actuators and the ability to introduce different control behaviours into a single body. By independently varying the pressure in two parallel chambers, two control modes are proposed: complementary and antagonistic. We show that the actuator is able to tune its force output. The differential control significantly increases force output, with controllable stiffness enabled within a safe, low-pressure range (≤ 20 kPa). Experimental characterisations of angular range, repeatability between printed models, hysteresis, absolute maximum force, and beam stiffness are presented. The proposed design demonstrated a maximum bending angle of 102.6°, a maximum output force of 2.17 N, and a maximum beam stiffness of 0.96 mN·m².
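The two control modes can be made concrete with a toy linear model: bending follows the pressure *difference* between the chambers, stiffness the pressure *sum*. The linear form and the gains below are illustrative assumptions calibrated to the reported maxima, not the paper's actual characterisation.

```python
import numpy as np

# Assumed gains, chosen only so the extremes match the reported maxima
# (102.6 deg bend, 0.96 mN*m^2 beam stiffness at <= 20 kPa per chamber).
K_BEND = 102.6 / 20.0    # deg per kPa of differential pressure
K_STIFF = 0.96 / 40.0    # mN*m^2 per kPa of total pressure

def actuator_state(p1, p2, p_max=20.0):
    """Toy model: bending follows p1 - p2, stiffness follows p1 + p2;
    pressures are clamped to the safe low-pressure range (<= 20 kPa)."""
    p1, p2 = np.clip([p1, p2], 0.0, p_max)
    bend = K_BEND * (p1 - p2)       # degrees
    stiff = K_STIFF * (p1 + p2)     # mN*m^2
    return float(bend), float(stiff)

# complementary mode: one chamber pressurised -> maximum bend, lower stiffness
print(actuator_state(20, 0))
# antagonistic mode: both chambers pressurised -> zero net bend, max stiffness
print(actuator_state(20, 20))
```

The design point this captures is that the antagonistic mode decouples stiffness from pose: co-pressurising both chambers stiffens the beam without commanding any net bending.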
Tactile sensing is a key enabling technology for developing complex behaviours in robots interacting with humans or the environment. This paper discusses computational aspects that play a significant role when extracting information about contact events. Considering a large-scale, capacitance-based robot skin technology we developed in the past few years, we analyse the classical Boussinesq–Cerruti solution and Love's approach for solving a distributed inverse contact problem, from both a qualitative and a computational perspective. Our contribution is the characterisation of the algorithms' performance using a freely available dataset and data originating from surfaces covered with robot skin.
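As a pointer to the forward model these inverse methods rest on: the classical Boussinesq solution gives the surface displacement produced by a concentrated normal force on an elastic half-space, and the distributed contact problem superposes this kernel over the contact area (this is standard elasticity theory; the notation here is ours, not necessarily the paper's).

```latex
% Boussinesq point-load kernel: normal surface displacement u_z at
% radial distance r from a concentrated normal force P applied to an
% elastic half-space with Young's modulus E and Poisson's ratio \nu:
u_z(r) = \frac{(1 - \nu^2)\, P}{\pi E r}
```

Inverting a superposition of such kernels from a finite set of noisy capacitance readings is what makes the distributed inverse contact problem computationally delicate.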