Highlights:
• A flexible, stretchable, and durable fabric-based tactile sensor capable of capturing typical human interaction forces was developed.
• We present detailed measurement results of the sensor.
• A process for creating multiple sensor areas in a single fabric patch was developed.
• Measures against performance degradation due to moisture are presented.
• Using the developed technology, a tactile dataglove with 54 pressure-sensitive regions was built.

Keywords: Tactile sensor; Flexible tactile sensor; Stretchable tactile sensor; Tactile dataglove

Abstract: We introduce a novel fabric-based, flexible, and stretchable tactile sensor capable of seamlessly covering natural shapes. Because humans and robots have curved body parts that move with respect to each other, the practical use of traditional rigid tactile sensor arrays is limited; a flexible tactile skin is required instead. Our design allows several tactile cells to be embedded in a single sensor patch, which can have an arbitrary perimeter and can cover free-form surfaces. In this article we discuss the construction of the sensor and evaluate its performance. The sensor remains operational on top of soft padding such as a gel cushion, enabling the construction of a human-like soft tactile skin. It measures pressures from subtle values below 1 kPa up to more than 500 kPa, easily covering the common range of everyday human manual interaction. Owing to its layered construction, the sensor is very robust and can withstand normal forces several orders of magnitude higher than a human could exert, without sustaining damage. As an application of the sensor, we describe the construction of a wearable tactile dataglove with 54 tactile cells and embedded data-acquisition electronics. We also discuss the implementation details necessary to maintain long-term sensor performance in the presence of moisture.
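The abstract above describes piezoresistive taxels read out by embedded data-acquisition electronics. As a minimal illustrative sketch (not the authors' actual circuit or calibration), a single taxel can be modeled as a variable resistor in a voltage divider, with an assumed power-law fit mapping resistance to pressure; all constants and function names here are hypothetical:

```python
# Hypothetical sketch: reading one piezoresistive taxel through a voltage
# divider and mapping its resistance to pressure. Supply voltage, reference
# resistor, ADC width, and the calibration constants are assumptions, not
# values from the article.

V_SUPPLY = 3.3        # supply voltage (V)
R_REF = 10_000.0      # fixed reference resistor (ohms)
ADC_MAX = 4095        # 12-bit ADC full scale

def taxel_resistance(adc_counts: int) -> float:
    """Taxel resistance inferred from the ADC reading at the divider tap."""
    v = V_SUPPLY * adc_counts / ADC_MAX
    # Divider equation: v = V_SUPPLY * R_REF / (R_taxel + R_REF)
    return R_REF * (V_SUPPLY - v) / v

def pressure_kpa(r_taxel: float, a: float = 900.0, b: float = -0.35) -> float:
    """Assumed power-law calibration P = a * (R/1kOhm)^b.

    Piezoresistive fabrics typically drop in resistance under load,
    so pressure rises as resistance falls (b < 0)."""
    return a * (r_taxel / 1000.0) ** b
```

In a real glove with 54 taxels, a scanning multiplexer would apply this per-taxel conversion in sequence; per-taxel calibration curves would be fitted against a pressure reference rather than assumed.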
Abstract: We present an integrated sensing glove that combines two wearable sensing technologies to provide both hand-posture sensing and tactile pressure sensing in a single lightweight, stretchable device. Hand-posture reconstruction employs Knitted Piezoresistive Fabrics, which allow us to measure bending. From only five of these sensors (one per finger), the full pose of a 19-degrees-of-freedom (DOF) hand model is reconstructed by leveraging optimal sensor placement and estimation techniques. To this end, we exploit a priori information about synergistic coordination patterns in grasping tasks. Tactile sensing employs a piezoresistive fabric that measures normal forces in more than 50 taxels spread over the palmar surface of the glove. We describe both sensing technologies, report on the software integration of the two modalities, and describe a preliminary evaluation experiment analyzing hand postures and force patterns during grasping. The reconstruction results are promising and encourage us to push our approach further, with potential applications in neuroscience, virtual reality, robotics, and tele-operation.
We present a novel soft tactile skin built from a fabric-based, stretchable sensor technology that exploits the piezoresistive effect. Softness is achieved by combining a soft silicone padding with a skin of more durable, tear-proof silicone carrying an imprinted surface pattern that mimics human glabrous skin, found e.g. in the fingertips. Its very thin layer structure (starting from 2.5 mm) facilitates integration on existing robot surfaces, particularly on small and highly curved links. For example, we augmented our Shadow Dexterous Hand with 12 palm sensors, and with 2 and 3 sensors in the middle and proximal phalanges of each finger, respectively. To demonstrate the usefulness and efficiency of the proposed sensor skin, we performed a challenging classification task distinguishing squeezed objects by their varying stiffness.
The ability to discriminate between target and distractors, using the information perceived by the hand over time, is essential to perform haptic search successfully, be it for a human hand or a suitably sensorized anthropomorphic robot hand. To address the latter, we train a binary classifier to perform this discrimination during unconstrained haptic search performed by sighted study participants who were blindfolded. In this work, we test different representational concepts and compare the results with the human classification performance. This approach both guides our understanding of human haptic interaction with the 3D environment and aids future modeling of artificial touch for anthropomorphic robot hands. Our contribution is threefold. Firstly, we are able to acquire a synchronized multimodal time series of exceptionally high spatio-temporal resolution of both the 3D environment and the hand with our novel experimental setup. It includes our Modular Haptic Stimulus Board to represent a 3D environment and a novel tactile glove equipped with position tracking markers and joint angle sensors. Secondly, we introduce a machine learning approach inspired by a novel application of the feature guidance concept for vision (Wolfe et al., 2007 [1]) to modeling of haptic search in a 3D environment, focusing on the target-distractor discrimination. Finally, we compare results for two different types of artificial neural networks, a feedforward and a recurrent network. We show that using recurrent networks, and therefore integrating information over time, improves the classification results. The evaluation also shows that classification accuracy is the highest for the combination of both the tactile and the joint angle modalities.
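The abstract above reports that a recurrent network, which integrates the tactile and joint-angle time series over time, outperforms a feedforward classifier for target/distractor discrimination. As a minimal sketch of that idea (not the authors' architecture), a single-layer recurrent network can fold a multimodal frame sequence into one binary score; the input dimensions (54 taxels from the glove described above, an assumed number of joint-angle channels) and the randomly initialized weights standing in for trained parameters are illustrative assumptions:

```python
# Illustrative sketch: an Elman-style recurrent network that integrates a
# multimodal time series (tactile taxels concatenated with joint angles)
# into a single target-vs-distractor probability. Dimensions and weights
# are assumptions; real weights would come from supervised training.
import numpy as np

rng = np.random.default_rng(0)
N_TACTILE, N_JOINTS, HIDDEN = 54, 22, 32   # 22 joint channels is assumed
D_IN = N_TACTILE + N_JOINTS

# Randomly initialized parameters stand in for trained ones.
W_in = rng.normal(0.0, 0.1, (HIDDEN, D_IN))
W_rec = rng.normal(0.0, 0.1, (HIDDEN, HIDDEN))
w_out = rng.normal(0.0, 0.1, HIDDEN)

def classify_sequence(x_seq: np.ndarray) -> float:
    """x_seq: (T, D_IN) array of frames; returns P(target) after frame T."""
    h = np.zeros(HIDDEN)
    for x_t in x_seq:                       # integrate evidence over time
        h = np.tanh(W_in @ x_t + W_rec @ h) # recurrent state update
    logit = w_out @ h
    return float(1.0 / (1.0 + np.exp(-logit)))  # sigmoid -> probability

p = classify_sequence(rng.normal(size=(100, D_IN)))
```

The recurrent state `h` is what a feedforward baseline lacks: a feedforward classifier sees each frame (or a fixed window) in isolation, whereas here evidence accumulates across the whole exploration sequence.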