The rapid advancement of soft robotic technology emphasizes the growing importance of tactile perception. Soft grippers equipped with tactile sensing can gather interactive information crucial for safe human–robot interaction, wearable devices, and dexterous manipulation. However, most soft grippers with tactile sensing abilities have limited modes of tactile perception, restricting their dexterity and safety. In addition, existing tactile systems are often complicated, leading to unstable perception signals. Inspired by various organisms, a novel multimodal tactile‐sensing soft robotic finger is proposed. This finger, based on a modified fin ray structure, integrates a distributed fiber optic sensing system as part of its tactile sensory neural system. It replicates human finger capabilities, discerning contact forces as low as 0.01 N with exceptional sensitivity (106.96 mN nm−1). By training neural network models, the finger achieves an accuracy exceeding 96% in recognizing roughness, material stiffness, and finger pad position. Assembled into a two‐finger parallel gripper, it demonstrates precise manipulation capabilities for fragile items like strawberries and potato chips. Moreover, through the synergistic interplay of multimodal tactile sensing, this finger can successfully grasp an underwater transparent sphere, mitigating the limitations of visual perception. The developed soft finger holds promise in various scenarios, including hazardous environment detection and specialized grasping tasks.
This article provides a comprehensive analysis of the feature extraction methods applied to vibro-acoustic signals (VA signals) in the context of robot-assisted interventions. The primary objective is to extract valuable information from these signals to better understand tissue behaviour and build upon prior research. This study is divided into three key stages: feature extraction using the Cepstrum Transform (CT), Mel-Frequency Cepstral Coefficients (MFCCs), and the Fast Chirplet Transform (FCT); dimensionality reduction employing techniques such as Principal Component Analysis (PCA), t-Distributed Stochastic Neighbour Embedding (t-SNE), and Uniform Manifold Approximation and Projection (UMAP); and, finally, classification using a nearest-neighbours classifier. The results demonstrate that these feature extraction techniques, especially the combination of CT and MFCC with dimensionality reduction algorithms, yield highly efficient outcomes. The classification metrics (Accuracy, Recall, and F1-score) approach 99%, and the clustering metric reaches 0.61. The CT–UMAP combination stands out across the evaluation metrics.
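The three-stage pipeline described above (cepstral feature extraction, dimensionality reduction, nearest-neighbours classification) can be sketched in a few lines of Python. This is a minimal illustration, not the authors' implementation: it uses a direct NumPy real cepstrum in place of the article's CT/MFCC/FCT extractors, PCA in place of UMAP/t-SNE, and synthetic two-class signals in place of recorded VA data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

def real_cepstrum(signal):
    """Real cepstrum: inverse FFT of the log-magnitude spectrum."""
    spectrum = np.abs(np.fft.fft(signal))
    return np.real(np.fft.ifft(np.log(spectrum + 1e-12)))

rng = np.random.default_rng(0)
n_per_class, n_samples = 60, 512
t = np.arange(n_samples)

# Synthetic stand-in for two tissue classes with different spectral content
X = np.vstack(
    [np.sin(2 * np.pi * 0.05 * t) + 0.3 * rng.standard_normal(n_samples)
     for _ in range(n_per_class)]
    + [np.sin(2 * np.pi * 0.15 * t) + 0.3 * rng.standard_normal(n_samples)
       for _ in range(n_per_class)]
)
y = np.array([0] * n_per_class + [1] * n_per_class)

# Stage 1: cepstral features (keep the low-quefrency coefficients)
features = np.array([real_cepstrum(x)[:40] for x in X])

# Stage 2: dimensionality reduction (PCA as a stand-in for UMAP/t-SNE)
reduced = PCA(n_components=5, random_state=0).fit_transform(features)

# Stage 3: nearest-neighbours classification
Xtr, Xte, ytr, yte = train_test_split(
    reduced, y, test_size=0.25, random_state=0, stratify=y)
clf = KNeighborsClassifier(n_neighbors=3).fit(Xtr, ytr)
accuracy = clf.score(Xte, yte)
```

Swapping PCA for `umap.UMAP` (from the third-party `umap-learn` package) would reproduce the CT–UMAP combination the article highlights, since both reducers expose the same `fit_transform` interface.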