2021
DOI: 10.3390/e23111537

Target Classification Method of Tactile Perception Data with Deep Learning

Abstract: To improve the accuracy of manipulator operation, it is necessary to install a tactile sensor on the manipulator to obtain tactile information and accurately classify a target. However, with the increasing uncertainty and complexity of tactile sensing data and the continuing development of tactile sensors, typical machine-learning algorithms often cannot solve the problem of target classification from pure tactile data. Here, we propose a new model by combining a convolutional neural network …

Cited by 6 publications (6 citation statements)
References 33 publications
“…As shown in Table 4, the top-1 score of the proposed method surpasses those of Sundaram et al. (2019), Wang et al. (2021), Zhang et al. (2021), and Sharma (2022) by 16.43, 16.81, 8.72, and 6.99%, which indicates that our method has the highest classification accuracy on the MIT-STAG dataset. Meanwhile, the KC of the proposed method surpasses those of Sundaram et al. (2019), Wang et al. (2021), Zhang et al. (2021), and Sharma (2022) by 17.18, 17.57, 9.22, and 7.4%, which indicates that our method has the lowest degree of confusion, which can also be seen in Figure 4.…”
Section: Methods (mentioning)
confidence: 71%
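For readers who want to reproduce this kind of comparison, the sketch below shows how a top-1 accuracy and a kappa coefficient can be computed from per-sample predictions. It is a minimal illustration under the assumption that KC in the excerpt denotes Cohen's kappa coefficient; the toy labels are hypothetical and are not data from the MIT-STAG experiments.

```python
import numpy as np

def top1_accuracy(y_true, y_pred):
    """Fraction of samples whose top-scoring class matches the true label."""
    return float(np.mean(np.asarray(y_true) == np.asarray(y_pred)))

def cohens_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa: agreement corrected for chance, via the confusion matrix."""
    cm = np.zeros((n_classes, n_classes), dtype=np.int64)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    n = cm.sum()
    p_observed = np.trace(cm) / n                       # observed agreement
    p_expected = (cm.sum(0) * cm.sum(1)).sum() / n**2   # agreement expected by chance
    return (p_observed - p_expected) / (1.0 - p_expected)

# Toy usage with hypothetical labels (3 classes, 6 samples)
y_true = [0, 1, 2, 2, 1, 0]
y_pred = [0, 1, 2, 1, 1, 0]
print(top1_accuracy(y_true, y_pred), cohens_kappa(y_true, y_pred, n_classes=3))
```

Kappa discounts agreement that would occur by chance given the class distributions, which is why a lower degree of confusion shows up as a higher KC even when raw accuracy differences are modest.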
“…Comparisons with state-of-the-art models in terms of confusion matrix on the MIT-STAG dataset. (A) Sundaram et al., 2019; (B) Wang et al., 2021; (C) Zhang et al., 2021; (D) Sharma, 2022; (E) Ours.…”
Section: Methods (mentioning)
confidence: 93%
“…Zhang et al. [27] introduced an innovative approach to collecting tactile datasets by utilizing multiple sensors. This advancement enabled the acquisition of more intricate tactile data.…”
Section: Cross Modal Tactile Datasets (mentioning)
confidence: 99%
“…The contribution of Zhang et al. lies in two key aspects [27]. Firstly, they optimized a ResNet10-v1 architecture that combines a convolutional neural network (CNN) [35] and a residual network (ResNet) [36] to extract features from tactile information.…”
Section: Deep Learning (mentioning)
confidence: 99%
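For context on the kind of architecture this excerpt names, the following is a minimal sketch of a shallow residual CNN for tactile feature extraction, assuming a PyTorch implementation and a single-channel pressure-map input (e.g., 32×32). The layer widths, depth, and class count are illustrative assumptions, not the authors' exact ResNet10-v1.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Two 3x3 convolutions with a skip connection, the basic residual unit."""
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.conv1 = nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(out_ch)
        self.conv2 = nn.Conv2d(out_ch, out_ch, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(out_ch)
        # 1x1 projection so the skip path matches shape when channels/stride change
        self.skip = (nn.Identity() if stride == 1 and in_ch == out_ch
                     else nn.Conv2d(in_ch, out_ch, 1, stride=stride, bias=False))
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + self.skip(x))

class TactileResNetSketch(nn.Module):
    """Illustrative shallow residual CNN for single-channel tactile maps."""
    def __init__(self, n_classes=10):
        super().__init__()
        self.stem = nn.Sequential(nn.Conv2d(1, 16, 3, padding=1, bias=False),
                                  nn.BatchNorm2d(16), nn.ReLU(inplace=True))
        self.blocks = nn.Sequential(ResidualBlock(16, 32, stride=2),
                                    ResidualBlock(32, 64, stride=2))
        self.head = nn.Linear(64, n_classes)

    def forward(self, x):                       # x: (batch, 1, H, W) pressure maps
        feats = self.blocks(self.stem(x))
        feats = feats.mean(dim=(2, 3))          # global average pooling
        return self.head(feats)

# Toy forward pass on a hypothetical batch of 32x32 tactile frames
logits = TactileResNetSketch(n_classes=10)(torch.randn(4, 1, 32, 32))
print(logits.shape)  # torch.Size([4, 10])
```

The skip connections are the point of the residual design: they let the tactile features learned in earlier layers pass through unchanged, which eases optimization in deeper variants of the network.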
“…Here, we further develop our initial inquiry as to what type of algorithm the biological nervous system might employ to perform the classification of tactile inputs, with an eye towards extending such algorithms to artificial agents (Bandyopadhyaya et al 2014, Taunyazov et al 2020, Zhang et al 2021). We examine the brain algorithms that might be at work when a trained animal, a rat in our case, classifies a tactile vibration.…”
Section: Introduction (mentioning)
confidence: 99%