2019
DOI: 10.1109/lra.2019.2899192
From Pixels to Percepts: Highly Robust Edge Perception and Contour Following Using Deep Learning and an Optical Biomimetic Tactile Sensor

Abstract: Deep learning has the potential to have the impact on robot touch that it has had on robot vision. Optical tactile sensors act as a bridge between the subjects by allowing techniques from vision to be applied to touch. In this paper, we apply deep learning to an optical biomimetic tactile sensor, the TacTip, which images an array of papillae (pins) inside its sensing surface analogous to structures within human skin. Our main result is that the application of a deep CNN can give reliable edge perception and th…
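The abstract describes regressing edge percepts directly from tactile images with a deep CNN. A minimal sketch of that idea follows, assuming PyTorch, a single-channel 128x128 input, a small convolutional stack, and a two-value pose output (displacement and orientation); these specifics are illustrative assumptions, not the paper's architecture.

```python
# Hedged sketch (not the authors' network): a small CNN that regresses an
# edge pose directly from a tactile image of the TacTip pin array.
import torch
import torch.nn as nn

class TactileEdgeCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),            # global pooling over the pin image
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, 2),                   # assumed output: [displacement, angle]
        )

    def forward(self, x):                       # x: (batch, 1, H, W) grayscale frame
        return self.head(self.features(x))

# Example forward pass on a dummy 128x128 tactile frame
model = TactileEdgeCNN()
pose = model(torch.zeros(1, 1, 128, 128))       # -> tensor of shape (1, 2)
```

In a contour-following setting, the predicted pose would be fed to a simple controller that steps the sensor along the perceived edge; that control loop is outside the scope of this sketch.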

Cited by 106 publications (85 citation statements)
References 38 publications
“…A dotted pattern is printed on the inside of the touchpad surface. Recently, it has become possible for soft materials to be processed by 3D printers, and the elastic part with markers can be directly fabricated through 3D printing [27,35,67,68,69,70,71,72].…”
Section: Physical Contact to Light Conversion (mentioning)
confidence: 99%
“…Recent research has shown that this second technique gives improved results in contour-following tasks, particularly when concerned with robustness in an online setting [40]. Thus, in this article, the main approach will be to use raw images and neural networks; the hand has, however, been designed to accommodate both approaches.…”
Section: Tactile Data (mentioning)
confidence: 99%
“…To estimate the appropriate frame, we calculate the absolute pixel difference for each frame in a tactile video when compared with the first frame in the same video [40]. This gives a basic measure of sensor deformation.…”
Section: Tactile Data (mentioning)
confidence: 99%
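The statement above describes scoring each frame of a tactile video by its absolute pixel difference from the first frame. A hedged sketch of that measure, assuming the video is already loaded as a NumPy array of grayscale frames (the array layout and the mean reduction are assumptions, not details from the cited work):

```python
# Score each tactile frame by mean absolute pixel difference from the first
# frame, as a rough proxy for how deformed the sensing surface is.
import numpy as np

def frame_deformation_scores(frames: np.ndarray) -> np.ndarray:
    """frames: (num_frames, H, W) uint8 or float array of tactile images."""
    reference = frames[0].astype(np.float32)
    diffs = np.abs(frames.astype(np.float32) - reference)   # per-pixel |difference|
    return diffs.mean(axis=(1, 2))                           # one score per frame

# The frame with the largest score can be taken as the most-deformed contact frame:
# scores = frame_deformation_scores(video)
# best = int(scores.argmax())
```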
“…Following recent work with this tactile sensor [22], the preprocessed image captured by the sensor is directly passed into the machine learning algorithms. This removes the need for pin detection and tracking algorithms that were necessary in previous work, and enables the advantages of convolutional neural networks to be applied to tactile data.…”
Section: Hardware, A Custom Biomimetic Tactile Fingertip (mentioning)
confidence: 99%
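The statement above notes that the preprocessed sensor image is passed directly into the learning algorithm, removing any pin detection or tracking stage. A rough sketch of such a preprocessing step, assuming OpenCV; the centre crop, 128x128 resolution, and [0, 1] normalisation are illustrative choices, not the cited paper's exact pipeline:

```python
# Hedged preprocessing sketch: convert, crop, resize and normalise the raw
# camera frame, then hand it straight to the CNN (no pin tracking needed).
import cv2
import numpy as np

def preprocess(raw_bgr: np.ndarray) -> np.ndarray:
    gray = cv2.cvtColor(raw_bgr, cv2.COLOR_BGR2GRAY)        # drop colour
    h, w = gray.shape
    side = min(h, w)                                         # centre-crop to the pin array
    y0, x0 = (h - side) // 2, (w - side) // 2
    crop = gray[y0:y0 + side, x0:x0 + side]
    small = cv2.resize(crop, (128, 128))                     # downsample for the CNN
    return small.astype(np.float32) / 255.0                  # normalise to [0, 1]

# The resulting (128, 128) array is fed straight to the network, e.g.
# model(torch.from_numpy(preprocess(frame))[None, None])
```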