2015
DOI: 10.1109/jsen.2015.2432127

Novel Tactile-SIFT Descriptor for Object Shape Recognition

Abstract: Using a tactile array sensor to recognize an object often requires multiple touches at different positions. This process tends to move or rotate the object, which inevitably makes object recognition more difficult. To cope with this unknown object movement, this paper proposes a new tactile-SIFT descriptor that extracts features from gradients in the tactile image to represent objects, so that the features are invariant to object translation and rotation. The tactile-SIFT segments a tactile image in…
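As a rough illustration of the gradient-based feature extraction described in the abstract, the following Python sketch computes standard SIFT keypoints and descriptors from a single tactile pressure frame with OpenCV. It is not the paper's Tactile-SIFT implementation; the taxel-array size, the min-max normalization, and the upsampling factor are assumptions made for the example.

```python
# Illustrative sketch only: SIFT-style gradient features from one tactile frame.
# Array size, normalization, and upsampling factor are assumptions, not the
# authors' Tactile-SIFT method.
import cv2
import numpy as np

def tactile_sift_features(pressure_map: np.ndarray):
    """Compute SIFT keypoints/descriptors from a raw tactile pressure array."""
    # Scale raw pressure values to an 8-bit grayscale image.
    img = cv2.normalize(pressure_map, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    # Tactile arrays are small (e.g. 14x6 taxels); upsample so SIFT has enough
    # spatial support to estimate gradient orientations.
    img = cv2.resize(img, None, fx=8, fy=8, interpolation=cv2.INTER_CUBIC)
    sift = cv2.SIFT_create()
    keypoints, descriptors = sift.detectAndCompute(img, None)
    return keypoints, descriptors

# Example with one synthetic tactile frame.
frame = np.random.rand(14, 6).astype(np.float32)
kp, desc = tactile_sift_features(frame)
print(f"{len(kp)} keypoints, descriptors: {None if desc is None else desc.shape}")
```

In the paper's approach the gradients are computed on the tactile image itself so that the resulting descriptors remain invariant to translation and rotation of the object under the sensor; the sketch above only shows the generic SIFT pipeline that Tactile-SIFT adapts.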

Cited by 101 publications (83 citation statements)
References 31 publications
“…Using a high-resolution tactile imager named GelSight, Li et al. [16] employed binary descriptors to match features extracted from tactile readings and created tactile maps of objects. In our previous work [17], [18], a new Tactile-SIFT descriptor was proposed based on the Scale Invariant Feature Transform (SIFT) descriptor [19]. In addition to the above hand-crafted features, in [20] unsupervised hierarchical feature learning was applied to extract features from raw tactile data for grasping and object recognition tasks.…”
Section: B. Tactile Patterns Based Recognition (mentioning, confidence: 99%)
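For readers unfamiliar with the binary-descriptor matching the quoted passage attributes to Li et al. [16], a minimal OpenCV sketch follows. ORB features and a brute-force Hamming matcher are used purely as stand-ins, assuming two 8-bit tactile images as input; this is not the method of [16].

```python
# Illustrative sketch only: matching binary descriptors between two tactile
# images. ORB and the Hamming brute-force matcher are assumptions, not the
# descriptors used in [16].
import cv2
import numpy as np

def match_tactile_frames(img_a: np.ndarray, img_b: np.ndarray):
    """Return sorted matches between binary descriptors of two 8-bit images."""
    orb = cv2.ORB_create(nfeatures=200)
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)
    if des_a is None or des_b is None:
        return []
    # Binary descriptors are compared with the Hamming distance.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    return sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)
```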
“…As the sensor interacts with objects, the foam gets compressed and the force is transferred to the sensor; thus the pressure values change. The maximum scanning rate of the sensor is 270 frames/s, but a rate of 5 frames/s was used in our experiments because initial studies found this sampling rate to be sufficient for the tasks [3], [17]. …”
Section: A. WTS Tactile Sensor (mentioning, confidence: 99%)
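The sampling-rate choice in the quoted passage (a 270 frames/s sensor read out at 5 frames/s) amounts to simple frame decimation. A minimal sketch, with the frame source and rates taken as assumptions:

```python
# Illustrative sketch only: thinning a high-rate tactile stream to a lower
# processing rate, e.g. 270 frames/s down to 5 frames/s.
def decimate_frames(frames, sensor_rate_hz=270, target_rate_hz=5):
    """Yield roughly every (sensor_rate/target_rate)-th frame of an iterable."""
    step = max(1, round(sensor_rate_hz / target_rate_hz))  # 54 for 270 -> 5
    for i, frame in enumerate(frames):
        if i % step == 0:
            yield frame
```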
“…Three kinds of representative objects that are usually used for tactile recognition research. Left: rigid objects used in the study by Luo et al. [14]; middle: textured materials used in the study by Sinapov et al. [15]; right: deformable objects used in the study by Drimus et al. [16]. The figures in the left and middle panels have been reproduced with permission from IEEE. The figure in the right panel has been reproduced with permission from Elsevier.…”
Section: Introduction (mentioning, confidence: 99%)