Proceedings of International Conference on Robotics and Automation
DOI: 10.1109/robot.1997.606861

Pose-independent recognition of convex objects from sparse tactile data

Cited by 6 publications (8 citation statements)
References 12 publications

“…Shape recognition has long been based on the notion of "interpretation tree," which represents correspondences between features extracted from the tactile data and those on the model [17], [21], [22]. A volumetric approximation [3] can be built over tactile data to enhance feature selection and prune incompatible models.…”
Section: B. Tactile Shape Recognition
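To make the interpretation-tree idea concrete, here is a minimal sketch: each level of the tree assigns one tactile data feature to a model feature, and branches that violate pairwise-distance consistency are pruned. Representing features as points, the tolerance `EPS`, and all names below are illustrative assumptions, not details taken from the cited papers.

```python
# Sketch of interpretation-tree matching between tactile data features
# and model features, with pairwise-distance pruning (assumed setup).
import itertools
import math

EPS = 1e-3  # distance-consistency tolerance (assumed)

def consistent(assignment, data_pts, model_pts):
    """Binary constraint: every pair of correspondences must preserve
    the distance between features, since a rigid motion is an isometry."""
    for (i, m1), (j, m2) in itertools.combinations(assignment, 2):
        d_data = math.dist(data_pts[i], data_pts[j])
        d_model = math.dist(model_pts[m1], model_pts[m2])
        if abs(d_data - d_model) > EPS:
            return False
    return True

def interpretations(data_pts, model_pts, assignment=(), depth=0):
    """Depth-first traversal of the interpretation tree, yielding every
    consistent assignment of data features to model features."""
    if depth == len(data_pts):
        yield assignment
        return
    for m in range(len(model_pts)):
        candidate = assignment + ((depth, m),)
        if consistent(candidate, data_pts, model_pts):
            yield from interpretations(data_pts, model_pts, candidate, depth + 1)

# Example: two contact points matched against a triangular model.
model = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
data = [(0, 0, 0), (1, 0, 0)]
print(list(interpretations(data, model)))
```

Inconsistent branches are cut as soon as they appear, which is what keeps the search tractable despite the exponential number of possible correspondences.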
“…In [3], internal and external volumetric approximations of a convex object were built over sparse tactile data. The main disadvantage of this method was its limited applicability: it handles convex objects only.…”
Section: B. Tactile Shape Recognition
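The internal/external bracketing described above can be sketched in a few lines, assuming each tactile reading supplies a contact point and an outward surface normal: the convex hull of the contact points lies inside any convex object touching them, while the intersection of the tangent half-spaces encloses it. The sample contacts and variable names below are illustrative, not from [3].

```python
# Sketch of internal/external volumetric approximations of a convex
# object from sparse tactile contacts (points + outward normals).
import numpy as np
from scipy.spatial import ConvexHull, HalfspaceIntersection

# Illustrative contacts sampled on a unit sphere; for a sphere the
# outward normal at p is p itself.
contacts = np.array([
    [ 1.0, 0.0, 0.0], [-1.0, 0.0, 0.0],
    [ 0.0, 1.0, 0.0], [ 0.0,-1.0, 0.0],
    [ 0.0, 0.0, 1.0], [ 0.0, 0.0,-1.0],
])
normals = contacts / np.linalg.norm(contacts, axis=1, keepdims=True)

# Internal approximation: convex hull of the contact points.
inner = ConvexHull(contacts)

# External approximation: intersection of tangent half-spaces
# n·x <= n·p, encoded for SciPy as rows [n, -n·p] with n·x + b <= 0.
offsets = -(normals * contacts).sum(axis=1, keepdims=True)
halfspaces = np.hstack([normals, offsets])
interior_point = contacts.mean(axis=0)  # must lie strictly inside
outer = ConvexHull(HalfspaceIntersection(halfspaces, interior_point).intersections)

# The object's true volume is bracketed between the two bounds.
print(inner.volume, outer.volume)  # 4/3 (octahedron) vs 8 (cube)
```

As more contacts arrive, the two hulls tighten toward the true surface, which is what makes the bounds useful for pruning incompatible models.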
“…For instance, [13] uses a network of triangular B-spline patches defined over an arbitrary topological domain while maintaining tangent-plane continuity. Techniques for recognizing convex polyhedra from sparse tactile data, using volumetric representations, are presented in [14]. The performance of a hybrid force/velocity controller for automatic edge following is analyzed in [15].…”
Section: Introduction
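For the hybrid force/velocity scheme mentioned last, a minimal sketch of one control cycle follows: a constant tangential velocity drives the tool along the edge while a proportional force loop, mapped to a corrective velocity, regulates contact force along the surface normal. The gains, setpoints, and sensor values are hypothetical placeholders, not parameters from [15].

```python
# Sketch of one cycle of hybrid force/velocity edge following
# (assumed gains and interfaces; not the controller of [15]).
import numpy as np

F_DESIRED = 2.0   # desired normal contact force [N] (assumed)
K_F = 0.002       # force-error gain [m/s per N] (assumed)
V_TANGENT = 0.01  # feed rate along the edge [m/s] (assumed)

def hybrid_step(normal, tangent, measured_force):
    """Velocity control along the edge tangent; force control, expressed
    as a corrective velocity, along the surface normal."""
    normal = normal / np.linalg.norm(normal)
    tangent = tangent / np.linalg.norm(tangent)
    v_force = K_F * (F_DESIRED - measured_force)  # push harder if force is low
    return V_TANGENT * tangent + v_force * normal

# Example cycle with made-up sensor values:
v_cmd = hybrid_step(np.array([0.0, 0.0, 1.0]),
                    np.array([1.0, 0.0, 0.0]),
                    measured_force=1.5)
print(v_cmd)  # commanded Cartesian velocity for the robot
```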