2012 IEEE International Conference on Robotics and Automation
DOI: 10.1109/icra.2012.6225371

Voting-based pose estimation for robotic assembly using a 3D sensor

Abstract: We propose a voting-based pose estimation algorithm applicable to 3D sensors, which are fast replacing their 2D counterparts in many robotics, computer vision, and gaming applications. It was recently shown that a pair of oriented 3D points, which are points on the object surface with normals, in a voting framework enables fast and robust pose estimation. Although oriented surface points are discriminative for objects with sufficient curvature changes, they are not compact and discriminative enough for many in…

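The abstract builds on the oriented point pair feature used in voting-based pose estimation: each pair of surface points with normals is described by a distance and three angles, discretized, and hashed so that scene pairs can vote for consistent model poses. The following is a minimal sketch of that descriptor and the offline model hash table; the function names, quantization steps, and table layout are illustrative assumptions, not the authors' reference implementation.

```python
# Sketch of the oriented point pair feature (PPF) and the offline model table
# used in voting-based pose estimation. Quantization steps are assumptions.
import numpy as np

def point_pair_feature(p1, n1, p2, n2):
    """4D descriptor for a pair of oriented points (positions + unit normals)."""
    d = p2 - p1
    dist = np.linalg.norm(d)
    if dist < 1e-9:
        return None
    d_hat = d / dist
    # Angles between the connecting vector and each normal, and between normals.
    a1 = np.arccos(np.clip(np.dot(n1, d_hat), -1.0, 1.0))
    a2 = np.arccos(np.clip(np.dot(n2, d_hat), -1.0, 1.0))
    a3 = np.arccos(np.clip(np.dot(n1, n2), -1.0, 1.0))
    return np.array([dist, a1, a2, a3])

def quantize(feature, dist_step=0.01, angle_step=np.deg2rad(12)):
    """Discretize the feature so similar pairs hash to the same bin."""
    steps = np.array([dist_step, angle_step, angle_step, angle_step])
    return tuple((feature // steps).astype(int))

def build_model_table(points, normals):
    """Offline stage: hash every ordered model point pair by its quantized feature."""
    table = {}
    for i, (pi, ni) in enumerate(zip(points, normals)):
        for j, (pj, nj) in enumerate(zip(points, normals)):
            if i == j:
                continue
            f = point_pair_feature(pi, ni, pj, nj)
            if f is not None:
                table.setdefault(quantize(f), []).append((i, j))
    return table
```

At runtime, scene pairs are described and quantized the same way; each matching model pair then casts a vote for a candidate pose, and peaks in the accumulator give the pose hypotheses.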
Cited by 154 publications (83 citation statements); References 27 publications.
“…Choi et al. [6] extended it using points on object boundaries and defined surface-to-boundary (S2B) and boundary-to-boundary (B2B) pair features. They showed that for planar objects, pair features including boundary points encode more information for pose estimation and provide better performance.…”
Section: Contributions (mentioning)
Confidence: 99%
“…Learning vs. uniform voting: We compared learning results with the baseline algorithm: uniform (equal) weights for each pair feature [7, 6]. The results were computed for both synthetic and real data and are shown in Table 1 and Table 2, respectively.…”
Section: Industrial Objects (mentioning)
Confidence: 99%
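The comparison in that excerpt is between casting one vote per matched pair (uniform weights) and scaling each vote by a learned weight for its feature bin. A minimal sketch of such an accumulator is shown below; `model_table` is the hash table from the earlier sketch, and `weights` is an assumed lookup of learned per-bin weights, not the cited papers' training procedure.

```python
# Hedged sketch contrasting uniform voting with learned per-feature weights.
import numpy as np

def cast_votes(scene_pairs, model_table, n_model_points, n_angle_bins,
               weights=None):
    """Accumulate votes over (model reference point, rotation-angle bin).

    scene_pairs: iterable of (quantized_feature_key, alpha_bin) from the scene.
    weights: optional dict mapping feature keys to learned weights;
             None reproduces the uniform (equal-weight) baseline.
    """
    acc = np.zeros((n_model_points, n_angle_bins))
    for key, alpha_bin in scene_pairs:
        w = 1.0 if weights is None else weights.get(key, 0.0)
        for m_ref, _m_other in model_table.get(key, []):
            acc[m_ref, alpha_bin] += w   # uniform vote (w=1) vs. learned weight
    return acc
```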