2020
DOI: 10.1111/cgf.14154

An Occlusion‐aware Edge‐Based Method for Monocular 3D Object Tracking using Edge Confidence

Abstract: We propose an edge-based method for 6DOF pose tracking of rigid objects using a monocular RGB camera. One of the critical problems for edge-based methods is searching for the object contour points in the image that correspond to the known 3D model points. However, previous methods often produce false object contour points in the case of cluttered backgrounds and partial occlusions. In this paper, we propose a novel edge-based 3D object tracking method to tackle this problem. To search the object contour points, foregrou…
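The correspondence-search step described in the abstract is the usual starting point for edge-based trackers: projected model contour points are matched to nearby image edges by a 1D search along their 2D normals. Below is a minimal sketch of that generic step in Python/NumPy; the function name search_contour_correspondences and its interface are hypothetical, and this is not the paper's occlusion-aware variant, which additionally uses an edge confidence to reject false contour points.

```python
import numpy as np

def search_contour_correspondences(grad_mag, points, normals, search_radius=12):
    """For each projected model contour point, search along its 2D normal for
    the strongest image gradient and return that pixel as the matched edge point.

    grad_mag : (H, W) image gradient magnitude
    points   : (N, 2) projected contour points, (x, y) in pixels
    normals  : (N, 2) unit 2D normals of the projected contour
    """
    h, w = grad_mag.shape
    matches = np.full(points.shape, np.nan)
    for i, (p, nrm) in enumerate(zip(points, normals)):
        best = -np.inf
        for s in range(-search_radius, search_radius + 1):
            x, y = np.round(p + s * nrm).astype(int)
            if 0 <= x < w and 0 <= y < h and grad_mag[y, x] > best:
                best, matches[i] = grad_mag[y, x], (x, y)
    return matches
```

As the abstract notes, picking the strongest gradient along each normal is exactly what breaks down under clutter and partial occlusion, which motivates the confidence-based filtering the paper proposes.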

Cited by 18 publications (13 citation statements). References: 36 publications.

Citation statements (ordered by relevance):
“…Second, our method adopts robust estimation with α < 2 to handle erroneous correspondences. In previous methods α = 2, so the optimization process is sensitive to correspondence errors, and complex filtering and weighting techniques are thus necessary [8,17]. We will show that, by setting α to a small value, erroneous correspondences can be well suppressed with a simple weighting function ω_i (see Section 3.3).…”
Section: Robust Contour-Based Tracking (citation type: mentioning)
confidence: 88%
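This statement contrasts α = 2 (ordinary least squares, sensitive to outliers) with α < 2, where gross correspondence errors are down-weighted by a simple weight ω_i derived from the robust loss. The citing paper defines its exact loss and weight in its Section 3.3; purely as an illustration, the sketch below uses Barron's general robust loss, whose IRLS weight w(r) = ρ'(r)/r is constant at α = 2 and suppresses large residuals for small α.

```python
import numpy as np

def robust_weight(r, alpha=0.5, c=1.0):
    """IRLS weight w(r) = rho'(r) / r for Barron's general robust loss.
    alpha = 2 reduces to ordinary least squares (constant weight); a small
    alpha strongly down-weights large residuals (outlier correspondences)."""
    if abs(alpha - 2.0) < 1e-9:
        return np.full_like(r, 1.0 / c**2, dtype=float)
    return (1.0 / c**2) * ((r / c) ** 2 / abs(alpha - 2.0) + 1.0) ** (alpha / 2.0 - 1.0)

residuals = np.array([0.1, 0.5, 2.0, 10.0])
print(robust_weight(residuals, alpha=2.0))   # uniform weights: least squares
print(robust_weight(residuals, alpha=0.5))   # large residuals suppressed
```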
“…However, early contour-based methods are known to be sensitive to background clutter, which may cause wrong contour correspondences. To address this problem, local color information is leveraged to improve the correspondences [8,17], which effectively improves accuracy.…”
Section: Related Work (citation type: mentioning)
confidence: 99%
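The idea of using local color to validate contour correspondences, as in the methods the statement cites as [8,17], can be illustrated with a toy check: a candidate edge point is plausible only if the pixel just inside the projected contour resembles a foreground color model and the pixel just outside resembles a background model. The sketch below uses simple mean-color models and a hypothetical color_score helper; the cited methods use richer local color statistics, and the convention that the normal points outward is an assumption.

```python
import numpy as np

def color_score(image, pt, normal, fg_mean, bg_mean, offset=3):
    """Toy color check for a candidate contour point pt (x, y): the pixel just
    inside the contour should resemble the foreground color model and the pixel
    just outside the background model (assumes the normal points outward)."""
    pt, normal = np.asarray(pt, dtype=float), np.asarray(normal, dtype=float)

    def pixel(q):
        x, y = np.round(q).astype(int)
        return image[y, x].astype(float)

    inside = pixel(pt - offset * normal)
    outside = pixel(pt + offset * normal)
    # Higher (less negative) score means a more color-consistent candidate.
    return -np.linalg.norm(inside - fg_mean) - np.linalg.norm(outside - bg_mean)
```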
“…Edge‐based methods [HS90, DC02, SPP*14, IP15, WZQ17, CRV*18, WZQ19, HZSQ20] generally begin by detecting the object's edges and then matching them with the contours projected by the 3D model to optimize the object's pose. However, these methods can be easily disrupted by chaotic backgrounds.…”
Section: Related Work (citation type: mentioning)
confidence: 99%
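Once edge correspondences are established, the pipeline this statement describes refines the 6DOF pose by minimizing point-to-line distances between projected model contour points and their matched image edges. The following sketch shows a generic Gauss-Newton refinement of that kind, using numerical Jacobians and hypothetical helper names for brevity; it is not the implementation of any specific cited method.

```python
import numpy as np

def project(K, R, t, X):
    """Project 3D model points X (N, 3) with pose (R, t) and intrinsics K."""
    Xc = X @ R.T + t                      # camera-frame points
    uv = Xc @ K.T                         # homogeneous pixel coordinates
    return uv[:, :2] / uv[:, 2:3]

def so3_exp(w):
    """Rodrigues' formula: rotation matrix from an axis-angle vector w."""
    th = np.linalg.norm(w)
    if th < 1e-12:
        return np.eye(3)
    k = w / th
    Kx = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(th) * Kx + (1 - np.cos(th)) * Kx @ Kx

def refine_pose(K, R, t, X, m, n, iters=10):
    """Gauss-Newton refinement of (R, t): drive the projected model contour
    points onto their matched image edge points m (N, 2) along the edge
    normals n (N, 2), i.e. minimize point-to-line residuals."""
    for _ in range(iters):
        def residuals(xi):                # xi = (axis-angle, translation) perturbation
            return np.sum(n * (project(K, so3_exp(xi[:3]) @ R, t + xi[3:], X) - m), axis=1)
        r0, J, eps = residuals(np.zeros(6)), np.empty((len(X), 6)), 1e-6
        for j in range(6):                # numerical Jacobian, column by column
            d = np.zeros(6); d[j] = eps
            J[:, j] = (residuals(d) - r0) / eps
        xi = np.linalg.lstsq(J, -r0, rcond=None)[0]
        R, t = so3_exp(xi[:3]) @ R, t + xi[3:]   # apply the pose update
    return R, t
```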
“…Key-point features such as SIFT (Lowe 2004), ORB (Rublee et al. 2011), or BRISK (Leutenegger et al. 2011) have been widely used for 3D object tracking (Wagner et al. 2010; Vacchetti et al. 2004), with more recent developments like LIFT (Yi et al. 2016) and SuperGlue (Sarlin et al. 2020) introducing deep learning at various stages. Explicit edges provide an additional source of information that is used by many approaches (Huang et al. 2020; Bugaev et al. 2018; Seo et al. 2014; Comport et al. 2006; Drummond and Cipolla 2002; Harris and Stennett 1990). Also, direct methods (Engel et al. 2018; Seo and Wuest 2016; Crivellaro and Lepetit 2014), which optimize a photometric error and can be traced back to Lucas and Kanade (1981), have been proposed.…”
Section: 3D Object Tracking (citation type: mentioning)
confidence: 99%
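For the direct methods mentioned in this statement, the quantity being optimized is a photometric error: the difference between template intensities and the current image sampled at warped pixel locations, as in Lucas and Kanade (1981). A toy sketch is given below; the photometric_residuals helper and the pure-translation warp are illustrative assumptions, and nearest-neighbour sampling is used only to keep the code short.

```python
import numpy as np

def photometric_residuals(template, image, warp):
    """Photometric error of a direct method: template intensities minus the
    current image sampled at the warped pixel locations (real trackers use
    sub-pixel interpolation rather than nearest-neighbour sampling)."""
    h, w = template.shape
    res = np.zeros((h, w))
    for y in range(h):
        for x in range(w):
            u, v = np.round(warp(x, y)).astype(int)
            if 0 <= u < image.shape[1] and 0 <= v < image.shape[0]:
                res[y, x] = float(image[v, u]) - float(template[y, x])
    return res

# Example: a pure-translation warp, the classic Lucas-Kanade setting.
# residuals = photometric_residuals(T, I, lambda x, y: np.array([x + 2.0, y - 1.0]))
```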
“…Results of the evaluation are shown in Table 1. Our approach is compared to the current state of the art in region-based tracking, as well as the edge-based method of Huang et al. (2020), the algorithms of Li et al. (2021) and Sun et al. (2021) that combine edge and region information, and the method of Liu et al. (2021) that uses descriptor fields in addition to region-based techniques. The comparison shows that SRT3D performs significantly better than previous methods, achieving superior results for most objects and performing best on average.…”
Section: RBOT Dataset (citation type: mentioning)
confidence: 99%