2015
DOI: 10.1007/978-3-319-16181-5_49

Relative Pose Estimation and Fusion of Omnidirectional and Lidar Cameras

Abstract: This paper presents a novel approach for the extrinsic parameter estimation of omnidirectional cameras with respect to a 3D Lidar coordinate frame. The method works without specific setup and calibration targets, using only a pair of 2D-3D data. Pose estimation is formulated as a 2D-3D nonlinear shape registration task which is solved without point correspondences or complex similarity metrics. It relies on a set of corresponding regions, and pose parameters are obtained by solving a small system of …

Cited by 14 publications (25 citation statements); references 28 publications.
“…Most state-of-the-art approaches handle the 3D-2D registration between a camera and a depth sensor by using special calibration targets [1], [2], [3]. Other semi-automatic methods extract human-selected 3D and 2D shapes from both sensors which are then aligned [12], [13]. The mentioned methods achieve excellent results and can therefore be used for a suitable initial calibration.…”
Section: Related Work
“…The mentioned methods achieve excellent results and can therefore be used for a suitable initial calibration. However, they are either time consuming [12], [13], [2], [3], or require a controlled environment [1].…”
Section: Related Work
“…Line features were introduced by L. Liu et al [4] to the registration of building data, as well. In addition, region features were segmented from 2D-3D urban sensing data and applied to registration by Levente Tamas et al [10]. In general, the extraction and matching of features are crucial to the success of feature-based 2D-3D registration methods.…”
Section: Introduction, A. Background
“…Furthermore, since correspondences are not available, (3.1) cannot be used directly. However, individual point matches can be integrated out, yielding the following integral equation [Tamas, Frohlich, Kato, 2014]:…”
Section: Absolute Pose of Spherical Cameras
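The idea quoted above — integrating out individual point matches so that region integrals, rather than point correspondences, constrain the pose — can be illustrated with a minimal numeric sketch. This is not the paper's implementation: the region, the choice of functions ω, and the brute-force 1-D search over a single rotation angle are all illustrative stand-ins for the authors' nonlinear system over full pose parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 4000
# Sample an asymmetric planar region D (a rectangle off the origin),
# so its low-order moments pin down the rotation uniquely.
D = np.vstack([rng.uniform(0.5, 2.0, N), rng.uniform(-0.5, 0.5, N)])

theta_true = 0.4
R = np.array([[np.cos(theta_true), -np.sin(theta_true)],
              [np.sin(theta_true),  np.cos(theta_true)]])
F = R @ D  # the observed region: D seen under the unknown rotation

def region_integral(points, omega):
    # Monte-Carlo-style average standing in for the region integral of omega.
    return omega(points).sum() / points.shape[1]

# Each function omega contributes one equation of the small system
# "integral over F == integral over rotated D" -- no point pairs needed.
omegas = [lambda p: p[0], lambda p: p[1]]

def residual(theta):
    c, s = np.cos(theta), np.sin(theta)
    Dt = np.array([[c, -s], [s, c]]) @ D
    return sum((region_integral(F, w) - region_integral(Dt, w)) ** 2
               for w in omegas)

# A brute-force 1-D search stands in for a proper nonlinear solver.
grid = np.linspace(-np.pi / 2, np.pi / 2, 2001)
theta_est = grid[np.argmin([residual(t) for t in grid])]
```

Because the integrals are taken over whole corresponding regions, no individual 2D-3D point match is ever required — which is exactly what makes the quoted formulation correspondence-free.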
“…To get an explicit formula for the above surface integrals, the spherical patches D_S and F_S can be naturally parametrized via Φ and Ψ over the planar regions D and F. Without loss of generality, we can assume that the third coordinate of X ∈ F is 0, hence D ⊂ ℝ², F ⊂ ℝ²; and ∀X_S ∈ D_S : X_S = Φ(x), x ∈ D, as well as ∀Z_S ∈ F_S : Z_S = Ψ(X), X ∈ F, yielding the following form of (3.2) [Tamas, Frohlich, Kato, 2014]:…”
Section: Absolute Pose of Spherical Cameras
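The parametrization quoted above is the standard change of variables for surface integrals. Written generically (this is an illustrative form, not the authors' exact equation (3.2), which is truncated in the citation), integrating a function ω over the spherical patch D_S reduces to an integral over the planar region D:

```latex
\int_{D_S} \omega(\mathbf{X}_S)\, \mathrm{d}D_S
  = \int_{D} \omega\bigl(\Phi(\mathbf{x})\bigr)
    \left\lVert \frac{\partial \Phi}{\partial x_1}
      \times \frac{\partial \Phi}{\partial x_2} \right\rVert
    \mathrm{d}\mathbf{x}
```

The norm of the cross product of the partial derivatives is the surface Jacobian of the parametrization Φ; the analogous identity with Ψ over F covers the second patch F_S.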