2012 IEEE International Conference on Robotics and Automation
DOI: 10.1109/icra.2012.6224607
Lost in translation (and rotation): Rapid extrinsic calibration for 2D and 3D LIDARs

Abstract: This paper describes a novel method for determining the extrinsic calibration parameters between 2D and 3D LIDAR sensors with respect to a vehicle base frame. To recover the calibration parameters we attempt to optimize the quality of a 3D point cloud produced by the vehicle as it traverses an unknown, unmodified environment. The point cloud quality metric is derived from Rényi Quadratic Entropy and quantifies the compactness of the point distribution using only a single tuning parameter. We also present a fas…
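The abstract's crispness metric can be sketched in a few lines: the Rényi Quadratic Entropy of a point set under a Gaussian kernel is low when points are tightly clustered and high when they are spread out, so minimizing it over candidate calibrations rewards a "crisp" map. This is a minimal illustrative sketch, not the paper's implementation; the function name, the default bandwidth, and the exact kernel normalization are assumptions.

```python
import numpy as np

def renyi_quadratic_entropy(points, sigma=0.05):
    """Rényi Quadratic Entropy of a point set under a Gaussian kernel.

    Lower entropy indicates a more compact ("crisper") point cloud.
    `sigma` plays the role of the single kernel-bandwidth tuning
    parameter mentioned in the abstract (units of the point coordinates).
    Illustrative sketch only; bandwidth and normalization are assumptions.
    """
    n, d = points.shape
    # Pairwise squared distances between all point pairs.
    diff = points[:, None, :] - points[None, :, :]
    sq_dist = np.sum(diff ** 2, axis=-1)
    # Gaussian kernel with variance 2*sigma^2, as arises when convolving
    # two Gaussian kernels of variance sigma^2 in the RQE derivation.
    var = 2.0 * sigma ** 2
    norm = (2.0 * np.pi * var) ** (-d / 2.0)
    kernel = norm * np.exp(-sq_dist / (2.0 * var))
    # H2 = -log of the mean pairwise kernel value.
    return -np.log(np.sum(kernel) / n ** 2)
```

A tightly clustered cloud yields a lower entropy than the same number of points scattered widely, which is what makes the quantity usable as an optimization objective for calibration.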

Cited by 77 publications (58 citation statements)
References 26 publications
“…Similarly, Sheehan et al [18] demonstrate unsupervised calibration of a custom Velodyne-like sensor, making use of the known spinning motion of the sensor base to automatically calibrate beam angles, relative laser positions, and timing offsets. In a similar vein, there are several unsupervised methods that demonstrate extrinsic calibration between multiple depth sensors by making use of sensor motion [2,17]. In the realm of RGB-only cameras, Carrera et al [3] show extrinsic calibration between several cameras on a moving platform, and Civera et al [4] compute camera intrinsics using structure from motion.…”
Section: Related Work
confidence: 99%
“…By this approach, an accuracy close to the measurement accuracy of the utilized laser sensor (0.05 mm) is achieved. Instead of improving the accuracy of the calibration, the matching of single scans can also be corrected by using the Iterative Closest Point (ICP) algorithm as shown by Borangiu et al [12] or the point cloud entropy as shown by Maddern et al [14].…”
Section: B. Laser Sensor Calibration
confidence: 99%
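The excerpt above contrasts entropy-based correction with the Iterative Closest Point (ICP) algorithm for matching single scans. One point-to-point ICP iteration can be sketched as follows: associate each source point with its nearest destination point, then solve the best rigid transform in closed form via SVD (the Kabsch solution). This is a generic textbook sketch, not the implementation of Borangiu et al; the brute-force matching and function name are illustrative assumptions.

```python
import numpy as np

def icp_step(src, dst):
    """One point-to-point ICP iteration (illustrative sketch).

    Associates each source point with its nearest destination point,
    then computes the closed-form rigid transform (R, t) minimizing the
    sum of squared distances between matched pairs.
    """
    # Nearest-neighbour association (brute force, for clarity only).
    d = np.linalg.norm(src[:, None, :] - dst[None, :, :], axis=-1)
    matched = dst[np.argmin(d, axis=1)]
    # Closed-form rigid alignment of src onto matched points (Kabsch/SVD).
    mu_s, mu_m = src.mean(axis=0), matched.mean(axis=0)
    H = (src - mu_s).T @ (matched - mu_m)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the recovered rotation.
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    t = mu_m - R @ mu_s
    return R, t
```

In practice this step is repeated, re-transforming `src` each time, until the alignment error stops decreasing; production implementations replace the brute-force association with a k-d tree.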
“…This can be done in numerous ways, and can use more than one type of sensor. Some common extrinsic calibration procedures use a LiDAR-Camera procedure as outlined in [7][8][9][10], and multiple LiDAR sensors or multiple sensor views as illustrated by [11][12][13][14][15][16], of a fixed target structure for a faster extrinsic calibration prior to operations [17][18][19][20]. These scenarios require the sensors raw data to be correlated into one coherent picture.…”
Section: Related Work
confidence: 99%