2023
DOI: 10.3390/electronics12030537
Time Synchronization and Space Registration of Roadside LiDAR and Camera

Abstract: The sensing system consisting of Light Detection and Ranging (LiDAR) and a camera provides complementary information about the surrounding environment. To take full advantage of multi-source data provided by different sensors, an accurate fusion of multi-source sensor information is needed. Time synchronization and space registration are the key technologies that affect the fusion accuracy of multi-source sensors. Due to the difference in data acquisition frequency and deviation in startup time between LiDAR a…

Cited by 5 publications (4 citation statements)
References 39 publications
“…The image that exactly represents the same scene point should be acquired within 33 milliseconds of the obtained 2D LiDAR scan. As mentioned in [24], the present work assumed that the time taken to capture one LiDAR scan is T_L and that to capture one camera image is T_C. For m LiDAR scans, the captured image frames are n = m·T_L/T_C…”
Section: Software Time Synchronization
confidence: 99%
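The matching rule described above can be sketched as a nearest-timestamp pairing with a 33 ms tolerance. This is an illustrative sketch, not the paper's implementation; the 10 Hz LiDAR and 30 Hz camera rates below are assumed values chosen so that n = m·T_L/T_C holds exactly.

```python
# Hypothetical sketch of software time synchronization: pair each LiDAR
# scan with the nearest camera frame, accepting matches within 33 ms.
# Sensor rates below are illustrative assumptions, not from the paper.

def pair_scans_to_frames(lidar_ts, camera_ts, tol=0.033):
    """Return (lidar_t, camera_t) pairs whose time gap is within `tol` seconds."""
    pairs = []
    for lt in lidar_ts:
        ct = min(camera_ts, key=lambda c: abs(c - lt))  # nearest camera frame
        if abs(ct - lt) <= tol:
            pairs.append((lt, ct))
    return pairs

# Example: 10 Hz LiDAR (T_L = 0.1 s) vs 30 Hz camera (T_C = 1/30 s) over one
# second, so m = 10 scans fall among n = m * T_L / T_C = 30 camera frames.
lidar_ts = [i / 10 for i in range(10)]
camera_ts = [i / 30 for i in range(30)]
pairs = pair_scans_to_frames(lidar_ts, camera_ts)
```

Every LiDAR scan finds a frame within tolerance here because the camera rate is an exact multiple of the LiDAR rate; with drifting clocks, some scans would be left unpaired.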
“…For instance, in the method proposed in [5], a set of sensors, including one LiDAR and two cameras, undergoes internal calibration followed by an external calibration with another set of sensors, leading to significant accumulated errors. Similarly, the approaches in [3], [6] employ Zhang's method [7] for camera calibration. Du et al. [4] initially compute the homography matrix from four corresponding points for camera calibration, followed by a spatio-temporal synchronization optimization model.…”
Section: Multi-sensor Spatial Synchronization
confidence: 99%
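The four-point homography estimation mentioned above can be sketched with the standard direct linear transform (DLT). This is a generic sketch of the technique, not the specific model of Du et al. [4]; the point correspondences below are made up for illustration.

```python
import numpy as np

def homography_from_points(src, dst):
    """Estimate 3x3 H with dst ~ H @ src from >= 4 correspondences via DLT."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two rows of the DLT system A h = 0.
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the right singular vector for the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize so H[2, 2] == 1

# Example: a pure translation by (5, -2) should be recovered exactly.
src = [(0, 0), (1, 0), (0, 1), (1, 1)]
dst = [(x + 5, y - 2) for x, y in src]
H = homography_from_points(src, dst)
```

Four non-collinear correspondences determine the homography exactly; with more points the same SVD solve gives a least-squares estimate, which is why roadside methods often refine an initial four-point solution.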
“…Currently, there is a substantial research gap in the synchronization of multiple views and sensors in large-scale roadside deployments. Most existing studies focus on experimental setups, as exemplified by Zheng et al.'s work [5], which primarily explores controlled scenarios [6], or on single-view scenarios, as discussed by Du et al. [4]. Roadside spatial synchronization typically involves the CST, as shown in Fig.…”
Section: Introduction
confidence: 99%