2021 IEEE International Conference on Robotics and Automation (ICRA)
DOI: 10.1109/icra48506.2021.9561190

CAROM - Vehicle Localization and Traffic Scene Reconstruction from Monocular Cameras on Road Infrastructures

Cited by 20 publications (15 citation statements)
References 31 publications

“…This result is better than the previously reported results by Lu et al. [21], which were described in Section 2. Their results showed that the average localization error was 1.81 m using a differential GPS as a reference and 1.68 m using a drone as a reference.…”
Section: Validation of Vehicle Localization Error Correction (contrasting)
confidence: 83%

“…In the table, it is clearly seen that the errors corrected by the linear regression model are much smaller than those without any correction, with an average improvement of 71.32%. This result is better than the previously reported results by Lu et al. [21], which were described in Section 2. Their results showed that the average localization error was 1.81 m using a differential GPS as a reference and 1.68 m using a drone as a reference.…”
Section: Vehicle Localization in Dynamic Condition (contrasting)
confidence: 79%
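
The improvement quoted above comes from a linear regression model that corrects raw roadside localization estimates against a reference. The cited paper's exact formulation is not reproduced in this report, so the snippet below is only a minimal sketch of the general idea: it fits an affine correction by ordinary least squares to hypothetical pairs of raw camera-based positions and reference positions (e.g., from differential GPS or a drone). All data values are placeholders.

```python
import numpy as np

# Hypothetical paired data: raw camera-based (x, y) estimates and the
# matched reference positions (e.g., differential GPS or drone), in metres.
raw = np.array([[10.2, 4.1], [22.8, 7.9], [35.5, 12.3],
                [48.1, 16.2], [60.9, 20.8], [73.4, 24.9]])
ref = np.array([[10.0, 4.0], [22.0, 7.5], [34.5, 11.8],
                [47.0, 15.5], [59.5, 20.0], [72.0, 24.0]])

# Fit an affine correction ref ≈ raw @ A + b with ordinary least squares.
X = np.hstack([raw, np.ones((len(raw), 1))])      # add bias column
coef, *_ = np.linalg.lstsq(X, ref, rcond=None)    # (3, 2) coefficient matrix

def correct(points):
    """Apply the fitted linear correction to raw (x, y) estimates."""
    pts = np.atleast_2d(points)
    return np.hstack([pts, np.ones((len(pts), 1))]) @ coef

before = np.linalg.norm(raw - ref, axis=1).mean()
after = np.linalg.norm(correct(raw) - ref, axis=1).mean()
print(f"mean error before: {before:.3f} m, after: {after:.3f} m")
```

In practice the fitted coefficients would be applied to new detections at run time, and a deployed system might fit a separate model per camera or road segment; those choices are assumptions here, not details from the cited work.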

“…Currently, there exist few publications on vehicle detection and tracking using roadside sensors for traffic-monitoring applications. One such method, presented in [21], employs a sensor fusion algorithm that combines data from multiple sensors using an extended Kalman filter, resulting in a lateral error of 0.53 m, a longitudinal error of 1.19 m, and a combined Euclidean error of 1.43 m. Another method, proposed in [36], uses multiple cameras to construct 3D bounding boxes around vehicles to determine the vehicles’ locations. The results showed an average localization error of 1.81 m in Euclidean distance using a differential GPS as a reference and 1.68 m using a drone as a reference.…”
Section: Results (mentioning)
confidence: 99%
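
The extended Kalman filter fusion mentioned for [21] is described above only through its reported errors. As a generic illustration of the technique, and not the cited implementation, the sketch below runs an EKF on a constant-velocity vehicle state with a hypothetical range/bearing measurement from a roadside sensor at a known position; the sensor model, noise values, and trajectory are all assumptions.

```python
import numpy as np

dt = 0.1                                   # time step [s] (assumed)
F = np.array([[1, 0, dt, 0],               # constant-velocity transition
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)
Q = np.eye(4) * 0.05                       # process noise (assumed)
R = np.diag([0.5**2, np.deg2rad(2.0)**2])  # range/bearing noise (assumed)
sensor = np.array([0.0, 10.0])             # roadside sensor position [m]

def h(x):
    """Nonlinear measurement: range and bearing from the sensor to the vehicle."""
    dx, dy = x[0] - sensor[0], x[1] - sensor[1]
    return np.array([np.hypot(dx, dy), np.arctan2(dy, dx)])

def H_jacobian(x):
    """Jacobian of h() with respect to the state [x, y, vx, vy]."""
    dx, dy = x[0] - sensor[0], x[1] - sensor[1]
    r2 = dx**2 + dy**2
    r = np.sqrt(r2)
    return np.array([[ dx / r,  dy / r, 0, 0],
                     [-dy / r2, dx / r2, 0, 0]])

def ekf_step(x, P, z):
    # Predict with the linear motion model.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the nonlinear measurement, linearized at the prediction.
    y = z - h(x)
    y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi    # wrap bearing residual
    H = H_jacobian(x)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P

# Usage: track a vehicle driving along the road with noisy measurements.
x, P = np.array([5.0, 0.0, 10.0, 0.0]), np.eye(4)
rng = np.random.default_rng(0)
for k in range(50):
    truth = np.array([5.0 + 10.0 * dt * (k + 1), 0.0, 10.0, 0.0])
    z = h(truth) + rng.multivariate_normal(np.zeros(2), R)
    x, P = ekf_step(x, P, z)
print("estimated position:", x[:2])
```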

“…Most roadside monocular localization methods [16], [31] rely on manually selecting at least four corresponding points between the image and the map, solving for the homography matrix, and converting pixel coordinates to world coordinates. However, this manual point selection process may introduce localization errors in practical scenarios.…”
Section: Spatial Synchronization (mentioning)
confidence: 99%
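
The pipeline described above (pick at least four image-to-map correspondences, solve for a homography, map pixels to world coordinates) is standard enough to sketch generically. The snippet below uses OpenCV's findHomography with placeholder correspondences; the point values and the example bounding-box pixel are hypothetical, and it is not the cited papers' calibration code.

```python
import numpy as np
import cv2

# At least four manually selected correspondences between image pixels and
# map/world coordinates (values are placeholders for illustration).
pixel_pts = np.array([[412, 630], [1505, 644], [1320, 295], [598, 288]],
                     dtype=np.float64)
world_pts = np.array([[0.0, 0.0], [12.0, 0.0], [12.0, 40.0], [0.0, 40.0]],
                     dtype=np.float64)     # metres on the road plane

# Solve for the 3x3 homography mapping pixel coordinates to the road plane.
H, _ = cv2.findHomography(pixel_pts, world_pts)

def pixel_to_world(u, v):
    """Project a pixel (u, v) onto the road plane via the homography."""
    p = H @ np.array([u, v, 1.0])
    return p[:2] / p[2]                    # normalize homogeneous coordinates

# Example: bottom-center pixel of a detected vehicle's bounding box.
print(pixel_to_world(960, 540))
```

A real deployment would use many more correspondences or automated calibration to reduce the manual-selection error that the citation warns about.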