2020 IEEE International Conference on Edge Computing (EDGE)
DOI: 10.1109/edge50951.2020.00009
A Camera–Radar Fusion Method Based on Edge Computing

Cited by 31 publications (12 citation statements) · References 7 publications
“…In [35], a new method for calculating intersection signal-phase time series from the interval detection data of millimeter-wave radars reduces vehicle congestion at intersections. Systems fusing millimeter-wave radar and camera data have also been proposed to improve the robustness of vehicle-information awareness on urban roads [36,37]. However, these radar-based approaches to congestion relief have so far been evaluated only in simulation or near-realistic simulation.…”
Section: Millimeter-Wave Radar-Based Road Application and Congestion …
Confidence: 99%
“…As research has deepened, some scholars have deployed radar and camera sensing equipment on the roadside to perceive the surrounding environment and to provide more complete perception information to autonomous vehicles via wireless communication. Fu et al. [47] used edge computing to process environmental-perception data locally, effectively reducing latency. Meanwhile, the YOLOv3 [48] and DBSCAN clustering [49] algorithms, often deployed on roadside equipment, preprocess the camera and radar data, respectively, to obtain the position, speed, and category of targets.…”
Section: Cooperative Perception Information Fusion
Confidence: 99%
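The citing paper mentions DBSCAN being used on roadside equipment to group raw radar detections into per-target clusters before fusion. As an illustrative sketch only (not the cited paper's implementation, which would typically use a library such as scikit-learn and operate on range/azimuth/Doppler measurements), a minimal pure-Python DBSCAN over 2-D radar point positions looks like this; `eps` and `min_pts` are assumed tuning parameters:

```python
import math

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: assign each point a cluster id, or -1 for noise.

    points  -- list of (x, y) coordinates, e.g. radar detections in metres
    eps     -- neighbourhood radius: points closer than this are neighbours
    min_pts -- minimum neighbourhood size for a point to be a core point
    """
    labels = [None] * len(points)  # None = unvisited

    def neighbors(i):
        # Brute-force range query; real systems would use a spatial index.
        return [j for j in range(len(points))
                if math.dist(points[i], points[j]) <= eps]

    cluster = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:
            labels[i] = -1  # provisional noise; may become a border point
            continue
        labels[i] = cluster            # i is a core point: start a cluster
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster    # noise reclaimed as border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            if len(neighbors(j)) >= min_pts:
                queue.extend(neighbors(j))  # j is core too: keep expanding
        cluster += 1
    return labels
```

For example, two tight groups of detections plus one stray return yield two clusters and one noise label; the per-cluster centroid and mean Doppler would then give each target's position and speed for fusion with the camera's class labels.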
“…However, the positional ground truth in this study was based on maize stems, so points reflected from leaves also made the computed position deviate from the true position. Besides, although there were no visible positional changes between the camera and the Lidar, small positional variations were inevitable; some studies hold that vibration-induced positional changes between the camera and the Lidar are also a non-negligible source of error [36].…”
Section: Maize Seedling Detection Accuracy
Confidence: 99%