2022
DOI: 10.1109/mits.2021.3093379
On-Road Object Detection and Tracking Based on Radar and Vision Fusion: A Review

Cited by 54 publications (18 citation statements)
References 159 publications
“…However, they do not cover the entire near-field, and there are some blind spots. Radar is additionally limited in that it cannot detect road markings and has limited performance in object classification [26]. Parking space detection using SRR is discussed in more detail in [27].…”
Section: B. Relation to Other Sensors (mentioning, confidence: 99%)
“…Varga et al [30] have attempted to combine fisheye camera and LiDAR to provide a unified 360° environmental model, but there are blind spots in the near field. Classification of objects in LiDAR has extremely limited performance [26]. To summarize, other near-field sensors like radar and sonar capture limited information about the scene, and thus they cannot operate independently to perform near-field perception.…”
Section: B. Relation to Other Sensors (mentioning, confidence: 99%)
“…As shown in Figure 3, the input of the integrated framework is the COM trajectory motion reference, which can be generated from the road terrain based on radar and vision fusion [33]. In this paper, all four wheels are maintained in contact with the ground while traveling on different terrains. To clearly describe the parameters of the attitude control, we define additional superscript notations for the relevant parameters, where the superscript cmd denotes the control command parameters, and the superscripts fb and ff denote the feedback and feedforward parameters, respectively.…”
Section: Attitude Controller Design (mentioning, confidence: 99%)
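The command/feedback/feedforward decomposition described in the excerpt above can be sketched as follows. This is a minimal illustration of a generic feedforward-plus-PD-feedback attitude command, not the controller from the cited paper; all function names, symbols, and gains are illustrative assumptions.

```python
# Hypothetical sketch: an attitude control command (cmd) built from a
# feedforward term (ff) derived from the motion reference, plus a PD
# feedback term (fb) on the measured attitude error. Not the cited
# paper's controller; names and gains are assumptions.
def attitude_command(theta_ref, theta_fb, theta_dot_fb, theta_ddot_ff,
                     kp=8.0, kd=2.0):
    """Return the commanded attitude acceleration (theta_ddot_cmd)."""
    error = theta_ref - theta_fb                # attitude tracking error
    feedback = kp * error - kd * theta_dot_fb   # PD feedback term
    return theta_ddot_ff + feedback             # feedforward + feedback

# On the reference (zero error, zero rate), the command reduces to the
# feedforward term alone.
cmd = attitude_command(theta_ref=0.1, theta_fb=0.1,
                       theta_dot_fb=0.0, theta_ddot_ff=0.5)
```

Separating the feedforward term (known from the reference trajectory) from the feedback term (correcting measured error) is the usual motivation for the cmd/ff/fb superscript notation the excerpt defines.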
“…Besides the aforementioned applications, radars have also been used for vehicle position estimation [10]–[14], and detection of other vehicles aided by Light Detection And Ranging (LiDAR) sensors [15] or cameras [16]. Usually, combining radar scans with other sensors is interesting given that their operating conditions are complementary [17].…”
Section: Introduction (mentioning, confidence: 99%)