2019
DOI: 10.1177/0954407019859821
Low-observable targets detection for autonomous vehicles based on dual-modal sensor fusion with deep learning approach

Abstract: Environment perception is a basic and necessary technology for autonomous vehicles to ensure safe and reliable driving. Many studies have focused on the ideal environment, while much less work has been done on the perception of low-observable targets, whose features may not be obvious in a complex environment. However, it is inevitable for autonomous vehicles to drive in environmental conditions such as rain, snow and night-time, during which the features of the targets are not obvious and detection m…
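The abstract describes fusing a color (RGB) stream with an infrared stream in a deep network to detect low-observable targets. As a rough illustration of that idea only, below is a minimal feature-level fusion sketch in PyTorch; the layer sizes, the concatenation-based fusion, and the class count are assumptions made for illustration, not the paper's actual architecture.

# Minimal two-stream (RGB + infrared) fusion sketch. All layer shapes and
# the concatenation fusion are illustrative assumptions.
import torch
import torch.nn as nn

class DualModalFusion(nn.Module):
    def __init__(self, num_classes=2):
        super().__init__()
        # Independent convolutional streams for each modality.
        self.rgb_stream = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.ir_stream = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Feature-level fusion: concatenate channels, then classify.
        self.head = nn.Sequential(
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, num_classes),
        )

    def forward(self, rgb, ir):
        fused = torch.cat([self.rgb_stream(rgb), self.ir_stream(ir)], dim=1)
        return self.head(fused)

# Usage with dummy tensors (batch of 2, 128x128 frames).
model = DualModalFusion()
logits = model(torch.randn(2, 3, 128, 128), torch.randn(2, 1, 128, 128))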

Citations: cited by 11 publications (8 citation statements). References: 30 publications.
“…Localization (9): Global navigation satellite systems [18], [53]; Simultaneous localization and mapping [52], [54], [55], [56]; A priori map-based localization [57], [58], [50]. Perception (16): Object detection and classification [60], [47], [48], [63]; Road and obstacle detection [64], [65], [66], [68], [69], [70], [71], [72], [73], [47], [55], [49]. Control & task execution (2): [74], [75]…”
Section: Results (mentioning, confidence: 99%)
“…The result was a reliable, generalized obstacle detection and object classification solution that removes the need to annotate new training data to overfit a particular environment. A solution specifically designed for bad weather and poor lighting conditions is proposed in [63]. The system fuses color camera images with infrared camera images to form a dual-modal optical sensor and attain better detection robustness for low-observable targets (LOT).…”
Section: Object Detection and Classification (mentioning, confidence: 99%)
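As a complementary illustration of handling color and infrared camera data together, the sketch below shows simple early (pixel-level) fusion, stacking an infrared frame onto the color frame as a fourth channel. The resize-based alignment and the use of OpenCV/NumPy are assumptions for illustration; the cited system's actual cross-modal registration is not described here.

# Early (pixel-level) fusion sketch: resize the infrared frame to the colour
# frame and stack them into one 4-channel array a single detector can consume.
# The resize-based alignment is a simplifying assumption; a real system needs
# calibrated cross-modal registration.
import cv2
import numpy as np

def fuse_early(rgb, ir):
    """rgb: H x W x 3 uint8, ir: H' x W' uint8 -> H x W x 4 float32 in [0, 1]."""
    ir = cv2.resize(ir, (rgb.shape[1], rgb.shape[0]))
    return np.dstack([rgb, ir]).astype(np.float32) / 255.0

# Usage with synthetic frames standing in for real camera data.
fused = fuse_early(
    np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8),
    np.random.randint(0, 256, (240, 320), dtype=np.uint8),
)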
“…This model produces more accurate predictions through joint learning, because joint learning generates more data for safely operating vehicles than learning environmental data and the driving policy independently. In [41], the authors propose a dual-modal DNN to build a detection model that remains accurate in severe environmental conditions such as rain, snow, and night-time, where features can be blurry. The network fuses color and infrared images and achieves improved performance on low-observable targets.…”
Section: Deep Learning for Sensor Fusion (mentioning, confidence: 99%)
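To make the joint-learning point in the statement above concrete, here is a toy sketch of a shared encoder trained with a weighted sum of a perception loss and a driving-policy loss. The head shapes, the 0.5 weight, and the dummy targets are illustrative assumptions, not the cited model's actual setup.

# Joint-learning sketch: one shared encoder, two task heads, one combined loss.
import torch
import torch.nn as nn
import torch.nn.functional as F

encoder = nn.Sequential(nn.Flatten(), nn.Linear(4 * 64 * 64, 128), nn.ReLU())
detect_head = nn.Linear(128, 2)   # e.g. target present / absent
policy_head = nn.Linear(128, 3)   # e.g. steer, throttle, brake

x = torch.randn(8, 4, 64, 64)     # fused dual-modal input batch
feat = encoder(x)
joint_loss = (
    F.cross_entropy(detect_head(feat), torch.randint(0, 2, (8,)))
    + 0.5 * F.mse_loss(policy_head(feat), torch.zeros(8, 3))
)
joint_loss.backward()  # gradients from both tasks flow into the shared encoder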
“…In the context of intelligent vehicles, environmental perception forms the basis for decision-making and motion planning, with vehicles being the most prominent participants in traffic and the fundamental objects of environmental perception. 1 With the development of sensor and computer technology, vehicle detection performance has improved significantly. 2,3 In good conditions, the speed and accuracy of vehicle detection can readily meet the requirements of intelligent vehicles.…”
Section: Introduction (mentioning, confidence: 99%)