2022
DOI: 10.1109/tits.2020.3044813
Introspective Failure Prediction for Autonomous Driving Using Late Fusion of State and Camera Information

Cited by 12 publications (5 citation statements)
References 34 publications
“…This method identifies multiple objects by giving the whole image to a CNN at once, without raster-scanning it. YOLO (You Only Look Once) is a representative method [9], in which an object rectangle and an object class are predicted for each local region of a 7 × 7 grid. First, feature maps are formed through convolution and pooling of the raw image.…”
Section: B. Application of CNN to the Object Detection Task
confidence: 99%
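The grid assignment described in the statement above can be sketched as follows. This is a hypothetical toy illustration, not code from the cited paper: in YOLO [9], the cell of the 7 × 7 grid containing an object's center is responsible for predicting that object's box and class.

```python
# Toy sketch of YOLO-style grid assignment (hypothetical illustration):
# the grid cell containing an object's center is responsible for predicting it.
S = 7        # grid size used by YOLO [9]
IMG = 448    # YOLO's canonical input resolution

def responsible_cell(cx, cy, img_size=IMG, s=S):
    """Map an object center (in pixels) to the (row, col) grid cell that predicts it."""
    col = int(cx / img_size * s)
    row = int(cy / img_size * s)
    # Clamp so a center on the far edge still maps to a valid cell.
    return min(row, s - 1), min(col, s - 1)

# An object centered at x=224, y=64 falls in grid cell (row 1, col 3).
print(responsible_cell(224, 64))  # → (1, 3)
```

The full detector then predicts bounding-box offsets and class scores per cell on top of the convolutional feature maps mentioned in the statement.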
“…Instead of restricting the system, our work aims to refine it so that it can handle as many scenarios as possible. In works such as [21], [22], or [23], future disengagements of an autonomous system were predicted by monitoring the car's input and output. In this work, we show that external factors can also be used to predict system performance with a straightforward decision tree classifier.…”
Section: Related Work
confidence: 99%
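A minimal sketch of the decision-tree approach mentioned above, with invented feature names and data (the actual external factors and labels used by the citing work are not given here):

```python
from sklearn.tree import DecisionTreeClassifier

# Hypothetical sketch: predict system performance (1 = nominal, 0 = likely
# disengagement) from external factors. Feature columns are invented for
# illustration: [rain_intensity, sun_glare, road_curvature].
X = [[0.0, 0.1, 0.2],
     [0.9, 0.0, 0.8],
     [0.1, 0.9, 0.1],
     [0.8, 0.2, 0.9]]
y = [1, 0, 0, 0]

clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
# Benign external conditions are classified as nominal operation.
print(clf.predict([[0.0, 0.05, 0.3]]))
```

A shallow tree like this is easy to inspect, which is one reason simple classifiers are attractive for performance-prediction monitors.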
“…Sensor fusion is the tight integration and processing of data from multiple sensors to obtain a more comprehensive and accurate understanding of the environment; it is a crucial aspect of autonomous driving technology. A variety of sensors, including light detection and ranging (LiDAR), millimeter-wave radar (Radar), and cameras, are extensively used to capture diverse information about the vehicle’s surroundings. However, individual sensors are easily subject to interference such as changing weather conditions, electromagnetic disturbances, and laser obstruction, which significantly affects the measurement accuracy and reliability of the entire system. Late fusion in autonomous driving processes data from the various sensors independently and merges their outputs at a later stage. It is particularly useful in addressing the aforementioned challenges and is expected to revolutionize future autonomous driving technology.…”
Section: Introduction
confidence: 99%
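The late-fusion scheme described above can be sketched in a few lines. This is a hypothetical illustration, not the paper's implementation: each sensor branch runs its own model to completion, and only the per-branch outputs are merged at the end.

```python
# Minimal late-fusion sketch (hypothetical): each sensor branch produces its
# own failure probability independently; fusion happens only on the outputs.
def camera_branch(frame):
    """Stand-in for a camera-based failure predictor (dummy probability)."""
    return 0.2

def state_branch(state):
    """Stand-in for a vehicle-state-based failure predictor (dummy probability)."""
    return 0.6

def late_fuse(frame, state, w_cam=0.5, w_state=0.5):
    """Weighted average of per-branch outputs; the branches never share features."""
    return w_cam * camera_branch(frame) + w_state * state_branch(state)

print(late_fuse(None, None))  # → 0.4
```

Because the branches are independent, one modality can be retrained, replaced, or dropped (e.g., when a sensor is occluded) without touching the other, which is what makes late fusion robust to the per-sensor interference mentioned above.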