2016
DOI: 10.18178/joace.4.6.460-466

Multi-sensor Fusion Module in a Fault Tolerant Perception System for Autonomous Vehicles

Cited by 8 publications (5 citation statements)
References 9 publications
“…Liu et al 2 used an artificial intelligence methodology to carry out fault diagnosis for rotating machinery after broadly reviewing literature relevant to industrial applications. Realpe et al 3 used multi-sensor fusion architecture and the support vector machine algorithm to minimize the influence of sensor faults in an autonomous vehicle. Janssens et al 4 applied a convolutional neural network to the fault detection of rotating machinery using vibration analysis.…”
Section: Introduction
confidence: 99%
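The statement above mentions using an SVM to limit the influence of sensor faults. The following is a minimal illustrative sketch of that general idea, not the authors' implementation: an SVM classifier flags a faulty sensor from simple cross-sensor residual features, and the faulty reading is excluded from fusion. The feature construction, synthetic training data, and fault model are assumptions made for illustration.

```python
# Minimal sketch (not the cited authors' code): SVM-based detection of a faulty
# sensor from residual features, followed by fusion over the healthy sensors.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def residual_features(readings):
    """Per-sensor features: absolute deviation from the median of all sensors."""
    med = np.median(readings, axis=0)
    return np.abs(readings - med)          # shape: (n_sensors, n_dims)

# Synthetic training set: healthy sensors cluster near the median, one faulty
# sensor drifts away (label 1 = faulty, 0 = healthy). Purely illustrative.
X_train, y_train = [], []
for _ in range(500):
    readings = rng.normal(0.0, 0.1, size=(3, 2))        # 3 sensors, 2-D measurement
    fault_idx = rng.integers(0, 3)
    readings[fault_idx] += rng.normal(1.5, 0.3, size=2)  # inject a bias fault
    feats = residual_features(readings)
    for i in range(3):
        X_train.append(feats[i])
        y_train.append(1 if i == fault_idx else 0)

clf = SVC(kernel="rbf").fit(np.array(X_train), np.array(y_train))

# At run time: classify each sensor and fuse only those predicted healthy.
readings = rng.normal(0.0, 0.1, size=(3, 2))
readings[1] += 1.5                                       # simulate a fault on sensor 1
healthy = clf.predict(residual_features(readings)) == 0
fused = readings[healthy].mean(axis=0) if healthy.any() else readings.mean(axis=0)
print("healthy mask:", healthy, "fused estimate:", fused)
```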
“…3 Lane detection faults from commercial-grade vision sensors, as covered in this study, may often occur because of internal or external factors. However, unlike obstacle detection, which can draw on various sensors such as radar and lidar detectors,4 it is difficult to design hardware redundancy for lane detection because the alternative sensors available for it are limited. For this reason, analytical sensor redundancy is more acceptable than hardware sensor redundancy when applied to the lane detection function.…”
Section: Introduction
confidence: 99%
“…In another study, the Kalman filter and discrete wavelet transform were proposed as rain removal algorithms, used with the You Only Look Once (YOLOv3) methodology, for the camera FDI of AVs [54]. Realpe et al [55] proposed a sensor fusion framework that integrates data in a unified configuration, with sensor weights provided in real time by an FDI module using the SVM algorithm, to reduce the impact of sensor faults. Although Google, GM, BMW, and Tesla are developing and testing various AVs, issues such as bugs in traditional software remain challenging to fix compared with DNN-based software [56].…”
Section: PHM of Vision Sensors
confidence: 99%
“…For example, even after applying Velodyne's correction factors and a distance offset calibrated using readings from another reference LiDAR sensor, points with uncertainties on the order of 30 cm were noted [76]. A sensor fusion design that uses a support vector machine (SVM) to integrate data in a federated fusion framework, with sensor weight feedback provided in real time by the fault detection and diagnosis module, was used to reduce the impact of sensor faults [55]. Duran et al [14] listed various kinds of faults in the LiDAR system, along with the severity of fault occurrence in each particular component of the LiDAR system.…”
Section: Fault Category
confidence: 99%
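The statement above describes a federated fusion framework in which per-sensor weights are fed back in real time by a fault detection and diagnosis module. The sketch below illustrates only that weighted-fusion idea; the exponential weight law and the example fault scores are assumptions for illustration, not the method of [55].

```python
# Illustrative sketch of weighted federated fusion with weight feedback from a
# fault detection module: suspect sensors contribute less to the fused estimate.
import numpy as np

def fuse(local_estimates, fault_scores, alpha=5.0):
    """Weighted fusion of per-sensor estimates.

    local_estimates: (n_sensors, n_dims) array of local state estimates.
    fault_scores:    (n_sensors,) values in [0, 1] from the FDI module,
                     0 = healthy, 1 = certainly faulty.
    """
    weights = np.exp(-alpha * np.asarray(fault_scores))  # shrink faulty sensors
    weights /= weights.sum()
    return weights @ np.asarray(local_estimates), weights

# Example: camera, radar, and lidar tracks of the same object; the camera is
# degraded (e.g. heavy rain), so the FDI module assigns it a high fault score.
estimates = np.array([[12.4, 3.1],   # camera
                      [10.1, 2.9],   # radar
                      [10.0, 3.0]])  # lidar
scores = np.array([0.9, 0.05, 0.05])
fused, w = fuse(estimates, scores)
print("fused position:", fused, "weights:", np.round(w, 3))
```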