2022
DOI: 10.1109/tits.2021.3109555

Safety Implications of Variability in Autonomous Driving Assist Alerting

Abstract: Advanced Driving Assist Systems (ADAS) are on the rise in new cars, including versions that embed artificial intelligence in computer vision systems that leverage deep learning algorithms. Because these systems, at the present time, cannot operate in all operational driving domains, they employ some type of driver monitoring system for assessing driver attention, so that drivers can effectively take control if and when an ADAS system can no longer control the car. To determine the reliability of a driver alert…
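The abstract describes driver-monitoring systems that assess attention and alert the driver when manual takeover may be needed. As a rough illustration of that kind of escalation logic, the Python sketch below maps a monitored driver state to an alert level; the class names and time thresholds are hypothetical assumptions for illustration, not values or methods from the paper.

```python
# Hypothetical sketch (not from the paper): a minimal driver-monitoring alert
# policy that escalates warnings as the driver's inattention persists.

from dataclasses import dataclass
from enum import Enum


class AlertLevel(Enum):
    NONE = 0
    VISUAL = 1       # icon on the dashboard
    AUDITORY = 2     # chime or spoken warning
    TAKEOVER = 3     # request immediate manual control


@dataclass
class DriverState:
    eyes_on_road: bool
    hands_on_wheel: bool
    seconds_inattentive: float  # accumulated time of inattention


def alert_level(state: DriverState,
                visual_threshold_s: float = 2.0,
                auditory_threshold_s: float = 5.0,
                takeover_threshold_s: float = 8.0) -> AlertLevel:
    """Map a monitored driver state to an alert level.

    Thresholds are illustrative placeholders, not values reported in the paper.
    """
    if state.eyes_on_road and state.hands_on_wheel:
        return AlertLevel.NONE
    if state.seconds_inattentive >= takeover_threshold_s:
        return AlertLevel.TAKEOVER
    if state.seconds_inattentive >= auditory_threshold_s:
        return AlertLevel.AUDITORY
    if state.seconds_inattentive >= visual_threshold_s:
        return AlertLevel.VISUAL
    return AlertLevel.NONE
```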

Cited by 17 publications (10 citation statements)
References 15 publications
“…A key component to understanding driver readiness is hand activity, as a distracted driver often has their hands off the wheel or on other devices like a phone. Rangesh et al. (Rangesh et al., 2021) and Deo & Trivedi (Deo & Trivedi, 2019) show that driver hand activity is the most important component of models for prediction of driver readiness and takeover time, two metrics critical to safe control transitions in autonomous vehicles (Greer et al., 2023; Cummings & Bauchwitz, 2021). Such driver-monitoring models take hand activity classes and held-object classes as input, among other components, as illustrated in Figure 1.…”
Section: Safety and Advanced Driver Assistance Systems
confidence: 99%
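The statement above points out that driver-monitoring models consume hand-activity classes and held-object classes to predict readiness and takeover time. The sketch below shows one plausible way such categorical inputs could be folded into a scalar readiness score; the labels, weights, and penalties are illustrative assumptions, not the models of Rangesh et al. or Deo & Trivedi.

```python
# Hypothetical sketch (not the cited models): combine a hand-activity class and a
# held-object class into a single driver-readiness score in [0, 1].

HAND_ACTIVITY_WEIGHTS = {
    "both_hands_on_wheel": 1.0,
    "one_hand_on_wheel": 0.7,
    "hands_off_wheel": 0.2,
}

HELD_OBJECT_PENALTIES = {
    "none": 0.0,
    "phone": 0.5,
    "food_or_drink": 0.3,
}


def readiness_score(hand_activity: str, held_object: str) -> float:
    """Return a readiness score in [0, 1]; higher means more ready to take over."""
    base = HAND_ACTIVITY_WEIGHTS.get(hand_activity, 0.5)
    penalty = HELD_OBJECT_PENALTIES.get(held_object, 0.2)
    return max(0.0, min(1.0, base - penalty))


if __name__ == "__main__":
    print(readiness_score("hands_off_wheel", "phone"))       # low readiness
    print(readiness_score("both_hands_on_wheel", "none"))    # high readiness
```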
“…However, the deployment of CNNs presents serious safety challenges, as there is a significant lack of human interpretability or intuition for the functioning of these Neural Networks (NNs) [15], even in the case of visual imagery. This is compounded by the variance in perception suite performance, where it simultaneously out- and under-performs human capability, leading to a cognition gap [16,17] that makes it difficult for drivers/operators to predict the vehicle's (mis-)behaviour with respect to unseen or unforeseen scenarios and edge-cases [18]. Where CNNs are used for object detection and classification, e.g.…”
Section: Capabilities and Limitations of Camera-based Perception Systems
confidence: 99%
“…ADAS systems rely on computer vision-based lane tracking systems to maintain lateral control [3], and may combine computer vision with other sensing modalities to maintain space from other vehicles and detect obstacles [4][5][6]. However, recent research demonstrates that ADAS systems perform inconsistently in lateral and longitudinal control, including obstacle detection, warning and mitigations [7][8][9].…”
Section: Introduction
confidence: 99%
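The statement above notes that ADAS rely on vision-based lane tracking to maintain lateral control. As a minimal sketch of what that control loop can look like, the snippet below computes a bounded steering correction from the lane-center offset and heading error reported by a lane tracker; the gains and limits are illustrative placeholders, not parameters from the cited systems.

```python
# Hypothetical sketch (not from the cited systems): a simple lateral controller
# that steers back toward the lane center using the offset and heading error
# estimated by a vision-based lane tracker. Gains are illustrative placeholders.


def steering_command(lateral_offset_m: float,
                     heading_error_rad: float,
                     kp: float = 0.4,
                     kd: float = 1.2,
                     max_steer_rad: float = 0.5) -> float:
    """Return a steering angle (radians) that reduces the lane-center offset.

    lateral_offset_m:  signed distance from lane center (positive = right of center)
    heading_error_rad: signed angle between vehicle heading and lane direction
    """
    raw = -(kp * lateral_offset_m + kd * heading_error_rad)
    return max(-max_steer_rad, min(max_steer_rad, raw))


if __name__ == "__main__":
    # Vehicle drifting 0.3 m right of center while angled 0.05 rad to the right:
    print(steering_command(0.3, 0.05))  # negative value steers back to the left
```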