2020
DOI: 10.1007/s12369-020-00694-1
Real-Time Estimation of Drivers’ Trust in Automated Driving Systems

Abstract: Trust miscalibration issues, represented by undertrust and overtrust, hinder the interaction between drivers and self-driving vehicles. A modern challenge for automotive engineers is to avoid these trust miscalibration issues through the development of techniques for measuring drivers’ trust in the automated driving system during real-time applications execution. One possible approach for measuring trust is through modeling its dynamics and subsequently applying classical state estimation methods. This paper p…
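The abstract's idea of modeling trust dynamics and then applying classical state estimation can be illustrated with a scalar Kalman filter. The sketch below is an assumption-laden illustration, not the paper's fitted model: the state-transition constants, noise variances, and the measurement signal are all hypothetical.

```python
import numpy as np

def kalman_trust_step(t_est, p_est, z, a=0.95, b=0.05, q=0.01, r=0.1, u=1.0):
    """One predict/update cycle of a scalar Kalman filter for a latent
    trust state t.  All constants here are illustrative placeholders.

    Assumed dynamics:
        t_{k+1} = a*t_k + b*u_k + w_k   (u_k: input, e.g. automation performance)
        z_k     = t_k + v_k             (z_k: noisy behavioral measurement)
    """
    # Predict step: propagate the trust estimate and its variance
    t_pred = a * t_est + b * u
    p_pred = a * p_est * a + q
    # Update step: correct with the noisy measurement z
    k_gain = p_pred / (p_pred + r)
    t_new = t_pred + k_gain * (z - t_pred)
    p_new = (1.0 - k_gain) * p_pred
    return t_new, p_new

# Example: filter a noisy stream of trust-related measurements
rng = np.random.default_rng(0)
t_est, p_est = 0.5, 1.0
for z in 0.8 + 0.1 * rng.standard_normal(20):
    t_est, p_est = kalman_trust_step(t_est, p_est, z)
```

Each cycle blends the model's prediction with the new measurement, weighted by the Kalman gain; the estimate variance `p_est` shrinks as evidence accumulates, which is what makes the approach suitable for real-time use.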

Cited by 52 publications (24 citation statements) · References 49 publications
“…Other studies asked respondents to rate their level of trust and safety or changes in these using items such as: “To what extent do you trust the driving automation according to the previous performance of the system ? ” [ 81 ], “Ranked the buttons on safety perception scale” , [ 82 , p. 351], and “Please indicate the degree that your trust has changed after this encounter” [ 83 ]. These items were not tailored to the specific nature of partial automation requiring permanent supervision by human drivers.…”
Section: Discussion
confidence: 99%
“…AVs have specific characteristics that make the study of trust in driver-AV interaction challenging. For instance, people have become comfortable driving "manually" for decades, and sometimes will refuse to use self-driving capabilities-such as cruise control, automatic lane keeping or lane departure warning-because they do not understand how those capabilities work, or what are their advantages and limitations [2,16]. Additionally, drivers usually share control with AVs, establishing a specific type of team collaboration.…”
Section: Driver-AV Interaction Particularities
confidence: 99%
“…Experiments with a simulated SAE level 3 automated driving system (ADS) provided the data used for model fitting. Eventually, using the estimation method, the AV was able to assess how much trust the drivers had in the AV's capabilities, based on how the drivers appeared to be splitting their attention between the driving task and the NDRT [2].…”
Section: My Prior Work
confidence: 99%
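The statement above describes inferring trust from how drivers split attention between the driving task and a non-driving-related task (NDRT). A minimal sketch of that idea, assuming a simple glance-fraction proxy over a sliding window (the window size, class name, and linear mapping are hypothetical, not the cited model):

```python
from collections import deque

class AttentionTrustProxy:
    """Hypothetical sketch: derive a trust proxy from how a driver splits
    visual attention between the road and an NDRT, following the intuition
    that higher trust in the ADS correlates with more attention allocated
    to the NDRT.  The window size and mapping are illustrative assumptions."""

    def __init__(self, window=30):
        # Rolling record of recent glance targets (True = glance at NDRT)
        self.samples = deque(maxlen=window)

    def observe(self, glance_on_ndrt: bool) -> float:
        self.samples.append(glance_on_ndrt)
        # Trust proxy = fraction of recent glances directed at the NDRT
        return sum(self.samples) / len(self.samples)

proxy = AttentionTrustProxy(window=5)
for glance in [False, False, True, True, True]:
    level = proxy.observe(glance)
# level == 0.6: three of the last five glances were on the NDRT
```

In practice such a behavioral signal would serve as the noisy measurement fed into a state estimator rather than being read as trust directly.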
“…Recognizing human emotions, their experience, and their levels of comfort and stress while using AI devices or services can help to calibrate the functions to be performed and the decisions to be made by AI accordingly. For example, if autonomous vehicles consistently make automated choices that follow passengers' expectations, they could create a higher level of trust, which is fundamental to promoting their acceptance [3], while failing to do so would instead lead to mistrust and stress.…”
Section: Introduction
confidence: 99%