2016
DOI: 10.1177/1064804615611274

Trust-Based Analysis of an Air Force Collision Avoidance System

Abstract: This case study analyzes the factors that influence trust and acceptance among users (in this case, test pilots) of the Air Force’s Automatic Ground Collision Avoidance System. Our analyses revealed that test pilots’ trust depended on a number of factors, including the development of a nuisance-free algorithm, designing fly-up evasive maneuvers consistent with a pilot’s preferred behavior, and using training to assess, demonstrate, and verify the system’s reliability. These factors are consistent with…

Cited by 25 publications (24 citation statements)
References 11 publications
“…Importantly, the trust in automation literature has focused on transparency as a key antecedent of trust in automation. For example, Lyons et al. [29] found that trust in an automatic ground collision avoidance system in fighter jets increased when information about the functioning of the system was displayed on the screen. Similarly, Alarcon et al. [16] found that transparency was a key factor in whether programmers would reuse code generated by other programmers, and research has consistently demonstrated transparency as a key factor in perceptions of software [14,20].…”
Section: Comments
mentioning
confidence: 99%
“…One alternative may also be that programmers do not understand how GenProg operates “under the hood.” Specifically, programmers may not understand the process by which GenProg uses extant information and creates patches that pass test cases, and this lack of transparency leads to a lack of trust [29]. Future research should investigate whether transparency is indeed the most important factor driving the differences between trust toward human and automated repair tools in software evaluation contexts.…”
Section: Source
mentioning
confidence: 99%
“…It is interesting to compare the development of autonomous (road) driving with existing technologies and concepts in other areas of transportation: within the airline, rail, and shipping industries, partly automated and autonomous piloting is a long-standing and accepted practice (see, e.g., [67,68]). It is therefore astounding that current expectations regarding autonomous driving in road transportation and trucking are, on the one hand, met with many reservations and, on the other, accompanied by fears of extreme job losses, despite the fact that such job losses have never happened in the airline, railway, or shipping business.…”
Section: Theory Framework
mentioning
confidence: 99%
“…This is caused by the effects of overtrust and overreliance on the automation (Parasuraman & Riley, 1997), where even when the human is monitoring the system, he/she assumes it will not fail. One example of this phenomenon was reported by Lyons et al. (2016) in the field of military aviation, where the researchers found that the more reliable an aircraft’s automated control system is, the further pilots tend to push the aircraft’s safety margins.…”
Section: Ultimately Thomas Apud Prinzel Et Al (2001) Defines As a S…
mentioning
confidence: 86%
“…As already observed in the previous chapter with the OODA loop model, this is an interactive process, deeply related to the operator’s past experience with the system, which reaffirms the dynamic nature of automation bias and complacency (Manzey & Bahner, 2005). Other factors, such as system reliability and interface clarity, also seem to interfere directly with these phenomena, as they bear directly on the operator’s trust (Dixon & Wickens, 2006; Lyons et al., 2016). Some authors, such as Jones (1992), believe that this is a fatalistic process in which every individual exposed to automation will eventually become complacent due to a lack of stimuli for their attention.…”
Section: Ultimately Thomas Apud Prinzel Et Al (2001) Defines As a S…
mentioning
confidence: 99%