Testing and implementing cockpit alerting systems
Year: 2002
DOI: 10.1016/s0951-8320(01)00094-1

Cited by 35 publications (19 citation statements)
References 18 publications
“…Radiologists, for example, are known to be concerned with explaining why each prompt is present [37,41]. Also, in aviation, pilots using TCAS (Traffic Collision Avoidance System) are strictly instructed to regard all automated messages as genuine alerts demanding an immediate, high-priority response [42]. Processing false prompts demands time and cognitive resources, and thus can lead to "Time Pressure" (node 9) and "Cognitive Overload" (node 5: presence of confusion that does not allow the operator to process information properly).…”
Section: A Case Study: Computer Aided Detection (CAD) for Mammography
confidence: 99%
“…Once a fault has been perceived (whether the perception is right or wrong), the remainder of the procedure may be stereotyped as untrustworthy instead of each step being judged for its individual merits. This effect is similar to the effects of automation faults on operator trust and reliance (Lee & Moray, 1994;Muir, 1987) and of false alarms on operator conformance to subsequent alerting system commands (Hasse, 1992;Pritchett, Vándor, & Edwards, 1999).…”
Section: Mitigating Intentional Noncompliance
confidence: 61%
“…A second objective, also related to reducing uncertainty in the cockpit, is to avoid conflicting alarms, a phenomenon also known as dissonance [37], [38]. Unfortunately, this objective derives from lessons learned from a tragic flight accident.…”
Section: Flight Definition
confidence: 99%