We studied the transparency of automated tools used during emergency operations in commercial aviation. Transparency, operationalized as increasing levels of explanation accompanying an automated tool's recommendation, was manipulated to evaluate how transparent interfaces influence pilot trust in an emergency landing planning aid. We conducted a low-fidelity study in which commercial pilots interacted with simulated recommendations from NASA's Emergency Landing Planner (ELP) that varied in their level of transparency. Results indicated that trust in the ELP was influenced by the level of transparency in its human–machine interface. Design recommendations for automated systems are discussed.
Objectives: To investigate existing knowledge in the literature about end-of-life decision making by family caregivers of persons with dementia, focusing on decision aids for caregivers of persons with advanced dementia, and to identify gaps in the literature that can guide future research. Methods: A literature review through systematic searches in PubMed, CINAHL Plus with Full Text, and PsycINFO was conducted in February 2018; publications with full text in English and published in the past 10 years were selected in multiple steps. Results: The final sample included five decision aids with predominantly Caucasian participants; three of them had control groups, and three used audiovisual technology in presenting the intervention materials. No other technology was used in any intervention. Existing interventions lacked tailoring of information to caregivers' preferences for the different types and amounts of information necessary to make decisions consistent with patients' values. Conclusion: Research is needed to explore the use of technology in decision aids that could provide tailored information to facilitate caregivers' decision making. More diverse samples are needed.
This case study analyzes the factors that influence trust and acceptance among users (in this case, test pilots) of the Air Force's Automatic Ground Collision Avoidance System. Our analyses revealed that test pilots' trust depended on a number of factors, including the development of a nuisance-free algorithm, the design of fly-up evasive maneuvers consistent with a pilot's preferred behavior, and the use of training to assess, demonstrate, and verify the system's reliability. These factors are consistent with the literature on trust in automation and could inform best practices for automation design, testing, and acceptance.