Human-automation interaction (HAI) takes place in virtually every high-technology domain under a variety of operational conditions. Because operators make HAI decisions such as which mode to use and when to engage, disengage, monitor, or cross-check automation, it is important to understand their perceptions of how system and situational characteristics affect their interaction with automation. The objective of this study was to examine how systematic variations of automation interface, task, and context features influence professional pilots' judgments of HAI situations. Pilots received descriptions of crews interacting with flight deck automation in specific situations and were asked to rate cognitive demands and predict behaviors. Results reflect a complex interplay among automation features, task, and context. Automation features influenced judgments of workload, task management, and potential for automation-related errors; however, the impact of automation on situation awareness appears to be moderated by task features. Unanticipated tasks had broader effects on pilots' judgments than operational stressors. Results suggest that although changes to automated systems may be small in technical terms, their cognitive and behavioral impact on operators may be significant. Performance effects of automation changes in aviation, as well as in other domains, need to be addressed with reference to task characteristics and situational demands.
This report investigates the use of design decision support software by engineering design students, who completed two unfamiliar design tasks using different visualization schemes in a design decision support tool. A mixed-methods analysis compares participants' ability to optimize on multiple dimensions using discrete and continuous design decision support visualization schemes. The analysis considered the qualitative data self-reported by the participants in conjunction with the quantitative performance data collected by the researchers to determine the tool's ability to support design decision making. Results suggest that, despite self-reported familiarity with both types of visualization schemes, participants were much more likely to optimize on multiple different criteria when using discrete visualization tools than when continuous tools were used. These results, which are supported by several different pieces of qualitative data, provide insight into design decision support tools and help inform recommendations for future tools.

I. Introduction

The engineering design process can be described as a problem in decision making that consists of four main steps: generating alternative designs, evaluating the different alternatives, ranking the alternatives based on the utility of each, and selecting the most highly ranked alternative for development. For most engineering product designs, this process is complex and non-trivial. While the design process can be conducted by optimizing on a single function or value, it is common to design large-scale engineering systems through a formal systems engineering process, which takes into account multiple requirements and objectives [1]. This method allows engineers to optimize one or more metrics simultaneously, thereby attempting to achieve a balance of economic, performance, and other metrics.
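The four-step process above (generate, evaluate, rank, select) can be sketched as a toy multi-objective selection. The design variables (wing span and chord), surrogate metrics, and utility weights below are illustrative assumptions for the sketch, not taken from the report:

```python
# Minimal sketch of the four-step design decision process.
# All design variables, metrics, and weights are assumed for illustration.
from itertools import product

def generate_alternatives():
    """Step 1: enumerate candidate designs over a small discrete space."""
    spans = [10.0, 12.0, 14.0]   # wing span, m (assumed)
    chords = [1.0, 1.5]          # chord, m (assumed)
    return [{"span": s, "chord": c} for s, c in product(spans, chords)]

def evaluate(design):
    """Step 2: compute simple surrogate metrics for a design (toy models)."""
    area = design["span"] * design["chord"]
    return {"lift": 2.0 * area, "weight": 5.0 * area}

def utility(metrics, weights):
    """Step 3: scalar utility; reward lift, penalize weight."""
    return weights["lift"] * metrics["lift"] - weights["weight"] * metrics["weight"]

def select_best(weights):
    """Step 4: rank alternatives by utility and pick the top one."""
    scored = [(utility(evaluate(d), weights), d) for d in generate_alternatives()]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return scored[0]

best_score, best_design = select_best({"lift": 3.0, "weight": 1.0})
```

In practice the evaluation step is an engineering analysis rather than a closed-form toy model, but the rank-and-select logic over a weighted utility is the same shape.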
Furthermore, engineers use trade studies to search the design space for the best alternative, which is then presented to stakeholders and other engineers. A trade study is a technique of systematically generating alternative designs, evaluating metrics of interest for each alternative, and deducing how the metrics trend with the design variables and with each other. The final decision process that follows the trade study is aided by anecdotal evidence and historical statistical data [2]. The results of engineering trade studies are commonly depicted visually to designers in the form of familiar engineering graphics such as line, carpet, or contour plots [3, 4]. While traditional trade studies produce graphics showing the effect of varying one or two design parameters on a single metric of interest, advances in computing provide new ways to organize engineering analyses and visualize the results. Whereas past plots, whether drawn by hand or by computer, were static, plots created by computers today can be dynamic and updated in real time, allowing engineers to more fully explore the results and comprehend the design problem.
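The trade-study sweep described above, varying one or two design parameters and tabulating a metric of interest, can be sketched in a few lines. The parameter ranges and the metric function here are hypothetical placeholders; the resulting table is the kind of data a carpet or contour plot would display:

```python
# Toy trade-study sweep: evaluate a metric over a grid of two design
# parameters. The metric function is an assumed placeholder.
def sweep(xs, ys, metric):
    """Tabulate metric(x, y) for every combination of the two parameters."""
    return {(x, y): metric(x, y) for x in xs for y in ys}

# Hypothetical metric: proportional to both parameters.
table = sweep([1, 2, 3], [10, 20], lambda x, y: x * y)
```

Inspecting how the tabulated values trend across each axis is the "deducing trends" step of the trade study; a plotting library would render the same table as a contour or carpet plot.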