Augmented reality (AR) is rapidly being adopted by industry leaders and militaries around the globe. With the Defense Health Agency promoting AR as a solution to the distributed learning problem, and with AR applications being explored in primary care and operational medical settings, it is crucial that these immersive platforms be designed and used according to a standardized, scientifically based paradigm. One area of particular concern is the potential for physiological maladaptation following prolonged AR exposure, which is expected to differ from that associated with virtual reality exposure. Such maladaptation is potentially driven by the perceptual limitations characteristic of AR head-worn displays (e.g., mismatches between visually displayed information and other senses, restricted field of view, mismatched interpupillary distance). These perceptual limitations can reduce training effectiveness or pose safety risks to patients and/or trainees. Thus, while AR technology has the potential to advance simulation training, AR-based research—particularly that involving long-exposure-duration scenarios—should be approached from a bottom-up perspective in which its physiological impact is more fully understood. To assist this process, this study presents a comparison of cybersickness between two common forms of AR displays. Specifically, by comparing the Microsoft HoloLens, a head-worn display that has seen rapid adoption by the scientific community, with an AR tablet–based platform in the context of long-duration AR training exposure, it will be possible to determine what differences, if any, exist between the two display platforms in terms of their physiological impact, as measured via cybersickness severity and symptom profile.
Results from this psychometric assessment will be used to evaluate the physiological impact of AR exposure and develop usage protocols to ensure AR is safe and effective to use for military medical training.
While virtual, augmented, and mixed reality technologies are being used for military medical training and beyond, these component technologies are often used in isolation. eXtended Reality (XR) combines these immersive form factors to support a continuum of virtual training capabilities, including full immersion, augmented overlays that provide multimodal cues to personalize instruction, and physical models that support embodiment and practice of psychomotor skills. When combined, XR technologies provide a multi-faceted training paradigm in which the whole is greater than the sum of the constituent capabilities in isolation. When XR applications are adaptive—varying operational stressors, complexity, learner assistance, and fidelity as a function of trainee proficiency—substantial gains in training efficacy are expected. This paper describes a continuum of XR technologies and how they can be coupled with numerous adaptation strategies and supportive artificial intelligence (AI) techniques to realize personalized, competency-based training solutions that accelerate time to proficiency. Application of this training continuum is demonstrated through a Tactical Combat Casualty Care training use case. Such AI-enabled XR training solutions have the potential to help the military meet its growing training demands across military domains and applications, and to provide the right training at the right time.