Purpose: As the trend toward minimally invasive and percutaneous interventions continues, the importance of appropriate surgical data visualization becomes more evident. Ineffective interventional data display techniques yield poor ergonomics that hinder hand-eye coordination and promote frustration, which can compromise on-task performance and even contribute to adverse outcomes. A common example of ineffective visualization is monitors attached to the base of mobile C-arm X-ray systems.

Methods: We present a spatially and imaging-geometry-aware paradigm for the visualization of fluoroscopic images using Interactive Flying Frustums (IFFs) in a mixed reality environment. We exploit the fact that the C-arm imaging geometry can be modeled as a pinhole camera, giving rise to an 11-degree-of-freedom view frustum on which the X-ray image can be translated while remaining valid. Visualizing IFFs to the surgeon in an augmented reality environment intuitively unites the virtual 2D X-ray image plane and the real 3D patient anatomy. To achieve this visualization, the surgeon and the C-arm are tracked relative to the same coordinate frame using image-based localization and mapping, with the augmented reality environment delivered to the surgeon via a state-of-the-art optical see-through head-mounted display.

Results: The root-mean-squared error of C-arm source tracking after hand-eye calibration was 0.43° ± 0.34° in rotation and 4.6 ± 2.7 mm in translation.
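The property that the X-ray image "remains valid" anywhere on the view frustum follows directly from the pinhole model: sliding the image plane along the frustum rescales each pixel's 3D footprint by similar triangles, but every point on it reprojects to the same pixel. A minimal sketch of this idea is below; the intrinsic parameters are illustrative placeholders, not calibrated values from the system described here.

```python
import numpy as np

# Illustrative pinhole intrinsics for a C-arm: focal length in pixels
# and principal point. These are placeholder values, not real calibration.
f, cx, cy = 1200.0, 512.0, 512.0
K = np.array([[f, 0.0, cx],
              [0.0, f, cy],
              [0.0, 0.0, 1.0]])

def backproject(u, v, depth):
    """Point on the ray through pixel (u, v), placed on an image plane
    at the given depth along the optical axis (X-ray source at origin)."""
    ray = np.linalg.solve(K, np.array([u, v, 1.0]))
    return ray * depth

def project(X):
    """Project a 3D point back onto the detector with the same model."""
    x = K @ X
    return x[:2] / x[2]

# Translating the image plane along the frustum changes its physical
# size (similar triangles) but not the pixel content: each pixel still
# reprojects to itself, so the image stays geometrically valid.
u, v = 700.0, 300.0
near = backproject(u, v, depth=200.0)  # plane close to the source
far = backproject(u, v, depth=900.0)   # plane close to the detector
assert np.allclose(project(near), [u, v])
assert np.allclose(project(far), [u, v])
```

This invariance is what lets the surgeon grab the virtual image and slide it along the frustum toward the anatomy without invalidating its geometric relationship to the patient.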