Current surgical navigation systems offer sub-millimetric real-time localization; however, they are expensive, require invasive markers attached to the patient, and often add extra operating time. In this paper we propose an affordable markerless navigation approach based on mid-range depth sensors as an alternative that meets the accuracy and robustness needs of medical applications. An algorithm called Fast Volumetric Reconstruction (FaVoR) implements a compute-efficient approach to real-time tracking based on 3D model registration, allowing the computed 3D poses to be used for video scene augmentation. After early testing with a first proof-of-concept implementation, a preliminary accuracy evaluation on a dynamic test bench achieved an average registration error of 2 mm during tracking.
Purpose The ability to locate and track ultrasound images in 3D operating space is of great benefit for multiple clinical applications. This is often accomplished by tracking the probe with a precise but expensive optical or electromagnetic tracking system. Our goal is to develop a simple, low-cost augmented reality (AR) echography framework using a standard RGB-D camera. Methods A prototype system consisting of an Occipital Structure Core RGB-D camera, a specifically designed 3D marker, and a fast point-cloud registration algorithm, FaVoR, was developed and evaluated on an Ultrasonix ultrasound system. The probe was calibrated on a 3D-printed N-wire phantom using the PLUS toolkit. The proposed calibration method is simplified, requiring no additional markers or sensors attached to the phantom. In addition, visualization software based on OpenGL was developed for the augmented reality application. Results The calibrated probe was used to augment a real-world video in a simulated needle insertion scenario. The ultrasound images were rendered on the video, and visually coherent results were observed. We evaluated the end-to-end accuracy of our AR ultrasound framework on localizing a cube of 5 cm size. Across our two experiments, the target pose localization error ranged from 5.6 to 5.9 mm in translation and from −3.9° to 4.2° in rotation. Conclusion We believe that with the potential democratization of RGB-D cameras integrated in mobile devices and AR glasses, our prototype solution may facilitate the use of 3D freehand ultrasound in clinical routine. Future work should include a more rigorous and thorough evaluation, comparing the calibration accuracy with that obtained by commercial tracking solutions in both simulated and real medical scenarios.
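As an illustration of how a target pose localization error such as the one reported above is commonly quantified, the sketch below compares an estimated rigid pose against a ground-truth pose, reporting the translation error as a Euclidean distance and the rotation error as the angle of the relative rotation. This is not the authors' code; the `pose_error` helper and the example values are assumptions for illustration only.

```python
import numpy as np

def pose_error(T_est, T_gt):
    """Translation (mm) and rotation (deg) error between two 4x4 rigid poses."""
    # Translation error: Euclidean distance between the two pose origins.
    t_err = np.linalg.norm(T_est[:3, 3] - T_gt[:3, 3])
    # Rotation error: angle of the relative rotation R_gt^T @ R_est,
    # recovered from its trace via cos(theta) = (trace(R) - 1) / 2.
    R_rel = T_gt[:3, :3].T @ T_est[:3, :3]
    cos_theta = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    r_err = np.degrees(np.arccos(cos_theta))
    return t_err, r_err

# Hypothetical example: ground truth at the origin, estimate offset by
# 5.7 mm along x and rotated 4 degrees about the z axis.
theta = np.radians(4.0)
T_gt = np.eye(4)
T_est = np.eye(4)
T_est[:3, :3] = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                          [np.sin(theta),  np.cos(theta), 0.0],
                          [0.0,            0.0,           1.0]])
T_est[:3, 3] = [5.7, 0.0, 0.0]
t_err, r_err = pose_error(T_est, T_gt)
```

Separating translation and rotation errors in this way matches the dual reporting in the results (millimeters alongside degrees).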
Keywords Ultrasound • Augmented reality • Probe calibration • RGB-D camera • 3D printing • Optical tracking system * This work benefited from the European Union's Horizon 2020 research and innovation program under grant agreement no. 856950 (5G-TOURS project). It also benefited from State aid managed by the French National Research Agency under the future investment program bearing the reference ANR-17-RHUS-0005 (FollowKnee project).