Purpose Computer-aided navigation is widely used in ENT surgery. The position of a surgical instrument is displayed in the patient's CT/MR images and can thus support the surgeon. Accuracy depends strongly on the registration performed prior to surgery. Depending on the surgical intervention, either a microscope or a probe can be used for registration and navigation. A navigation system typically reports only the fiducial registration error after paired-point registration; however, the target registration error (TRE), a measure of accuracy in the surgical area, is far more relevant. The aim of this work was to compare the performance of a microscope with that of a conventional probe-based approach under different registration methods. Methods In this study, optical tracking was used to register a plastic skull to its preoperative CT images with paired-point registration. Anatomical landmarks and skin-affixed markers served as fiducials and targets. With both microscope and probe, four registration methods were evaluated based on their TREs at 10 targets. For half of the experiments, a surface registration and/or external fiducials were used in addition to paired-point registration to study their influence on accuracy. Results Overall, probe registration led to a smaller TRE () than registration with a microscope (). Additional surface registration did not improve navigation accuracy for either microscope or probe. The lowest mean TRE for both pointers was achieved with paired-point registration alone and radiolucent markers. Conclusion Our experiments showed that a probe used for registration and navigation achieves lower TREs than a microscope. Neither additional surface registration nor additional fiducials on an external reference element are necessary for improved accuracy of navigated ENT surgery on a plastic skull.
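As a minimal sketch of the concepts this abstract relies on, the snippet below implements standard least-squares paired-point rigid registration (the Kabsch/Procrustes solution) and contrasts the fiducial registration error (FRE) reported at the fiducials with the TRE measured at a separate target. This is not the study's implementation; all coordinates, the noise level, and the target point are hypothetical.

```python
import numpy as np

def paired_point_register(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst (Kabsch)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the least-squares solution.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = c_dst - R @ c_src
    return R, t

# Hypothetical fiducials in CT space and their (noisy) tracked positions, in mm.
rng = np.random.default_rng(0)
fid_ct = rng.uniform(-80.0, 80.0, size=(5, 3))
theta = np.deg2rad(12.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([4.0, -7.0, 2.5])
fid_tracked = fid_ct @ R_true.T + t_true + rng.normal(0.0, 0.3, size=(5, 3))

R, t = paired_point_register(fid_ct, fid_tracked)

# FRE: residual at the fiducials themselves (what the navigation system reports).
fre = np.linalg.norm(fid_ct @ R.T + t - fid_tracked, axis=1).mean()
# TRE: error at a separate target point (what actually matters clinically).
target_ct = np.array([10.0, 25.0, -15.0])
tre = np.linalg.norm((R @ target_ct + t) - (R_true @ target_ct + t_true))
print(f"FRE = {fre:.2f} mm, TRE = {tre:.2f} mm")
```

A small FRE does not guarantee a small TRE, which is precisely why the study evaluates TRE at targets away from the fiducials.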
Purpose Interactive image-guided surgery technologies enable accurate target localization while preserving critical nearby structures in many surgical interventions. Current state-of-the-art interfaces largely employ traditional anatomical cross-sectional views or augmented reality environments to present the actual spatial location of the surgical instrument in preoperatively acquired images. This work proposes an alternative, simple, minimalistic visual interface intended to assist during real-time surgical target localization. Methods The estimated 3D pose of the interventional instruments and their positional uncertainty are presented intuitively, relative to the target point, in a visual interface. A usability study with multidisciplinary participants evaluated the proposed interface, projected into the surgical microscope oculars, against cross-sectional views. The latter were presented on a screen both stand-alone and combined with the proposed interface. The instruments were navigated electromagnetically in phantoms. Results The usability study demonstrated that participants were able to detect invisible targets marked in phantom imagery, with significant improvements in localization accuracy and task duration. Clinically experienced users reached the targets with shorter trajectories. The stand-alone and multi-modal versions of the proposed interface outperformed navigation with cross-sectional views alone in both quantitative and qualitative evaluations. Conclusion The results and participants' feedback indicate the potential to accurately navigate users toward the target with less distraction and workload. An ongoing study evaluates the proposed system in a preclinical setting for auditory brainstem implantation.
Background Manual paired-point registration for navigated ENT surgery is prone to human error; automatic surface registration is often caught in local minima. Methods Anatomical features of the human occiput are integrated into an algorithm for surface registration. A vector force field is defined between the patient and operating room datasets; registration is driven by gradient-based vector field analysis that optimizes an energy function. The method is validated, as an example, on patient surface data acquired with a mechanically positioned A-mode ultrasound sensor. Results Successful registrations were achieved across the entire parameter space, as well as from positions of local minima found by the Gaussian fields algorithm for surface registration. Sub-millimetric registration error was measured in clinically relevant anatomical areas on the anterior skull, and error over the entire head remained within the generally accepted margin of 1.5 mm. Conclusion The satisfactory behavior of this approach suggests potential for wider clinical integration.
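To illustrate the idea of an energy function defined by a smooth force field between two point sets, the sketch below minimizes a Gaussian-field energy by plain gradient descent. This is a deliberately simplified stand-in for the abstract's method: it optimizes a translation only (no rotation, no anatomical features), and the point clouds and step size are synthetic assumptions.

```python
import numpy as np

def gaussian_field_energy_grad(src, dst, t, sigma):
    """Energy E(t) = -sum_ij exp(-||s_i + t - q_j||^2 / (2 sigma^2))
    between translated src and dst, and its gradient with respect to t."""
    diff = src[:, None, :] + t - dst[None, :, :]          # shape (N, M, 3)
    w = np.exp(-(diff ** 2).sum(axis=2) / (2 * sigma ** 2))
    energy = -w.sum()
    grad = (w[:, :, None] * diff).sum(axis=(0, 1)) / sigma ** 2
    return energy, grad

# Synthetic "operating room" surface sample and a misaligned "patient" copy.
rng = np.random.default_rng(1)
surf = rng.uniform(-1.0, 1.0, size=(20, 3))
offset = np.array([0.4, -0.3, 0.2])                       # unknown misalignment
target_surf = surf + offset

t = np.zeros(3)
for _ in range(2000):
    energy, grad = gaussian_field_energy_grad(surf, target_surf, t, sigma=0.7)
    t -= 0.002 * grad                                     # plain gradient step

print("recovered translation:", np.round(t, 3))
```

Because the Gaussian field is smooth everywhere, the gradient is informative even far from alignment; the local-minima problem the abstract addresses arises when the misalignment is large relative to sigma, which is where the anatomical-feature guidance of the paper comes in.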