In spinal fusion surgery, imprecise placement of pedicle screws can result in poor surgical outcomes or seriously harm the patient. Patient-specific instruments and optical tracking systems have been proposed to improve precision through surgical navigation compared with free-hand insertion. However, existing solutions are expensive and cannot provide in situ visualizations. Recent technological advances have enabled the production of more powerful and precise optical see-through head-mounted displays for the mass market. The purpose of this laboratory study was to evaluate whether such a device is sufficiently precise for the navigation of lumbar pedicle screw placement. Methods: A novel navigation method, tailored to run on the Microsoft HoloLens, was developed. It comprises capturing the intraoperatively reachable surface of vertebrae to achieve registration, and tool tracking with real-time visualizations, without the need for intraoperative imaging. For both surface sampling and navigation, 3D-printable parts equipped with fiducial markers were employed. Accuracy was evaluated within a self-built setup based on two phantoms of the lumbar spine. Computed tomography (CT) scans of the phantoms were acquired to carry out preoperative planning of screw trajectories in 3D. A surgeon placed
BACKGROUND CONTEXT Due to recent developments in augmented reality with head-mounted devices, holograms of a surgical plan can be displayed directly in the surgeon's field of view. To the best of our knowledge, three-dimensional (3D) intraoperative fluoroscopy has not been explored for use with holographic navigation by head-mounted devices in spine surgery. PURPOSE To evaluate the surgical accuracy of holographic pedicle screw navigation by head-mounted device using 3D intraoperative fluoroscopy. STUDY DESIGN In this experimental cadaver study, the accuracy of surgical navigation using a head-mounted device was compared with navigation with a state-of-the-art pose-tracking system. METHODS Three lumbar cadaver spines were embedded in nontransparent agar gel, leaving only commonly visible anatomy in sight. Intraoperative registration of the preoperative planning was achieved by 3D fluoroscopy and fiducial markers attached to lumbar vertebrae. Trackable custom-made drill sleeve guides enabled real-time navigation. In total, 20 K-wires were navigated into lumbar pedicles using AR navigation and 10 K-wires by the state-of-the-art pose-tracking system. 3D models obtained from postexperimental CT scans were used to measure surgical accuracy. MF is the founder and a shareholder of Incremed AG, a Balgrist University Hospital start-up focusing on the development of innovative techniques for surgical execution. The other authors declare no conflict of interest concerning the contents of this study. No external funding was received for this study. RESULTS No significant difference in accuracy was measured between AR-navigated drillings and the gold-standard pose-tracking system, with mean translational errors between entry points (3D vector distance; p=.85) of 3.4±1.6 mm compared with 3.2±2.0 mm, and mean angular errors between trajectories (3D angle; p=.30) of 4.3°±2.3° compared with 3.5°±1.4°.
CONCLUSIONS In conclusion, holographic navigation by use of a head-mounted device achieves accuracy comparable to the gold standard of high-end pose-tracking systems. CLINICAL SIGNIFICANCE These promising results could lead to a new form of surgical navigation with minimal infrastructural requirements, but must now be confirmed in clinical studies.
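The accuracy metrics reported in the cadaver study above (3D vector distance between entry points, 3D angle between trajectories) can be computed directly from planned and postoperative coordinates. The following is a minimal Python/NumPy sketch; the function names and sample coordinates are illustrative assumptions, not taken from the study:

```python
import numpy as np

def entry_point_error(planned_entry, actual_entry):
    """Translational error: Euclidean (3D vector) distance between entry points, in mm."""
    return float(np.linalg.norm(np.asarray(actual_entry, float) - np.asarray(planned_entry, float)))

def trajectory_angle_error(planned_dir, actual_dir):
    """Angular error: 3D angle in degrees between planned and executed trajectory directions."""
    u = np.asarray(planned_dir, dtype=float)
    v = np.asarray(actual_dir, dtype=float)
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    # Clip guards against floating-point values slightly outside [-1, 1]
    return float(np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0))))

# Hypothetical planned vs. postoperative K-wire measurements (coordinates in mm)
print(entry_point_error([10.0, 5.0, 2.0], [12.0, 5.5, 3.0]))     # ~2.29 mm
print(trajectory_angle_error([0.0, 0.0, 1.0], [0.05, 0.0, 1.0]))  # ~2.86 degrees
```

In practice, these per-screw values would be averaged over all navigated K-wires to obtain the mean ± SD figures reported in the abstract.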
Background Accurate glenoid positioning in reverse total shoulder arthroplasty (RSA) is important to achieve a satisfying functional outcome and prosthesis longevity. Optimal component placement can be challenging, especially in severe glenoid deformities. The use of patient-specific instruments (PSI) and 3D computer-assisted optical tracking navigation (NAV) are established methods to improve surgical precision. Augmented reality (AR) technology promises similar results at low cost and with ease of use. With AR, the planned component placement can be superimposed onto the surgical situs and shown directly in the operating field using a head-mounted display. In this feasibility study, we introduce a new AR navigation technique using a head-mounted display, aiming to improve and enhance surgical planning. Methods 3D surface models of ten human scapulae were printed from computed tomography (CT) data of cadaver scapulae. Positioning of the central guidewire of the glenoid baseplate was planned with dedicated computer software. A hologram of the planned guidewire with dynamic navigation was then projected onto the 3D-printed models of the cadaver shoulders. The registration of the plan to the anatomy was realized by digitizing the glenoid surface and the base of the coracoid with optical tracking using a fiducial marker. After navigated placement of the central guidewires, another CT scan was acquired, and the 3D model was superimposed with the preoperative planning to analyze the deviation between the planned and executed central guidewire trajectories and entry points. Results The mean deviation of the ten placed guidewires from the planned trajectory was 2.7° ± 1.3° (95% CI 1.9°; 3.6°). The mean deviation from the planned entry point of the ten placed guidewires measured 2.3 mm ± 1.1 mm (95% CI 1.5 mm; 3.1 mm). Conclusion AR may be a promising new technology for highly precise surgical execution of 3D preoperative planning in RSA.
Background Existing surgical navigation approaches to the rod bending procedure in spinal fusion rely on optical tracking systems that determine the locations of placed pedicle screws using a hand-held marker. Methods We propose a novel, marker-less surgical navigation proof of concept for bending rod implants. Our method combines augmented reality with on-device machine learning to generate and display a virtual template of the optimal rod shape without touching the instrumented anatomy. Performance was evaluated on lumbosacral spine phantoms against a pointer-based navigation benchmark approach and ground-truth data obtained from computed tomography. Results Our method achieved a mean error of 1.83 ± 1.10 mm compared with 1.87 ± 1.31 mm measured in the marker-based approach, while requiring only 21.33 ± 8.80 s as opposed to 36.65 ± 7.49 s for the pointer-based method. Conclusion Our results suggest that the combination of augmented reality and machine learning has the potential to replace conventional pointer-based navigation in the future.
Augmented reality (AR)-based surgical navigation may offer new possibilities for safe and accurate surgical execution of complex osteotomies. In this study, we investigated the feasibility of navigating the periacetabular osteotomy of Ganz (PAO), known as one of the most complex orthopedic interventions, on two cadaveric pelves under realistic operating room conditions. Preoperative planning was conducted on computed tomography (CT)-reconstructed 3D models using in-house developed software, which allowed the creation of cutting-plane objects for planning the osteotomies and the reorientation of the acetabular fragment. An AR application was developed comprising point-based registration, motion compensation, and guidance for the osteotomies as well as fragment reorientation. Navigation accuracy was evaluated on CT-reconstructed 3D models, resulting in an error of 10.8 mm for osteotomy starting points and 5.4° for osteotomy directions. The reorientation errors were 6.7°, 7.0° and 0.9° for the x-, y- and z-axes, respectively. The average postoperative error of the LCE angle was 4.5°. Our study demonstrated that AR-based execution of complex osteotomies is feasible. Fragment realignment navigation needs further improvement, although it is already more accurate than the state of the art in PAO surgery.
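Several of the systems described above align a preoperative plan to the intraoperative anatomy via point-based (fiducial) registration. A common way to compute the underlying rigid transform is a least-squares fit via SVD (the Kabsch algorithm). The sketch below, with hypothetical fiducial coordinates, illustrates the idea only and is not taken from any of the cited systems:

```python
import numpy as np

def rigid_registration(source, target):
    """Kabsch algorithm: least-squares rigid transform (R, t) mapping source onto target.

    source, target: (N, 3) arrays of corresponding fiducial points.
    Returns rotation R (3x3) and translation t (3,) such that R @ p + t ~ q.
    """
    src = np.asarray(source, dtype=float)
    tgt = np.asarray(target, dtype=float)
    src_c = src.mean(axis=0)          # centroids
    tgt_c = tgt.mean(axis=0)
    H = (src - src_c).T @ (tgt - tgt_c)  # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])        # guard against reflections
    R = Vt.T @ D @ U.T
    t = tgt_c - R @ src_c
    return R, t

# Hypothetical fiducials: planned positions (CT frame) vs. digitized (tracker frame)
planned = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]], dtype=float)
theta = np.radians(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
digitized = planned @ R_true.T + np.array([5.0, -2.0, 1.0])

R, t = rigid_registration(planned, digitized)
print(np.allclose(planned @ R.T + t, digitized))  # True
```

With noisy, digitized fiducials the recovered transform minimizes the root-mean-square fiducial registration error rather than fitting exactly; the residual is often reported as a registration quality check.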
Despite advances in intraoperative surgical imaging, reliable discrimination of critical tissue during surgery remains challenging. As a result, decisions with potentially life-changing consequences for patients are still based on the surgeon’s subjective visual assessment. Hyperspectral imaging (HSI) provides a promising solution for objective intraoperative tissue characterisation, with the advantages of being non-contact, non-ionising and non-invasive. However, while its potential to aid surgical decision-making has been investigated for a range of applications, to date no real-time intraoperative HSI (iHSI) system has been presented that follows critical design considerations to ensure a satisfactory integration into the surgical workflow. By establishing functional and technical requirements of an intraoperative system for surgery, we present an iHSI system design that allows for real-time wide-field HSI and responsive surgical guidance in a highly constrained operating theatre. Two systems exploiting state-of-the-art industrial HSI cameras, respectively using linescan and snapshot imaging technology, were designed and investigated by performing assessments against established design criteria and ex vivo tissue experiments. Finally, we report the use of our real-time iHSI system in a clinical feasibility case study as part of a spinal fusion surgery. Our results demonstrate seamless integration into existing surgical workflows.