Image-guided therapy is a natural concept and commonly used in medicine. In anesthesia, a common task is the injection of an anesthetic close to a nerve under freehand ultrasound guidance. Several guidance systems exist that use electromagnetic tracking of the ultrasound probe as well as the needle, providing the physician with a precise projection of the needle into the ultrasound image. This, however, requires additional expensive devices. We suggest using optical tracking with miniature cameras attached to a 2D ultrasound probe to achieve higher acceptance among physicians. The purpose of this paper is to present an intuitive method to calibrate freehand ultrasound needle guidance systems employing a rigid stereo camera system. State-of-the-art methods are based on a complex series of error-prone coordinate system transformations, which makes them susceptible to error accumulation. By reducing the number of calibration steps to a single calibration procedure, we provide a calibration method that is equivalent, yet not prone to error accumulation. It requires a linear calibration object and is validated on three datasets utilizing different calibration objects: a 6 mm metal bar and a 1.25 mm biopsy needle were used for the experiments. Compared to existing calibration methods for freehand ultrasound needle guidance systems, we achieve higher accuracy while additionally reducing the overall calibration complexity.
C-arm fluoroscopy is used for guidance during several clinical exams, e.g. in bronchoscopy to locate the bronchoscope inside the airways. Unfortunately, these images provide only 2D information. However, if the C-arm pose is known, it can be used to overlay the intrainterventional fluoroscopy images with 3D visualizations of the airways, acquired from preinterventional CT images. Thus, the physician's view is enhanced and localization of the instrument at the correct position inside the bronchial tree is facilitated. We present a novel method for C-arm pose estimation introducing a marker-based pattern, which is placed on the patient table. The steel markers form a pattern that allows the C-arm pose to be deduced by use of the projective-invariant cross-ratio. Simulations show that the C-arm pose estimation is reliable and accurate for translations inside an imaging area of 30 cm x 50 cm and rotations up to 30°. Mean error values are 0.33 mm in 3D space and 0.48 px in the 2D imaging plane. First tests on C-arm images resulted in similarly compelling accuracy values and high reliability in an imaging area of 30 cm x 42.5 cm. Even in the presence of interfering structures, tested both with anatomy phantoms and a turkey cadaver, high success rates over 90% and fully satisfying execution times below 4 sec for 1024 px x 1024 px images could be achieved.
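The key property exploited here is that the cross-ratio of four collinear points is preserved under any projective transformation, so ratios measured among the steel markers in the fluoroscopy image can be matched against the known table-top pattern. A minimal sketch of this invariance (not the authors' pipeline; the point coordinates and homography below are illustrative assumptions):

```python
import numpy as np

def cross_ratio(a, b, c, d):
    """Cross-ratio (a,b;c,d) of four collinear 2D points.

    Points are parameterized by their signed position along the
    common line; the resulting ratio is a projective invariant.
    """
    direction = (d - a) / np.linalg.norm(d - a)
    ta, tb, tc, td = (np.dot(p - a, direction) for p in (a, b, c, d))
    return ((tc - ta) / (tc - tb)) / ((td - ta) / (td - tb))

def project(H, p):
    """Apply a 3x3 homography H to a 2D point p (homogeneous division)."""
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]

# Four collinear markers, e.g. one row of a table-top pattern (mm).
pts = [np.array([x, 0.0]) for x in (0.0, 1.0, 3.0, 6.0)]

# An arbitrary projective map, standing in for the C-arm projection.
H = np.array([[1.2,   0.1,   5.0],
              [0.3,   0.9,  -2.0],
              [0.001, 0.002, 1.0]])

cr_pattern = cross_ratio(*pts)                        # known from the pattern
cr_image = cross_ratio(*[project(H, p) for p in pts])  # measured in the image
print(abs(cr_pattern - cr_image) < 1e-9)  # True: cross-ratio is preserved
```

Because the cross-ratio survives the unknown projection, each group of markers can be identified in the image purely from these ratios, which is what makes the subsequent pose estimation well-posed.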
Motion tracking for head motion compensation in MRI has been a research topic for several years. However, the literature gives little attention to the calibration of such setups. We present a method to calibrate the coordinate systems of a stereo-optical camera setup mounted to the MRI head coil. Although the setup is simple and uses visible rather than infrared light for tracking, it achieves sub-millimeter tracking precision. Blue water-filled spheres are positioned throughout the whole MRI imaging volume and detected in images of the tracking cameras as well as in MRI scans. In order to register the coordinate systems of the camera system and the MRI scanner, a heuristic-enhanced brute-force approach is used to match the detected spheres in the different images. Then, a rigid transformation is calculated and applied to the cameras' external parameters to align the coordinate systems. The precision of our setup was evaluated using leave-one-out cross-validation both for the camera calibration and for the scanner coordinate system registration. We found that the cameras' locations and orientations are correct within 0.03 mm and 0.03°, using 45 spheres. Evaluation of the MRI coordinate system registration showed an average reprojection error of 1.1 mm. The influence of a feature-point jitter of 0.5 px is 0.03 mm for a point close to the cameras and 0.3 mm for a point close to the back of the patient's head. Tracked poses are correct within 0.17 mm and 0.001°.
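Once the sphere centres are matched between the two frames, the rigid transformation can be estimated in closed form. A minimal sketch using the standard SVD-based Kabsch algorithm (the abstract does not state which solver the authors use; the synthetic 45-sphere layout and pose below are illustrative assumptions):

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst.

    src, dst: (N, 3) arrays of matched 3D points (Kabsch algorithm).
    """
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)   # cross-covariance of centred points
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:              # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

# Synthetic sphere centres in the camera frame (45 spheres, mm) and a
# hypothetical ground-truth camera-to-scanner pose.
rng = np.random.default_rng(0)
cam_pts = rng.uniform(-100.0, 100.0, size=(45, 3))
angle = np.deg2rad(10.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([5.0, -3.0, 12.0])
mri_pts = cam_pts @ R_true.T + t_true

R, t = rigid_register(cam_pts, mri_pts)
residual = cam_pts @ R.T + t - mri_pts
rmse = np.sqrt(np.mean(np.sum(residual ** 2, axis=1)))
print(rmse < 1e-9)  # True: exact recovery for noise-free correspondences
```

With real detections the correspondences are noisy, which is why the abstract reports a leave-one-out reprojection error rather than an exact fit; the same solver applies unchanged.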
This paper describes a perineal access tool for MRI-guided prostate interventions and evaluates it in a phantom study. The development of this device was driven by clinical need and a close collaboration effort. The device fits seamlessly into the workflow of MRI-guided prostate procedures such as cryoablation and biopsies. It promises a significant cut in procedure time, accurate needle placement, a lower number of insertions, and a potential for better patient outcomes. The current embodiment includes a frame that is placed next to the perineum and incorporates both visual and MRI-visible markers. These markers are automatically detected both in MRI and by a pair of stereo cameras (optical head), allowing for automatic optical registration. The optical head illuminates the procedure area and can track instruments and ultrasound probes. The frame has a window to access the perineum. Multiple swappable grids may be placed in this window depending on the application. It is also possible to entirely remove the grid for freehand procedures. All the components are designed to be used inside the MRI suite. To test this system, we built a custom phantom with MRI-visible targets and planned 21 needle insertions with three grid types using the SCENERGY software. With an average insertion depth of about 85 mm, the average error of needle tip placement was 2.74 mm. We estimated the error by manually segmenting the needle tip in post-insertion MRIs of the phantom and comparing that to the plan.