Purpose
We propose two software tools for non-rigid registration of MRI and transrectal ultrasound (TRUS) images of the prostate. Our ultimate goal is to develop an open-source solution to support MRI–TRUS fusion image guidance of prostate interventions, such as targeted biopsy for prostate cancer detection and focal therapy. It is widely hypothesized that image registration is an essential component in such systems.
Methods
The two non-rigid registration methods are: (1) a deformable registration of the prostate segmentation distance maps with B-spline regularization and (2) a finite element-based deformable registration of the segmentation surfaces in the presence of partial data. We evaluate the methods retrospectively using patient image data collected during standard clinical procedures. Computation time and target registration error (TRE), calculated at expert-identified anatomical landmarks, were used as quantitative measures for the evaluation.
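The first method's core idea, registering segmentations via their distance maps, can be sketched as follows. This is an illustrative toy example, not the authors' implementation: binary segmentations are converted into distance maps, and a dissimilarity such as the sum of squared differences (SSD) between the maps is the quantity a deformable (e.g., B-spline-regularized) optimizer would minimize. All names and the brute-force distance transform are illustrative assumptions.

```python
import math

def distance_map(mask):
    """Brute-force Euclidean distance from each pixel to the nearest foreground pixel."""
    fg = [(r, c) for r, row in enumerate(mask) for c, v in enumerate(row) if v]
    return [[min(math.hypot(r - fr, c - fc) for fr, fc in fg)
             for c in range(len(mask[0]))] for r in range(len(mask))]

def ssd(dm_a, dm_b):
    """Sum of squared differences between two distance maps (registration metric)."""
    return sum((a - b) ** 2 for ra, rb in zip(dm_a, dm_b) for a, b in zip(ra, rb))

# Two toy binary segmentations: identical shape, one shifted by a row.
fixed = [[0, 0, 0, 0],
         [0, 1, 1, 0],
         [0, 1, 1, 0],
         [0, 0, 0, 0]]
moving = [[0, 1, 1, 0],
          [0, 1, 1, 0],
          [0, 0, 0, 0],
          [0, 0, 0, 0]]

print(ssd(distance_map(fixed), distance_map(fixed)))       # perfect alignment -> 0.0
print(ssd(distance_map(fixed), distance_map(moving)) > 0)  # misalignment -> positive cost
```

A distance-map metric has the practical advantage that it remains smooth away from the segmentation boundary, which makes gradient-based optimization of the deformation better behaved than matching binary masks directly.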
Results
The presented image registration tools completed the deformable registration computation within 5 min. Average TRE was approximately 3 mm for both methods, which is comparable to the slice thickness of our MRI data. Both tools are available under a nonrestrictive open-source license.
Conclusions
We release open-source tools that may be used for registration during MRI–TRUS-guided prostate interventions. Our tools implement novel registration approaches and produce acceptable registration results. We believe these tools will lower the barriers to developing and deploying interventional research solutions and will facilitate comparison with similar tools.
In surface-based registration for image-guided interventions, missing data can be a significant issue. This often arises with real-time imaging modalities such as ultrasound, where poor contrast can make tissue boundaries difficult to distinguish from surrounding tissue. Missing data poses two challenges: ambiguity in establishing correspondences, and extrapolation of the deformation field into the missing regions. To address these, we present a novel non-rigid registration method. For establishing correspondences, we use a probabilistic framework based on a Gaussian mixture model (GMM) that treats one surface as a potentially partial observation. To extrapolate and constrain the deformation field, we incorporate biomechanical prior knowledge in the form of a finite element model (FEM). We validate the algorithm, referred to as GMM-FEM, in the context of prostate interventions. Our method leads to a significant reduction in target registration error (TRE) compared to similar state-of-the-art registration algorithms when up to 30% of the surface data is missing, with a mean TRE of 2.6 mm. The method also performs well when full segmentations are available, yielding TREs that are comparable to or better than those of other surface-based techniques. We also analyze the robustness of our approach, showing that GMM-FEM is a practical and reliable solution for surface-based registration.
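The probabilistic correspondence step described above can be illustrated with a minimal sketch. In a GMM formulation, each observed surface point is softly assigned to every model point via posterior "responsibilities", with a uniform outlier component absorbing observed points that have no true counterpart (the mechanism that tolerates partial data). The function name, `sigma2`, and `w_outlier` values below are illustrative assumptions, not the authors' code.

```python
import math

def responsibilities(model_pts, observed_pts, sigma2=0.05, w_outlier=0.1):
    """GMM posterior P[n][m]: probability that observed point n corresponds
    to model point m, with a uniform outlier term in the denominator."""
    M, N = len(model_pts), len(observed_pts)
    P = []
    for x in observed_pts:
        # Gaussian likelihood of x under each model-point-centered component
        lik = [math.exp(-sum((a - b) ** 2 for a, b in zip(x, y)) / (2 * sigma2))
               for y in model_pts]
        # uniform outlier component (unit-volume domain assumed for simplicity)
        denom = sum(lik) + (w_outlier / (1 - w_outlier)) * M / N
        P.append([l / denom for l in lik])
    return P

model = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
observed = [(0.05, 0.02),   # close to the first model point
            (5.0, 5.0)]     # far from everything: absorbed by the outlier term
P = responsibilities(model, observed)
```

In a full registration loop these responsibilities would drive the deformation update, with the FEM acting as a biomechanical regularizer that extrapolates displacements into regions where the observed surface is missing.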
Motivation: Spinal needle injections are technically demanding procedures. The use of ultrasound image guidance without prior CT and MR imagery promises to improve the efficacy and safety of these procedures in an affordable manner. Methodology: We propose to create a statistical shape model of the lumbar spine and warp this atlas to patient-specific ultrasound images during the needle placement procedure. From CT image volumes of 35 patients, a statistical shape model of the L3 vertebra is built, comprising the mean shape and the main modes of variation. This shape model is registered to the ultrasound data by simultaneously optimizing the parameters of the model and its relative pose. Ground-truth data were established by printing 3D anatomical models of 3 patients using rapid prototyping. CT and ultrasound data of these models were registered using fiducial markers. Results: Pairwise registration of the statistical shape model and 3D ultrasound images led to a mean target registration error of 3.4 mm, while 81% of all cases yielded clinically acceptable accuracy below the 3.5 mm threshold.
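The shape-model mechanism described above can be sketched in a few lines. In a point distribution model, any plausible shape is the mean shape plus a weighted sum of the principal modes of variation; registration then optimizes the mode weights jointly with a rigid pose. The numbers below are made up for illustration and only the shape-instantiation step is shown.

```python
# Flattened (x, y) coordinates of a toy 4-point "vertebra" contour.
mean_shape = [0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 1.0]

# Two illustrative modes of variation (same layout as mean_shape).
modes = [
    [0.1, 0.0, -0.1, 0.0, -0.1, 0.0, 0.1, 0.0],   # mode 1: horizontal squeeze
    [0.0, 0.1, 0.0, 0.1, 0.0, -0.1, 0.0, -0.1],   # mode 2: vertical squeeze
]

def instantiate(weights):
    """Return the shape for one weight per mode: s(b) = s_mean + sum_i b_i * phi_i."""
    shape = list(mean_shape)
    for w, mode in zip(weights, modes):
        shape = [s + w * m for s, m in zip(shape, mode)]
    return shape

print(instantiate([0.0, 0.0]) == mean_shape)  # zero weights reproduce the mean shape
```

Restricting the deformation to a few statistically learned modes is what allows the atlas to be warped to sparse, noisy ultrasound data without producing anatomically implausible shapes.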