Accurate segmentation of organs-at-risk is important in prostate cancer radiation therapy planning. However, poor soft-tissue contrast in CT makes the segmentation task very challenging. We propose a deep convolutional neural network approach to automatically segment the prostate, bladder, and rectum from pelvic CT. A hierarchical coarse-to-fine segmentation strategy is used, where the first step generates a coarse segmentation from which an organ-specific region-of-interest (ROI) localization map is produced, and the second step produces a detailed and accurate segmentation of each organ. The ROI localization map is generated using a 3D U-net. The localization map helps adjust the ROI of each organ to be segmented and hence improves computational efficiency by eliminating irrelevant background information. For the fine segmentation step, we designed a fully convolutional network (FCN) by combining a generative adversarial network (GAN) with a U-net. Specifically, the generator is a 3D U-net trained to predict individual pelvic structures, and the discriminator is an FCN that fine-tunes the generator's predicted segmentation map by comparing it with the ground truth. The network was trained on 100 CT datasets and tested on 15 datasets to segment the prostate, bladder, and rectum. The average Dice similarity coefficients (mean±SD) for the prostate, bladder, and rectum are 0.90±0.05, 0.96±0.06, and 0.91±0.09, respectively, and the Hausdorff distances of these three structures are 5.21±1.17 mm, 4.37±0.56 mm, and 6.11±1.47 mm, respectively. The proposed method produces accurate and reproducible segmentation of pelvic structures, which is potentially valuable for prostate cancer radiotherapy treatment planning.
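The Dice similarity coefficient reported above measures volumetric overlap between a predicted and a ground-truth binary mask, defined as 2|A∩B|/(|A|+|B|). A minimal NumPy sketch of this metric (the function name and toy masks are illustrative, not the authors' evaluation code):

```python
import numpy as np

def dice_coefficient(pred, truth):
    """Dice similarity between two binary masks: 2|A n B| / (|A| + |B|)."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    return 2.0 * intersection / (pred.sum() + truth.sum())

# Toy example: two overlapping 3-D masks of 8 voxels each, sharing 4 voxels
a = np.zeros((4, 4, 4), dtype=bool)
b = np.zeros((4, 4, 4), dtype=bool)
a[1:3, 1:3, 1:3] = True
b[1:3, 1:3, 2:4] = True
print(dice_coefficient(a, b))  # 2*4 / (8+8) = 0.5
```

A Dice of 1.0 indicates perfect overlap; the companion Hausdorff distance additionally penalizes boundary outliers that Dice, as a pure overlap measure, cannot detect.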
This paper presents a segmentation technique to identify the medial axis and the boundary of cranial nerves. We utilize a 3-D deformable one-simplex discrete contour model to extract the medial axis of each cranial nerve. This contour model represents a collection of two-connected vertices linked by edges, where each vertex position is determined by a Newtonian expression for vertex kinematics featuring internal and external forces, the latter of which include attractive forces toward the nerve medial axis. We exploit multiscale vesselness filtering and minimal-path techniques in the medial axis extraction method, which also computes a radius estimate along the path. Once we have the medial axis and the radius function of a nerve, we identify the nerve surface using a two-simplex deformable model, which expands radially and can accommodate any nerve shape. As a result, the method proposed here combines the benefits of explicit contour and surface models, while also laying a cornerstone for future work that will emphasize shape statistics, static collision with other critical structures, and tree-shape analysis.
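The Newtonian vertex kinematics described above can be sketched as a damped two-step iteration in which each vertex moves under the sum of an internal and an external force. The following NumPy toy uses hypothetical, deliberately simple force definitions (neighbour-midpoint smoothing as the internal force, direct attraction to medial-axis samples as the external force) rather than the paper's one-simplex formulation:

```python
import numpy as np

def evolve_contour(verts, targets, gamma=0.6, alpha=0.2, beta=0.3, iters=200):
    """Damped Newtonian evolution of an open polyline contour (a toy
    stand-in for a one-simplex model). Each step applies momentum from the
    previous displacement, damped by gamma, plus an internal smoothing
    force and an external attraction toward medial-axis sample points."""
    prev = verts.copy()
    cur = verts.copy()
    for _ in range(iters):
        # Internal force: pull each interior vertex toward the midpoint
        # of its two neighbours (discrete smoothing); endpoints are free.
        mid = 0.5 * (cur[:-2] + cur[2:])
        f_int = np.zeros_like(cur)
        f_int[1:-1] = mid - cur[1:-1]
        # External force: attraction toward corresponding axis samples.
        f_ext = targets - cur
        nxt = cur + (1 - gamma) * (cur - prev) + alpha * f_int + beta * f_ext
        prev, cur = nxt, nxt if False else (cur, nxt)[1]  # placeholder
        prev, cur = cur, nxt
        break_flag = False
    return cur

# Usage: a perturbed 5-vertex contour converging onto a straight axis.
targets = np.stack([np.linspace(0.0, 4.0, 5), np.zeros(5), np.zeros(5)], axis=1)
verts = targets + 0.5
result = evolve_contour(verts, targets)
```

With these parameter choices the iteration is contractive, so the contour settles onto the target axis; in the paper the external force would instead come from the vesselness-derived medial-axis estimate.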
In this paper, a deformable registration method is proposed that enables automatic alignment of preoperative PET/CT to intraoperative ultrasound in order to achieve PET-determined focal prostate brachytherapy. Novel PET imaging agents such as prostate-specific membrane antigen (PSMA) enable highly accurate identification of intra- and extra-prostatic tumors. Incorporating PSMA PET into standard transrectal ultrasound (TRUS)-guided prostate brachytherapy will enable focal therapy, thus minimizing radiation toxicities. Our registration method requires PET/CT and TRUS volumes as well as prostate segmentations. These input volumes are first rigidly registered by maximizing spatial overlap between the segmented prostate volumes, followed by deformable registration. To achieve anatomically accurate deformable registration, we extract anatomical landmarks both on the prostate boundary and inside the gland. Landmarks are extracted along the base-apex axes using two approaches: equiangular and equidistant. Three-dimensional thin-plate spline (TPS)-based deformable registration is then performed using the extracted landmarks as control points. Finally, the PET/CT images are deformed to the TRUS space using the computed TPS transformation. The proposed method was validated on 10 prostate cancer patient datasets in which we registered post-implant CT to end-of-implantation TRUS. We computed target registration errors (TREs) by comparing implanted seed positions (transformed CT seeds vs. intraoperatively identified TRUS seeds). The average TREs of the proposed method are 1.98±1.22 mm (mean±standard deviation) and 1.97±1.24 mm for the equiangular and equidistant landmark extraction methods, respectively, which are better than or comparable to existing state-of-the-art methods while being computationally more efficient, with an average computation time of less than 40 seconds.
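The TPS step fits a smooth mapping that exactly interpolates the landmark correspondences while extrapolating an affine trend elsewhere. Below is a self-contained NumPy sketch of a 3-D thin-plate (biharmonic, kernel φ(r)=r) spline fit plus the TRE metric, assuming point-to-point landmark correspondence; function names and the affine toy example are illustrative, not the authors' implementation:

```python
import numpy as np

def fit_tps_3d(src, dst):
    """Fit a 3-D thin-plate spline mapping src landmarks (n, 3) onto dst
    landmarks (n, 3). Solves the standard TPS linear system with the 3-D
    biharmonic kernel phi(r) = r and an affine term; returns a callable
    that maps arbitrary (m, 3) points into the destination space."""
    n = len(src)
    K = np.linalg.norm(src[:, None, :] - src[None, :, :], axis=2)  # (n, n)
    P = np.hstack([np.ones((n, 1)), src])                          # (n, 4)
    A = np.zeros((n + 4, n + 4))
    A[:n, :n] = K
    A[:n, n:] = P
    A[n:, :n] = P.T
    b = np.zeros((n + 4, 3))
    b[:n] = dst
    coef = np.linalg.solve(A, b)
    w, a = coef[:n], coef[n:]

    def transform(pts):
        U = np.linalg.norm(pts[:, None, :] - src[None, :, :], axis=2)
        return U @ w + np.hstack([np.ones((len(pts), 1)), pts]) @ a
    return transform

def target_registration_error(moved, reference):
    """Mean Euclidean distance between corresponding point sets (e.g.
    transformed CT seed positions vs. TRUS-identified seed positions)."""
    return float(np.mean(np.linalg.norm(moved - reference, axis=1)))

# Usage: recover a synthetic deformation from 8 landmark pairs.
rng = np.random.default_rng(0)
src = rng.random((8, 3))
dst = src @ np.array([[1.1, 0.0, 0.0],
                      [0.0, 0.9, 0.05],
                      [0.0, 0.1, 1.0]]) + 0.2
tps = fit_tps_3d(src, dst)
```

By construction the fitted spline reproduces the control points exactly, so the TRE at the landmarks themselves is zero; in the paper the TRE is instead evaluated on held-out seed positions, which is the meaningful accuracy measure.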