Medical Imaging 2019: Image-Guided Procedures, Robotic Interventions, and Modeling 2019
DOI: 10.1117/12.2512282
A semiautomatic approach for prostate segmentation in MR images using local texture classification and statistical shape modeling

Abstract: Segmentation of the prostate in magnetic resonance (MR) images has many applications in image-guided treatment planning and procedures such as biopsy and focal therapy. However, manual delineation of the prostate boundary is a time-consuming task with high inter-observer variation. In this study, we proposed a semiautomated, three-dimensional (3D) prostate segmentation technique for T2-weighted MR images based on shape and texture analysis. The prostate gland shape is usually globular with a smoothly curved sur…

Cited by 4 publications (5 citation statements) · References 32 publications
“…In our experiments, 14 images were segmented by a second expert, and the inter-expert agreement was found to be 0.8794 in terms of the Sørensen–Dice Similarity Coefficient (DSC) and 1.5619 mm in terms of the Average Boundary Distance (ABD). Very similar results were obtained in [3], with experts achieving a DSC and an ABD of 0.83 and 1.5 mm, respectively. Because of this, automatic segmentation algorithms for the prostate are increasingly sought-after.…”
Section: Introduction (supporting)
confidence: 77%
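The inter-expert agreement figures above use the Sørensen–Dice Similarity Coefficient, which measures overlap between two binary segmentation masks. A minimal sketch of the standard DSC definition (the array names and toy masks here are illustrative, not from the cited study):

```python
import numpy as np

def dice_coefficient(a: np.ndarray, b: np.ndarray) -> float:
    """Sørensen–Dice Similarity Coefficient: 2|A∩B| / (|A| + |B|) for binary masks."""
    a = a.astype(bool)
    b = b.astype(bool)
    intersection = np.logical_and(a, b).sum()
    total = a.sum() + b.sum()
    if total == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * intersection / total

# Two toy 4x4 "expert segmentations": 4 and 2 foreground voxels, 2 in common
expert1 = np.zeros((4, 4), dtype=np.uint8); expert1[1:3, 1:3] = 1
expert2 = np.zeros((4, 4), dtype=np.uint8); expert2[1:3, 1:2] = 1
print(dice_coefficient(expert1, expert2))  # 2*2/(4+2) = 0.6666666666666666
```

A DSC of 1.0 means the two delineations coincide exactly; the 0.83–0.88 inter-expert range quoted above is a common ceiling against which automatic methods are judged.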
“…Despite multiple attempts at incorporating a similar loss to our model, we finally decided against it, since it did not provide any performance advantages during validation. Therefore, the finally used loss function L is directly derived from DSC, as illustrated in Equation (3).…”
Section: Evaluation Metrics and Loss (mentioning)
confidence: 99%
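A loss "directly derived from DSC" typically means a soft-Dice loss, L = 1 − DSC, computed on predicted probability maps. The exact form of the citing paper's Equation (3) is not reproduced here; the sketch below shows one common soft-Dice formulation with a small smoothing constant (an assumption, not the authors' exact equation):

```python
import numpy as np

def dice_loss(pred: np.ndarray, target: np.ndarray, eps: float = 1e-6) -> float:
    """Soft Dice loss L = 1 - DSC, on probabilities in [0, 1] rather than hard masks.

    `eps` smooths the ratio so the loss is defined when both inputs are empty.
    """
    intersection = float((pred * target).sum())
    dsc = (2.0 * intersection + eps) / (float(pred.sum()) + float(target.sum()) + eps)
    return 1.0 - dsc
```

Because the loss is differentiable in `pred`, it can be minimized directly by gradient descent, which is why Dice-derived losses are popular for segmentation networks with strong class imbalance such as prostate MR.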
“…These results suggest that the use of 15–20 sparse manually selected surface points achieves a segmentation performance close to manual segmentation. According to our previous studies 13,14 , manual selection of 12 prostate surface points on both MRI and CT images could be done within 20 seconds, which is considerably shorter than the average time of manual prostate MRI segmentation, as reported in the literature 15,16 . Therefore, we think minimal user interaction could be helpful to improve the segmentation accuracy significantly.…”
Section: Testing Results (mentioning)
confidence: 85%
“…Therefore, an observer study is required to confirm the effectiveness of this approach for prostate segmentation in a real clinical situation. In addition, although our previous studies 13,14,17 showed that using minimal user interaction for prostate segmentation could substantially speed up the process when compared to manual segmentation time, the observer study must confirm that for this study.…”
Section: Limitations (mentioning)
confidence: 72%
“…These datasets were resampled to the same target spacing (2, 2, 2) and embedded into a 256 × 256 × 256 3D volumetric space [35]. After normalizing and window leveling [−200, 250] [36][37][38][39], to enhance the contrast and texture of soft tissue, the foreground of input voxels was selected from the background by an intersection with mask voxels images using MATLAB R2022a. To increase the amount of data for training the network, we augmented the CT images…”
Section: Patient Cohorts and Data Pre-processing (mentioning)
confidence: 99%
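Window leveling to [−200, 250] means clipping CT intensities (in Hounsfield units) to that range before rescaling, which concentrates the display range on soft tissue. A minimal sketch of this preprocessing step, assuming a rescale to [0, 1] (the citing paper used MATLAB; the function name and output range here are illustrative):

```python
import numpy as np

def window_level(ct_hu: np.ndarray, lo: float = -200.0, hi: float = 250.0) -> np.ndarray:
    """Clip Hounsfield-unit values to [lo, hi] and rescale linearly to [0, 1].

    Values below `lo` map to 0, values above `hi` map to 1, so bone and air
    no longer dominate the intensity range and soft-tissue contrast improves.
    """
    clipped = np.clip(ct_hu.astype(np.float32), lo, hi)
    return (clipped - lo) / (hi - lo)
```

Applied voxel-wise to a resampled volume, this produces the normalized input the quoted pipeline feeds to the network after foreground masking and augmentation.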