2019
DOI: 10.3390/cancers11121860
Automated Gleason Scoring and Tumor Quantification in Prostate Core Needle Biopsy Images Using Deep Neural Networks and Its Comparison with Pathologist-Based Assessment

Abstract: The Gleason grading system, currently the most powerful prognostic predictor of prostate cancer, is based solely on the tumor's histological architecture and has high inter-observer variability. We propose an automated Gleason scoring system based on deep neural networks for diagnosis of prostate core needle biopsy samples. To verify its efficacy, the system was trained using 1133 cases of prostate core needle biopsy samples and validated on 700 cases. Further, system-based diagnosis results were compared with…

Cited by 42 publications (67 citation statements)
References 29 publications
“…Several outstanding results have recently been reported, whose grading performance is comparable to those of participating pathologists.14–17 They are commonly based on a two-stage architecture utilizing a DL model that separately recognizes Gleason patterns 3, 4, and 5 to extract features such as pattern-wise size and likelihood, which are then fed into the Gleason grade prediction model.…”
Section: Introduction
confidence: 99%
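The two-stage architecture described in this excerpt can be sketched as follows. The sketch assumes the first stage has already produced a per-pixel Gleason-pattern label map; the feature extraction (pattern-wise area fractions) and the rule-based second stage are illustrative stand-ins — in the cited systems the second stage is a learned prediction model, and all function names here are hypothetical:

```python
import numpy as np

def pattern_features(seg: np.ndarray) -> np.ndarray:
    """Pattern-wise area fractions for Gleason patterns 3, 4, and 5.

    `seg` is a per-pixel label map assumed to come from a first-stage
    segmentation model: 0 = benign/stroma, 3/4/5 = Gleason pattern.
    """
    total = seg.size
    return np.array([(seg == p).sum() / total for p in (3, 4, 5)])

def grade_group(frac) -> int:
    """Toy stand-in for the second-stage grade prediction model.

    Derives primary/secondary patterns from the area fractions and applies
    the standard Gleason-score -> ISUP grade-group mapping.
    """
    f = dict(zip((3, 4, 5), frac))
    present = [p for p in (3, 4, 5) if f[p] > 0]
    if not present:
        return 0  # no tumor detected
    primary = max(present, key=lambda p: f[p])
    others = [p for p in present if p != primary]
    secondary = max(others, key=lambda p: f[p]) if others else primary
    score = primary + secondary
    if score <= 6:
        return 1
    if score == 7:
        return 2 if primary == 3 else 3
    return 4 if score == 8 else 5

# Example: a biopsy patch that is 60% pattern 3 and 30% pattern 4
seg = np.zeros((10, 10), dtype=int)
seg.ravel()[:60] = 3
seg.ravel()[60:90] = 4
print(grade_group(pattern_features(seg)))  # Gleason 3+4=7 -> grade group 2
```

The appeal of this decomposition, as the excerpt notes, is that the second stage consumes compact pattern-wise features rather than raw pixels, which keeps the grade predictor simple and interpretable.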
“…To this end, researchers have manually performed region-level Gleason pattern annotation tasks on WSIs, 14 , 15 extracted diagnostic marker annotations on WSIs using computer vision techniques, 16 or employed an epithelial tissue detection model, which was developed using immunohistochemistry-stained tissue slide images. 17 All of these techniques involve large manual annotation costs and/or the development of complex algorithms.…”
Section: Introduction
confidence: 99%
“…An expert pathologist (CAK) annotated prostate cancer on all digital histopathology images on a per‐pixel basis. Additionally, we used the deep learning method developed by Ryu et al 28 to predict pixel‐level Gleason pattern on our histopathology dataset, which was then registered to MRI to create labels for Gleason patterns 3, 4, and 5 for the radical prostatectomy patients in cohort C1. The annotated histopathology images were then registered to MRI, and the pixel‐level labels of aggressive and indolent cancers from histopathology images were mapped onto MRI.…”
Section: Methods
confidence: 99%
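The label-mapping step described above — carrying per-pixel Gleason-pattern labels from registered histopathology onto the MRI grid — can be sketched as below. This assumes the registration transform has already been estimated (the hard part, done in the cited work with a dedicated registration method); the function name and the 2x3 affine parameterization are illustrative assumptions. Nearest-neighbor sampling is used so the discrete pattern labels are preserved:

```python
import numpy as np

def map_labels_to_mri(hist_labels: np.ndarray, mri_shape, affine: np.ndarray) -> np.ndarray:
    """Resample a histopathology label map onto an MRI pixel grid.

    `affine` is a (hypothetical, already-estimated) 2x3 matrix mapping MRI
    pixel coordinates (x, y, 1) to histopathology pixel coordinates.
    Nearest-neighbor sampling preserves the discrete Gleason-pattern labels;
    MRI pixels that fall outside the histology image stay 0 (background).
    """
    H, W = mri_shape
    ys, xs = np.mgrid[0:H, 0:W]
    coords = np.stack([xs.ravel(), ys.ravel(), np.ones(H * W)])  # 3 x N homogeneous
    hx, hy = (affine @ coords).round().astype(int)               # histology coords
    out = np.zeros(mri_shape, dtype=hist_labels.dtype)
    valid = (0 <= hx) & (hx < hist_labels.shape[1]) & \
            (0 <= hy) & (hy < hist_labels.shape[0])
    out.ravel()[valid] = hist_labels[hy[valid], hx[valid]]
    return out

# Example: histology at 2x the MRI resolution, pure scaling transform
hist = np.arange(16).reshape(4, 4)
affine = np.array([[2.0, 0.0, 0.0],
                   [0.0, 2.0, 0.0]])
mri_labels = map_labels_to_mri(hist, (2, 2), affine)
```

In practice the transform between a histology section and an MRI slice is deformable rather than affine, but the resampling logic — pull coordinates back through the transform and sample labels without interpolation — is the same.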
“… 11 Here, we seek to expand upon this work by focusing on distinguishing aggressive from indolent cancers on MRI using labels derived from automated registration of histopathology and MR images. Unlike prior methods that either use radiologist labels or pathology labels mapped from cognitive alignment (radiologists and pathologists jointly reviewing the MR and histopathology images, without computational alignment), our proposed approach is the first to use automatically detected aggressive and indolent cancers on histopathology images 28 mapped onto MRI to generate labels for aggressive and indolent cancers on MRI.…”
Section: Introduction
confidence: 99%
“…Today, deep learning methods are increasingly being used for biomedical image analysis and are widely available to researchers. Deep learning has already been applied for tumor grading and classification 12,13 and grading osteoarthritis with micro-computed tomography 14 .…”
Section: Introduction
confidence: 99%