Medical Imaging 2020: Digital Pathology (2020)
DOI: 10.1117/12.2548571

A systematic comparison of deep learning strategies for weakly supervised Gleason grading

Cited by 14 publications (9 citation statements)
References: 0 publications
“…The nuclear contours are available in the form of manual annotations only for the PanNuke data. Automated contours of the nuclei in the Camelyon images are obtained by the multi-instance deep segmentation model in Otálora et al. (2020). This model is a Mask R-CNN model (He et al., 2017), fine-tuned from ImageNet weights on the Kumar dataset for the nuclei segmentation task (Kumar et al., 2017).…”
Section: Configuration of the Extra Targets
confidence: 99%
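The excerpt above refers to a Mask R-CNN initialized from ImageNet weights and fine-tuned for nuclei segmentation. A minimal sketch of that kind of setup (not the cited authors' code) is to take the torchvision Mask R-CNN and swap its box and mask heads for a two-class background/nucleus problem; the function name and class count below are illustrative assumptions.

    import torchvision
    from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
    from torchvision.models.detection.mask_rcnn import MaskRCNNPredictor

    def build_nuclei_maskrcnn(num_classes=2):
        # Start from a torchvision Mask R-CNN with a pretrained backbone and
        # replace both heads for a two-class problem: background vs. nucleus.
        model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
        in_features = model.roi_heads.box_predictor.cls_score.in_features
        model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)
        in_channels = model.roi_heads.mask_predictor.conv5_mask.in_channels
        model.roi_heads.mask_predictor = MaskRCNNPredictor(in_channels, 256, num_classes)
        return model

    # Fine-tuning on a nuclei dataset (e.g., Kumar-style instance masks) would then
    # follow the standard torchvision detection training loop.
    model = build_nuclei_maskrcnn()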
“…The proposed method is comparable to works that use strong supervision via pixel-level annotations, i.e., Arvaniti et al. [25] (κ = 0.75 for Gleason scoring) and Bulten et al. [31] (κ = 0.85 for Grade Group scoring), as well as works that use only global biopsy-level labels, i.e., Ström et al. [30] (κ = 0.91 for Grade Group scoring) and Otálora et al. [38] (κ = 0.44 for Grade Group scoring). In accordance with the observations in our work, methods based on the proportion of Gleason grades in the tissue suffer a performance drop on external datasets (κ = 0.72 in Bulten et al.). Representative examples of the results obtained with the CAD system on the external SICAP dataset are presented in Fig.…”
Section: Grading of Local Patterns
confidence: 99%
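The κ values quoted above are Cohen's kappa agreement scores; Gleason Grade Group comparisons are commonly reported with quadratic weighting, although the exact weighting used by each cited work should be checked in the original papers. A minimal scikit-learn sketch with made-up labels:

    from sklearn.metrics import cohen_kappa_score

    # Hypothetical per-biopsy labels: 0 = benign, 1-5 = ISUP Grade Groups.
    reference = [0, 1, 2, 3, 5, 4, 2, 1]
    predicted = [0, 1, 3, 3, 5, 3, 2, 1]

    # Quadratic weighting penalizes distant grade disagreements more than adjacent ones.
    kappa = cohen_kappa_score(reference, predicted, weights="quadratic")
    print(f"quadratic-weighted kappa: {kappa:.2f}")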
“…Deep learning models (e.g., CNNs) form a specific approach that is applied directly to images to extract and select features and to predict a class (classification) or a value (regression) in an automated fashion. Examples in the PCa literature have shown that this deep learning approach can detect malignant lesions [125], predict the GS [126], and segment the ROI [127, 128].…”
Section: Radiomics Pipeline for Predicting Tumor Grade
confidence: 99%
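As an illustration of the patch-level classification approach described in the excerpt above (a generic sketch, not the pipeline of any cited work), an ImageNet-pretrained CNN can be fine-tuned to map tissue patches to grade classes. The model choice, the four-class setup, and the dummy training step below are assumptions made for the example.

    import torch
    import torch.nn as nn
    import torchvision

    # ImageNet-pretrained ResNet-18 with the final layer replaced for four
    # hypothetical patch classes (benign, Gleason 3, Gleason 4, Gleason 5).
    model = torchvision.models.resnet18(weights="DEFAULT")
    model.fc = nn.Linear(model.fc.in_features, 4)

    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

    # One training step on a dummy batch of 224x224 RGB tissue patches.
    patches = torch.randn(8, 3, 224, 224)
    labels = torch.randint(0, 4, (8,))
    optimizer.zero_grad()
    loss = criterion(model(patches), labels)
    loss.backward()
    optimizer.step()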