2019
DOI: 10.1007/978-3-030-32239-7_65

PFA-ScanNet: Pyramidal Feature Aggregation with Synergistic Learning for Breast Cancer Metastasis Analysis

Abstract: Automatic detection of cancer metastasis from whole slide images (WSIs) is a crucial step for following patient staging and prognosis. Recent convolutional neural network based approaches are struggling with the trade-off between accuracy and computational efficiency due to the difficulty in processing large-scale gigapixel WSIs. To meet this challenge, we propose a novel Pyramidal Feature Aggregation ScanNet (PFA-ScanNet) for robust and fast analysis of breast cancer metastasis. Our method mainly benefits fro…
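
For readers who want a concrete picture of the multi-scale idea the abstract alludes to, below is a minimal PyTorch-style sketch that aggregates feature maps from several pyramid levels into one dense prediction map. The class name, channel widths, and top-down fusion scheme are illustrative assumptions and do not reproduce the actual PFA-ScanNet architecture.

# Minimal sketch of multi-scale (pyramidal) feature aggregation.
# The fusion scheme, channel sizes, and module names are illustrative
# assumptions; they do not reproduce the exact PFA-ScanNet design.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PyramidalAggregator(nn.Module):
    def __init__(self, in_channels=(1024, 512, 256), out_channels=128):
        super().__init__()
        # 1x1 convs project each pyramid level to a common channel width.
        self.lateral = nn.ModuleList(
            [nn.Conv2d(c, out_channels, kernel_size=1) for c in in_channels]
        )
        self.classify = nn.Conv2d(out_channels, 2, kernel_size=1)  # tumor vs. normal

    def forward(self, feats):
        # feats: feature maps ordered coarse (low resolution) to fine (high resolution).
        fused = self.lateral[0](feats[0])
        for lat, f in zip(self.lateral[1:], feats[1:]):
            fused = F.interpolate(fused, size=f.shape[-2:], mode="bilinear",
                                  align_corners=False)
            fused = fused + lat(f)          # top-down aggregation across scales
        return self.classify(fused)         # dense prediction map

# Example with dummy feature maps at three pyramid levels.
feats = [torch.randn(1, 1024, 16, 16),
         torch.randn(1, 512, 32, 32),
         torch.randn(1, 256, 64, 64)]
print(PyramidalAggregator()(feats).shape)   # torch.Size([1, 2, 64, 64])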

Cited by 19 publications (15 citation statements)
References 14 publications (28 reference statements)

“…In addition, most of the methods took hundreds of minutes to run, which created a barrier to clinical application. To improve computational efficiency, Kong et al (111) and Zhao et al (112) used transfer learning to accelerate model convergence, reducing the review time for a single WSI to 5.6 and 7.2 min, respectively. Afterwards, Campanella et al (113) trained a weakly supervised learning model on 44,732 whole-slide scanned images, avoiding extensive manual annotation, and obtained an AUC value of 0.965 in a test identifying axillary lymph node metastases in breast cancer.…”
Section: Organized the Cancer Metastases In Lymph
Mentioning (confidence: 99%)
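
As a rough illustration of the transfer-learning strategy mentioned in the statement above, the sketch below fine-tunes only a new classification head on top of an ImageNet-pretrained backbone. The backbone choice (resnet18), the freezing policy, and the two-class head are assumptions made for illustration, not the setups of Kong et al or Zhao et al.

# Minimal sketch of transfer learning for patch classification: start from
# ImageNet-pretrained weights and train only a new head to speed up
# convergence. Backbone and freezing policy are illustrative assumptions.
import torch
import torchvision

model = torchvision.models.resnet18(weights="IMAGENET1K_V1")  # downloads pretrained weights
for p in model.parameters():                                  # freeze the pretrained backbone
    p.requires_grad = False
model.fc = torch.nn.Linear(model.fc.in_features, 2)           # new tumor/normal head (trainable)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
logits = model(torch.randn(4, 3, 224, 224))                   # a batch of WSI patches
print(logits.shape)                                           # torch.Size([4, 2])
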
“…This structure ensures successful segmentation results by preserving local information. In digital pathology, previous studies have used the U-Net structure to segment various cancers, such as in breast [14,15,16,17,18], colon [19,20,21,22,23], lung [24,25], and prostate tissue [26,27]. Furthermore, the structure is also applied to localized tissues such as the nuclei [24,28,29,30,31,32], cells [33,34] and glands [19,20,21,22,23], which exhibit major pathological characteristics.…”
Section: Related Work
Mentioning (confidence: 99%)
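
The U-Net structure referenced above pairs an encoder-decoder with skip connections that carry local detail into the decoder. Below is a minimal, heavily simplified sketch of that idea; the depth, channel widths, and layer choices are illustrative assumptions rather than any cited paper's configuration.

# Minimal sketch of the U-Net idea: an encoder-decoder where a skip
# connection concatenates encoder features into the decoder to preserve
# local detail. Depth and channel widths are illustrative assumptions.
import torch
import torch.nn as nn

def conv_block(c_in, c_out):
    return nn.Sequential(nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(inplace=True),
                         nn.Conv2d(c_out, c_out, 3, padding=1), nn.ReLU(inplace=True))

class TinyUNet(nn.Module):
    def __init__(self, n_classes=2):
        super().__init__()
        self.enc = conv_block(3, 32)
        self.down = nn.MaxPool2d(2)
        self.bottleneck = conv_block(32, 64)
        self.up = nn.ConvTranspose2d(64, 32, kernel_size=2, stride=2)
        self.dec = conv_block(64, 32)          # 64 = upsampled 32 + skip 32
        self.head = nn.Conv2d(32, n_classes, 1)

    def forward(self, x):
        e = self.enc(x)                        # encoder features (skip source)
        b = self.bottleneck(self.down(e))
        d = self.dec(torch.cat([self.up(b), e], dim=1))  # skip connection
        return self.head(d)

print(TinyUNet()(torch.randn(1, 3, 128, 128)).shape)  # torch.Size([1, 2, 128, 128])
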
“…The grand challenge CAMELYON17 has 37 algorithms predicting at the WSI level from 899 learnable WSIs; the top performance came from a GoogleNet reproduced with multi-scale inputs and color normalization ([20]), which has a relatively shallow architecture and a relatively longer training phase than the others. After patch-level computation with a CNN, this top method used conventional machine learning and feature engineering to classify WSIs; it was surpassed by PFA-ScanNet, which uses more scales to extract features ([21]), and then by a novel attention-based classifier with a shallower siamese MI-FCN architecture ([22]). Given the limited learnable data at the patch level, weakly supervised learning was employed earlier for histopathological WSI-level prediction by [23]; its authors developed a multiple instance learning (MIL)-based method for clinical application and achieved better performance than pathologists on a huge dataset of 44,732 WSIs.…”
Section: Related Work
Mentioning (confidence: 99%)
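
The multiple-instance view of a slide described above treats a WSI as a bag of patch embeddings and learns how much each patch contributes to the slide-level prediction. The sketch below shows generic attention-based MIL pooling; the dimensions and module names are assumptions, and it is not the MI-FCN or MIL models of [22] and [23].

# Minimal sketch of attention-based multiple instance learning (MIL)
# pooling over patch embeddings for slide-level prediction. Dimensions
# and the attention scheme are illustrative assumptions, not the cited
# papers' exact architectures.
import torch
import torch.nn as nn

class AttentionMIL(nn.Module):
    def __init__(self, feat_dim=512, hidden=128, n_classes=2):
        super().__init__()
        self.attention = nn.Sequential(
            nn.Linear(feat_dim, hidden), nn.Tanh(), nn.Linear(hidden, 1)
        )
        self.classifier = nn.Linear(feat_dim, n_classes)

    def forward(self, patch_feats):
        # patch_feats: (num_patches, feat_dim) embeddings from one WSI ("bag").
        weights = torch.softmax(self.attention(patch_feats), dim=0)  # (N, 1) attention weights
        slide_feat = (weights * patch_feats).sum(dim=0)              # weighted pooling to one vector
        return self.classifier(slide_feat)                           # slide-level logits

# One slide represented as a bag of 1000 patch embeddings.
logits = AttentionMIL()(torch.randn(1000, 512))
print(logits.shape)  # torch.Size([2])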