2018
DOI: 10.1007/978-3-030-00937-3_73
Densely Deep Supervised Networks with Threshold Loss for Cancer Detection in Automated Breast Ultrasound

Cited by 25 publications (20 citation statements)
References 10 publications
“…Ultrasound imaging is a standard modality for many diagnostic and monitoring purposes, and there has been significant research into developing automatic methods for segmentation of ultrasound images [5,11]. U-Net [7] for instance has been shown to be a fast and precise solution for medical image segmentation, and has successfully been adapted to segment ultrasound images too [10,1,8,12]. In this study, we investigate the effect of fine-tuning different layers of a U-Net network for the application of ultrasound image segmentation.…”
Section: Introduction (mentioning; confidence: 99%)
“…We further extensively compared our network with the cutting-edge networks, including 3D FCN, 29 Vanilla Unet, 19 Residual Unet, 30 and our previous breast cancer detection (BCD) model. 17 Figure 8 illustrates the free-response receiver operating characteristic curves (FROCs) of the proposed network and compared models. Table I reports the numerical results of sensitivities, FPs per ABUS volume, IoU, and Cen-Dis for different networks.…”
Section: D. Detection Performance (mentioning; confidence: 99%)
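The FROC curves referenced above plot detection sensitivity against false positives per ABUS volume as the confidence threshold varies. A minimal sketch of how such curve points can be computed from per-volume detection lists (the function name, data layout, and toy numbers here are illustrative, not taken from the cited paper; it also assumes at most one detection per true lesion):

```python
def froc_points(volumes, thresholds):
    """Compute (FPs per volume, sensitivity) pairs for each threshold.

    volumes: list of dicts with
      'n_lesions': number of true lesions in the volume
      'detections': list of (score, is_true_positive) tuples
    """
    n_volumes = len(volumes)
    total_lesions = sum(v["n_lesions"] for v in volumes)
    points = []
    for t in thresholds:
        # Count detections surviving the threshold, split by hit/miss.
        tp = sum(1 for v in volumes
                 for s, hit in v["detections"] if s >= t and hit)
        fp = sum(1 for v in volumes
                 for s, hit in v["detections"] if s >= t and not hit)
        points.append((fp / n_volumes, tp / total_lesions))
    return points

# Toy example: two volumes, three true lesions in total.
vols = [
    {"n_lesions": 2, "detections": [(0.9, True), (0.6, True), (0.4, False)]},
    {"n_lesions": 1, "detections": [(0.8, True), (0.7, False)]},
]
print(froc_points(vols, [0.5, 0.75]))  # → [(0.5, 1.0), (0.0, 0.666...)]
```

Sweeping the threshold from high to low traces the full curve; the sensitivities at fixed FP-per-volume operating points are what tables like the one described above report.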
“…Experimental results show the asymmetric loss outperformed CE loss and DSC loss with respect to detection sensitivity, IoU, and CenDis, which demonstrates the asymmetric loss contributed to the cancer detection. Figure 10 further shows the volume distribution of all cancerous regions, and their corresponding detection sensitivities achieved by BCD model 17 and our proposed network, respectively. It is shown in Fig.…”
Section: D. Detection Performance (mentioning; confidence: 99%)
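The excerpt above credits an asymmetric loss with beating cross-entropy and Dice loss on detection sensitivity. The exact formulation is not given in this excerpt; one common way to make a loss asymmetric, shown here purely as an assumption-labeled sketch, is a binary cross-entropy whose positive (cancer) term is up-weighted by a factor `beta`, so missed lesions cost more than false alarms:

```python
import math

def asymmetric_bce(preds, labels, beta=3.0, eps=1e-7):
    """Binary cross-entropy weighting false negatives more heavily.

    A missed cancer voxel (label 1, low prediction) costs beta times
    as much as a false alarm with the mirror-image probability error.
    beta=1.0 recovers ordinary BCE. This is an illustrative variant,
    not the cited paper's exact loss.
    """
    total = 0.0
    for p, y in zip(preds, labels):
        p = min(max(p, eps), 1.0 - eps)  # clamp for numerical safety
        total += -(beta * y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(preds)

# A missed lesion (y=1, p=0.1) is penalized 3x the mirror-image
# false positive (y=0, p=0.9):
print(asymmetric_bce([0.1], [1]) > asymmetric_bce([0.9], [0]))  # True
```

Biasing the loss this way trades a few extra false positives for higher sensitivity, which matches the direction of improvement the statement describes.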
“…To the best of our knowledge, only two methods have been developed for lesion detection in ABUS images. As the first attempt, Wang et al (2018) utilized a 3-D U-Net basically introduced in (Ronneberger et al 2015) as their backbone architecture and improved its performance by some modifications. Due to limited number of ABUS training samples as well as avoiding over-fitting issue, they used a pre-trained 3-D convolutional network (Tan et al 2015) and fine-tuned the hyper-parameters of the network.…”
Section: Deep Learning Based Approaches (mentioning; confidence: 99%)