2021
DOI: 10.1016/j.compbiomed.2020.104115
A scoping review of transfer learning research on medical image analysis using ImageNet

Cited by 299 publications (179 citation statements)
References 81 publications
“…This would lead to a network with poor performance if the training was simply performed on the small amount of data. Transfer learning [10], which starts with a model pretrained on a large dataset, is a popular method to address this issue, as a better starting point for the model generally leads to better results.…”
Section: Introduction
confidence: 99%
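A minimal sketch of the pretrain-then-continue idea described above, assuming a PyTorch/torchvision setup; the ResNet-50 backbone, the two-class head, and the hyperparameters are illustrative placeholders rather than the cited authors' configuration:

```python
import torch
import torch.nn as nn
from torchvision import models

# Start from a model pretrained on a large dataset (ImageNet) instead of random weights.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)

# Replace the 1000-class ImageNet head with one sized for the (hypothetical) 2-class medical task.
model.fc = nn.Linear(model.fc.in_features, 2)

# Continue training on the small medical dataset from this better starting point.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

dummy_images = torch.randn(4, 3, 224, 224)          # batch of 4 RGB images at ImageNet resolution
dummy_labels = torch.randint(0, 2, (4,))             # placeholder labels
loss = criterion(model(dummy_images), dummy_labels)
loss.backward()
optimizer.step()
```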
“…A fine-tuning approach, like the feature-extractor approach, uses a CNN model well trained on a large dataset such as ImageNet as the base and replaces its top CNN layers with new layers [73,74]. In fine-tuning, instead of freezing the convolution layers of the well-trained CNN model, their weights are updated during the training process [51][52][53].…”
Section: Fine-tuning
confidence: 99%
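The difference between the two variants can be shown in a short sketch, assuming torchvision's VGG-16 as the pretrained base; the backbone choice and class count are assumptions for illustration only:

```python
import torch.nn as nn
from torchvision import models

def build_model(num_classes: int, fine_tune: bool) -> nn.Module:
    """Feature extraction (fine_tune=False) freezes the pretrained convolution layers;
    fine-tuning (fine_tune=True) leaves them trainable so their weights are updated
    during training. num_classes is a placeholder for the target task."""
    model = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)
    if not fine_tune:
        for param in model.features.parameters():   # convolutional backbone
            param.requires_grad = False
    # Replace the final classification layer for the new task in both cases.
    model.classifier[-1] = nn.Linear(model.classifier[-1].in_features, num_classes)
    return model

feature_extractor = build_model(num_classes=3, fine_tune=False)
fine_tuned_model = build_model(num_classes=3, fine_tune=True)
```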
“…The main advantage of the CNN is its accuracy in image recognition; however, it involves a high computational cost and requires a large amount of training data [85]. A CNN generally comprises an input layer, one or more convolution layers, pooling layers, and a fully connected layer [74]. The following are the CNN models most commonly used for transfer learning with breast ultrasound images [84].…”
Section: Convolutional Neural Network
confidence: 99%
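A toy PyTorch module with exactly the components listed above (convolution layers, pooling layers, and a fully connected layer); the layer widths, input size, and class count are illustrative assumptions, not a model from the cited work:

```python
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    """Minimal CNN: convolution layers, pooling layers, and a fully connected layer."""
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),   # convolution layer
            nn.ReLU(),
            nn.MaxPool2d(2),                               # pooling layer
            nn.Conv2d(16, 32, kernel_size=3, padding=1),   # convolution layer
            nn.ReLU(),
            nn.MaxPool2d(2),                               # pooling layer
        )
        self.classifier = nn.Linear(32 * 32 * 32, num_classes)  # fully connected layer

    def forward(self, x):
        x = self.features(x)
        return self.classifier(torch.flatten(x, 1))

# Grayscale 128x128 input (e.g. an ultrasound-sized image) -> 32x32 feature maps after two poolings.
logits = SmallCNN()(torch.randn(4, 1, 128, 128))
```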
“…16 We initialized EfficientNet-B4 with the parameter values pretrained on ImageNet and continued training the entire model, not just the last few fully connected layers. 3 Combining and then jointly training a small dataset of interest with a larger auxiliary dataset often helps prediction accuracy. 17,18 For the auxiliary dataset, we downloaded the publicly available SIIM-ISIC Melanoma Classification Challenge Dataset from 2018 to 2020.…”
Section: Classifier
confidence: 99%
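A hedged sketch of that setup using torchvision's EfficientNet-B4 and ImageNet weights; the class count, optimizer settings, and dataset variable names are assumptions for illustration and not the cited study's actual pipeline:

```python
import torch
from torchvision import models

# Initialize EfficientNet-B4 with ImageNet-pretrained parameters
# (torchvision variant used here; the cited work may use a different implementation).
model = models.efficientnet_b4(weights=models.EfficientNet_B4_Weights.IMAGENET1K_V1)

# Resize the head for the target task (class count is a placeholder).
model.classifier[1] = torch.nn.Linear(model.classifier[1].in_features, 9)

# Train the *entire* model, not just the final fully connected layers:
# every parameter stays trainable and is passed to the optimizer.
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)

# A small dataset of interest can be jointly trained with a larger auxiliary dataset;
# the dataset objects below are hypothetical placeholders.
# joint_dataset = torch.utils.data.ConcatDataset([small_dataset, auxiliary_dataset])
```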
“…For this approach to work well, the pretrained data must either be similar in type to the medical data or the size of the medical dataset must be relatively large. 3 Due to these limitations, neural network applications in healthcare have focused on relatively common conditions, where sufficiently large datasets are more readily collected. Genetic conditions, though common in aggregate, are largely individually rare.…”
Section: Introduction
confidence: 99%