2022
DOI: 10.1002/eng2.12595

Dynamic fine‐tuning layer selection using Kullback–Leibler divergence

Abstract: The selection of layers in the transfer‐learning fine‐tuning process ensures a pre‐trained model's accuracy and adaptation in a new target domain. However, the selection process is still manual and lacks clearly defined criteria. If the wrong layers of a neural network are selected and used, the result can be poor accuracy and generalization in the target domain. This paper introduces the use of Kullback–Leibler divergence on the weight correlations of the model's convolutional neural network layers. Th…
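The abstract above is truncated, so the paper's exact criterion is not fully visible here. As a rough illustration only, the sketch below shows one plausible reading in PyTorch: compare each convolutional layer's weight distribution before and after a brief warm-up pass on the target data, and rank layers by the Kullback–Leibler divergence between the two. The helper names (weight_histogram, kl_divergence, rank_layers_by_kl) and the histogram-based comparison are assumptions for illustration; the paper itself computes the divergence on weight correlations, which this sketch does not reproduce.

```python
# Hypothetical sketch: rank Conv2d layers for fine-tuning by the KL divergence
# between their weight distributions before and after a short warm-up on target
# data. NOT the paper's implementation, which works on weight correlations.
import torch
import torch.nn as nn


def weight_histogram(weights: torch.Tensor, bins: int = 50) -> torch.Tensor:
    """Normalized histogram of a layer's flattened weights, used as a proxy distribution."""
    hist = torch.histc(weights.detach().flatten().float(), bins=bins)
    hist = hist + 1e-8                     # smooth so log() stays finite everywhere
    return hist / hist.sum()


def kl_divergence(p: torch.Tensor, q: torch.Tensor) -> float:
    """Discrete KL(p || q) for two histograms of equal length."""
    return float(torch.sum(p * (p.log() - q.log())))


def rank_layers_by_kl(pretrained: nn.Module, warmed_up: nn.Module):
    """Score each Conv2d layer by how far its weight distribution moved during warm-up."""
    scores = []
    for (name, before), (_, after) in zip(
        pretrained.named_modules(), warmed_up.named_modules()
    ):
        if isinstance(before, nn.Conv2d):
            p = weight_histogram(before.weight)
            q = weight_histogram(after.weight)
            scores.append((name, kl_divergence(p, q)))
    # Layers whose distributions diverged most are the fine-tuning candidates.
    return sorted(scores, key=lambda s: s[1], reverse=True)
```

In use, the warmed-up model would be a deep copy of the pre-trained network trained briefly on the target set; the top-ranked layer names returned by rank_layers_by_kl would then be left unfrozen for fine-tuning while the remaining layers stay frozen.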

Cited by 3 publications (2 citation statements) · References: 62 publications

Citation statements:

“…This proposed work builds on two previous works by Wanjiku, Nderu & Kimwele (2022). The first work looks into selecting relevant data points using textural features.…”
Section: Related Work
Citation type: mentioning (confidence: 86%)
“…After selecting the best block, it is fully fine-tuned. "DKL" [59] uses the Kullback-Leibler divergence on weight correlations to identify the best layers for fine-tuning. Some techniques use an additional policy network for routing prediction: "Spot-Tune" [23] is based on the view that ResNet is an ensemble of shallow networks [69].…”
Section: Layer Routing
Citation type: mentioning (confidence: 99%)
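The policy-network routing mentioned in the statement above can be illustrated with a rough sketch: a small per-input policy head decides, for each block, whether to run a frozen pre-trained copy or a fine-tuned copy, and the two outputs are blended by the routing weight. This is a heavily simplified reading of the SpotTune-style idea, not the published implementation; RoutedBlock and RoutingPolicy are hypothetical names, and the sigmoid relaxation stands in for the discrete Gumbel-softmax sampling the original method uses.

```python
# Simplified, hypothetical sketch of per-input layer routing with a policy head.
import copy
import torch
import torch.nn as nn


class RoutedBlock(nn.Module):
    """Wraps one pre-trained block with a frozen copy and a tunable copy."""

    def __init__(self, block: nn.Module):
        super().__init__()
        self.frozen = copy.deepcopy(block).requires_grad_(False)  # fixed pre-trained path
        self.tuned = block                                        # path that keeps training

    def forward(self, x: torch.Tensor, route: torch.Tensor) -> torch.Tensor:
        # route holds one value per sample in [0, 1]: 0 -> frozen path, 1 -> tuned path.
        r = route.view(-1, 1, 1, 1)
        return r * self.tuned(x) + (1.0 - r) * self.frozen(x)


class RoutingPolicy(nn.Module):
    """Tiny policy head that predicts one routing value per block from the input image."""

    def __init__(self, in_channels: int, num_blocks: int):
        super().__init__()
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(in_channels, num_blocks),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Sigmoid relaxation instead of discrete Gumbel-softmax sampling.
        return torch.sigmoid(self.head(x))
```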