SemifreddoNets: Partially Frozen Neural Networks for Efficient Computer Vision Systems
2020 · DOI: 10.1007/978-3-030-58583-9_12

Cited by 5 publications (2 citation statements) · References 14 publications
“…to focus on situations with few annotations. We also investigate whether freezing the learned parameters f_Θ is a viable option, as this reduces the training effort and the number of required annotated samples [27]. Since the setting with 2% available annotations is difficult but not impossible to solve if the pretext task provides good representations, we select exemplary results for each pretext task at this annotation rate.…”
Section: Pretext Comparison
confidence: 99%
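The freezing strategy mentioned in this statement is straightforward to reproduce in practice. Below is a minimal sketch, assuming PyTorch; the model, the 512-dimensional feature size, and the toy encoder are illustrative assumptions, not taken from the cited work. It shows the pretrained extractor f_Θ being frozen so that only a small task head is fit to the few available annotations.

```python
import torch
import torch.nn as nn

class FineTuneModel(nn.Module):
    def __init__(self, encoder: nn.Module, num_classes: int):
        super().__init__()
        self.encoder = encoder  # pretrained f_Θ from the pretext task
        self.head = nn.Linear(512, num_classes)  # assumes 512-d features

    def forward(self, x):
        return self.head(self.encoder(x))

# Toy stand-in for a pretrained encoder (illustrative only).
encoder = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 512), nn.ReLU())
model = FineTuneModel(encoder, num_classes=10)

# Freeze f_Θ: disabling gradients reduces the training effort, and the
# smaller trainable parameter count needs fewer annotated samples.
for p in model.encoder.parameters():
    p.requires_grad = False

# Only the head's parameters reach the optimizer.
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)
```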
“…Moreover, Isikdogan et al. [35] introduced a novel approach that freezes specific parameters within each layer, substituting the corresponding multipliers with fixed scalers and implementing the network as optimized full-pipeline hardware blocks. The proposed network organization offers a balance between flexibility and cost, because the weights can be configured at various scales and levels of abstraction, distinguishing it from conventional layer-by-layer freezing techniques.…”
Section: Weight Transfer and Freezing
confidence: 99%
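The distinctive point in this statement is that freezing happens within a layer rather than layer by layer. The sketch below is a software analogue of that idea, assuming PyTorch; it is not the paper's hardware design (which replaces frozen-weight multipliers with fixed scalers), and the `partially_freeze` helper, the random mask, and the layer sizes are hypothetical. A fixed binary mask zeroes the gradients of a chosen fraction of each weight tensor, so those weights keep their pretrained values while the rest stay trainable.

```python
import torch
import torch.nn as nn

def partially_freeze(layer: nn.Linear, frozen_fraction: float = 0.5):
    """Hypothetical helper: block gradient flow to a random
    `frozen_fraction` of the layer's weights via a gradient hook."""
    mask = (torch.rand_like(layer.weight) >= frozen_fraction).float()
    layer.weight.register_hook(lambda grad: grad * mask)
    return mask

layer = nn.Linear(16, 8)
mask = partially_freeze(layer, frozen_fraction=0.5)

out = layer(torch.randn(4, 16)).sum()
out.backward()

# Gradients are exactly zero at the frozen positions, so an optimizer
# step leaves those weights untouched; the remaining weights train.
assert torch.all(layer.weight.grad[mask == 0] == 0)
```

One design consequence, consistent with the statement above: because the frozen/trainable split is a per-element mask rather than a per-layer switch, the trainable capacity can be configured at finer granularity than conventional layer-by-layer freezing allows.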