2023 IEEE/ACM 45th International Conference on Software Engineering (ICSE)
DOI: 10.1109/icse48619.2023.00206
An Empirical Study of Pre-Trained Model Reuse in the Hugging Face Deep Learning Model Registry

Cited by 17 publications (1 citation statement)
References 50 publications
“…Increasing the size of the training dataset has limited benefits to the performance of trained models. Jiang et al [46] conducted the first study of pre-trained model reuse. Concretely, they interviewed practitioners from Hugging Face and identified the challenges of pre-trained model reuse, e.g., missing attributes, discrepancies, and model risks.…”
Section: Empirical Study on ML4Code (mentioning, confidence: 99%)
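The “missing attributes” challenge the statement cites can be made concrete with a short sketch. The following is an illustration, not the study's own tooling: it uses the huggingface_hub ModelCard API to check whether a model card documents attributes a practitioner might want to verify before reuse. The repo id and the attribute list are assumptions chosen for the example.

# Minimal sketch, assuming the `huggingface_hub` package is installed.
# Checks a model card for attributes whose absence the cited study
# flags as a reuse challenge. Repo id and attribute list are illustrative.
from huggingface_hub import ModelCard

REPO_ID = "bert-base-uncased"  # example model, not from the study

card = ModelCard.load(REPO_ID)    # fetches the model's README.md from the Hub
metadata = card.data.to_dict()    # YAML front-matter as a plain dict

# Attributes a practitioner might require before reusing a model.
required = ["license", "datasets", "language"]
missing = [key for key in required if not metadata.get(key)]

if missing:
    print(f"{REPO_ID} is missing attributes: {missing}")
else:
    print(f"{REPO_ID} documents all required attributes.")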