2022
DOI: 10.1007/978-3-031-19830-4_17
Not All Models Are Equal: Predicting Model Transferability in a Self-challenging Fisher Space

Cited by 7 publications (2 citation statements)
References 27 publications
“…Model Similarity-based Methods:
DSE (Vu et al., 2020): ϕ(x), ψ(x) ✓ ✗ ✗
DDS (Dwivedi et al., 2020): ϕ(x), ψ(x) ✓ ✗ ✗
Training-free Methods:
MSC (Meiseles and Rokach, 2020): ϕ(x), y ✗ ✗ ✓
kNN (Puigcerver et al., 2021): ϕ(x), y ✗ ✗ ✓
PARC (Bolya et al., 2021): ϕ(x), y ✗ ✗ ✓
GBC: ϕ(x), y ✗ ✗ ✓
Logistic (Kumari et al., 2022): ϕ(x), y ✗ ✓ ✓
H-score (Bao et al., 2019): ϕ(x), y ✗ ✗ ✓
Reg. H-score (Ibrahim et al., 2022): ϕ(x), y ✗ ✗ ✓
N LEEP: ϕ(x), y ✗ ✗ ✓
TransRate (Huang et al., 2022): ϕ(x), y ✗ ✗ ✓
LogME (You et al., 2021): ϕ(x), y ✗ ✓ ✓
SFDA (Shao et al., 2022): ϕ(x), y ✗ ✓ ✓
PACTran (Ding et al., 2022): ϕ(x), y ✗ ✓ ✓
… target classes by LR's test accuracy.…”
Section: Free of Training
mentioning, confidence: 99%
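Most of the training-free scores in the snippet above consume only frozen features ϕ(x) and target labels y. As one hedged illustration, the minimal NumPy sketch below computes an H-score-style quantity (Bao et al., 2019), i.e. the trace of the (pseudo-)inverse feature covariance times the inter-class covariance; the function name and the use of a pseudo-inverse are our choices, and the regularized variant (Ibrahim et al., 2022) would additionally shrink the covariance estimate.

```python
import numpy as np

def h_score(features, labels):
    """Sketch of an H-score-style transferability estimate.

    features: (n, d) array of frozen pre-trained embeddings phi(x)
    labels:   (n,)   array of target-task labels y
    Returns a scalar; a higher value suggests better transferability.
    """
    # Center the features so covariances are well defined.
    features = features - features.mean(axis=0, keepdims=True)
    cov_f = np.cov(features, rowvar=False)        # overall feature covariance

    # Replace each sample by its class-conditional mean E[phi(x) | y].
    class_means = np.zeros_like(features)
    for c in np.unique(labels):
        idx = labels == c
        class_means[idx] = features[idx].mean(axis=0)
    cov_g = np.cov(class_means, rowvar=False)     # inter-class covariance

    # Pseudo-inverse guards against a singular covariance estimate.
    return float(np.trace(np.linalg.pinv(cov_f) @ cov_g))
```

Ranking several candidate checkpoints then amounts to extracting features from each frozen model on the target data, computing this score, and sorting the models by it.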
“…There are also metrics that take fine-tuning dynamics into account. SFDA (Shao et al., 2022) simulates these dynamics by projecting the pre-trained features with Fisher Discriminant Analysis (FDA) to increase class separability. It then approximates the log-likelihood via Bayes classification over the projected features, and adds a self-challenging module to further measure the ability of pre-trained models on hard samples.…”
Section: Free of Training
mentioning, confidence: 99%
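As a rough illustration of the projection-plus-Bayes idea described in the statement above, the sketch below uses scikit-learn's LinearDiscriminantAnalysis, which applies a Fisher-style discriminant projection and fits a Gaussian Bayes classifier on top. The synthetic features, the n_components setting, and the use of the mean true-class log posterior as a score are simplifying assumptions of ours; the actual SFDA metric further regularizes the features and adds the self-challenging module for hard samples.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical stand-in for features extracted by a frozen pre-trained model.
features, labels = make_classification(
    n_samples=600, n_features=64, n_informative=16, n_classes=4, random_state=0
)

# Project the features with (Fisher) discriminant analysis to increase class
# separability; the same fitted model provides a Gaussian Bayes classifier
# over that projected space.
fda = LinearDiscriminantAnalysis(n_components=3)   # at most n_classes - 1 components
projected = fda.fit_transform(features, labels)
log_posterior = fda.predict_log_proba(features)    # per-class log posterior

# Crude transferability proxy: mean log posterior of the true class.
score = float(np.mean(log_posterior[np.arange(len(labels)), labels]))
print(f"separability-based score: {score:.3f}")
```

A model whose frozen features already separate the target classes well will obtain a higher score here, which is the intuition the cited statement attributes to SFDA's FDA projection step.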