2021 Data Compression Conference (DCC)
DOI: 10.1109/dcc50243.2021.00045
Research on Knowledge Distillation of Generative Adversarial Networks

Cited by 43 publications (59 citation statements)
References 0 publications
“…Wang et al. [121] and Xu et al. [122] combine KD with GANs. Uijlings et al. [123] revisited knowledge transfer, proposing that a set of source classes with bounding-box annotations be used to train object detectors on target classes that have only weakly supervised training images.…”
Section: E. Other KD Methods (mentioning)
confidence: 99%
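To make the KD-plus-GAN combination mentioned above concrete, here is a minimal PyTorch sketch of adversarial knowledge distillation: a discriminator tries to tell teacher logits from student logits, and the student is trained to fool it. The names (`LogitDiscriminator`, `adversarial_kd_step`) and dimensions are illustrative assumptions, not the exact method of the cited papers.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LogitDiscriminator(nn.Module):
    """Hypothetical discriminator: scores whether a logit vector
    came from the teacher (label 1) or the student (label 0)."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_classes, 64), nn.LeakyReLU(0.2),
            nn.Linear(64, 1),
        )

    def forward(self, logits):
        return self.net(logits)

def adversarial_kd_step(teacher, student, D, x, opt_s, opt_d):
    """One adversarial distillation step (sketch); in practice this
    is combined with an ordinary task loss on ground-truth labels."""
    with torch.no_grad():
        t_logits = teacher(x)  # teacher output, held fixed

    s_logits = student(x)

    # Discriminator update: separate teacher logits from student logits.
    d_real = D(t_logits)
    d_fake = D(s_logits.detach())
    loss_d = (
        F.binary_cross_entropy_with_logits(d_real, torch.ones_like(d_real))
        + F.binary_cross_entropy_with_logits(d_fake, torch.zeros_like(d_fake))
    )
    opt_d.zero_grad()
    loss_d.backward()
    opt_d.step()

    # Student update: produce logits the discriminator accepts as "teacher".
    d_fake = D(s_logits)
    loss_s = F.binary_cross_entropy_with_logits(d_fake, torch.ones_like(d_fake))
    opt_s.zero_grad()
    loss_s.backward()
    opt_s.step()
    return loss_d.item(), loss_s.item()
```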
“…An intuitive solution is to use the Kullback-Leibler divergence or an ℓp-loss when the knowledge lies in the soft logits [12,20] or intermediate representations [29,40]. Beyond that, Wang et al. [35] utilized the adversarial training scheme of generative adversarial networks (GANs) [6] to transfer knowledge. Jang et al. [16] considered meta-learning to selectively transfer knowledge.…”
Section: Related Work (mentioning)
confidence: 99%
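The two loss choices named in this excerpt can be written down directly. A minimal PyTorch sketch, assuming teacher and student logits (and intermediate features) of matching shape; the temperature `T=4.0` and exponent `p=2` are illustrative defaults, not values from the cited works:

```python
import torch.nn.functional as F

def kd_loss(s_logits, t_logits, T=4.0):
    """KL divergence between temperature-softened teacher and student
    distributions (soft-logit knowledge, as in Hinton-style KD)."""
    return F.kl_div(
        F.log_softmax(s_logits / T, dim=1),
        F.softmax(t_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradients are comparable across temperatures

def feature_loss(s_feat, t_feat, p=2):
    """Mean elementwise |difference|^p between intermediate
    representations (hint/feature-based knowledge)."""
    return (s_feat - t_feat).abs().pow(p).mean()
```

In practice both terms are added to the standard cross-entropy on ground-truth labels, e.g. `loss = ce + alpha * kd_loss(...) + beta * feature_loss(...)`, with the weights tuned per task.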
“…They validated their findings on the MNIST dataset, Google's JFT dataset, and speech recognition tasks. Since then, knowledge distillation has progressed considerably, and adversarial methods [17,18] have also been utilized to model knowledge transfer between teacher and student. Following this study, extensive research has been conducted on knowledge distillation.…”
Section: Related Study (mentioning)
confidence: 99%