2024
DOI: 10.1109/tcbb.2023.3272333

Explainable Knowledge Distillation for On-Device Chest X-Ray Classification

Abstract: Automated multi-label chest X-ray (CXR) image classification has achieved substantial progress in clinical diagnosis by utilizing sophisticated deep learning approaches. However, most deep models have high computational demands, which makes them less feasible for compact devices with limited computational resources. To overcome this problem, we propose a knowledge distillation (KD) strategy to create a compact deep learning model for real-time multi-label CXR image classification. We study different alt…
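
As a rough illustration of the knowledge distillation idea summarized in the abstract (a generic sketch, not the authors' exact method), the PyTorch snippet below distills a large multi-label classifier into a compact student by combining a hard-label loss with a soft-target loss on temperature-scaled teacher outputs. The temperature, loss weighting, and model/loop names are all illustrative assumptions.

# Minimal knowledge-distillation sketch for multi-label classification (PyTorch).
# Illustrative only: the temperature T, weighting alpha, and training-loop names
# are assumptions, not the configuration reported in the paper.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets, T=2.0, alpha=0.5):
    # Hard loss: student vs. ground-truth multi-label targets.
    hard = F.binary_cross_entropy_with_logits(student_logits, targets)
    # Soft loss: student mimics the teacher's temperature-softened sigmoid outputs.
    soft_teacher = torch.sigmoid(teacher_logits / T)
    soft = F.binary_cross_entropy_with_logits(student_logits / T, soft_teacher)
    # T*T rescales soft-target gradients, as in standard distillation practice.
    return alpha * hard + (1.0 - alpha) * (T * T) * soft

# Usage sketch: the teacher is frozen; only the student is updated.
# for images, targets in loader:
#     with torch.no_grad():
#         t_logits = teacher(images)
#     loss = distillation_loss(student(images), t_logits, targets)
#     loss.backward(); optimizer.step(); optimizer.zero_grad()

Sigmoid-based soft targets (rather than a softmax) fit the multi-label setting, where each pathology is an independent binary decision.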

Cited by 6 publications (1 citation statement). References 42 publications.
“…To achieve this goal, we studied the sensitivity to a dataset imbalance of the following contemporary neural networks: Xception, ViT-384 [32], ViT-224, VGG19, ResNet34 [33], ResNet50, ResNet101 [34], Inception_v3, DenseNet201 [35], DenseNet161 [36], and DeiT [37]. Different imbalance reduction techniques and their ensembles were used to determine this sensitivity.…”
Section: Introduction (mentioning)
confidence: 99%
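
For context on the imbalance-reduction techniques the citing study mentions, below is one generic example (an illustrative assumption, not the study's actual setup): re-weighting a multi-label BCE loss by each class's negative-to-positive ratio, which up-weights rare findings.

# Sketch of one common imbalance-reduction technique: per-class positive weighting.
# The weighting scheme and variable names are illustrative assumptions.
import torch
import torch.nn as nn

def make_weighted_bce(label_matrix):
    # label_matrix: (N, C) 0/1 tensor of training labels.
    pos = label_matrix.sum(dim=0).clamp(min=1)  # positives per class
    neg = label_matrix.shape[0] - pos           # negatives per class
    # pos_weight > 1 for rare classes, boosting their positive-example loss.
    return nn.BCEWithLogitsLoss(pos_weight=neg / pos)

# Usage sketch:
# criterion = make_weighted_bce(train_labels.float())
# loss = criterion(model(images), targets)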