2023
DOI: 10.1109/jiot.2022.3229374
Efficient Federated Learning for AIoT Applications Using Knowledge Distillation

Cited by 14 publications (2 citation statements) · References 15 publications
“…Another optimization approach is knowledge distillation, which transfers knowledge from a complex to a simpler model and has been used successfully for many models and applications [22], [23]. These model optimizations are useful to enable the use of ML in IoT devices; however, in most cases, the energy per inference still limits the number of inferences that can be made.…”
Section: A. Machine Learning Optimizations (mentioning)
confidence: 99%
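
For context, the knowledge distillation referenced in that statement typically trains the smaller model to match the larger model's softened output probabilities while also fitting the true labels. Below is a minimal sketch, assuming PyTorch; the model sizes, temperature T, and loss weight alpha are illustrative assumptions, not values from the cited paper:

```python
# Minimal knowledge-distillation sketch (illustrative, not the cited paper's
# setup): a small "student" learns to match a larger "teacher" model's
# temperature-softened class probabilities plus the ground-truth labels.
import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Sequential(nn.Linear(32, 256), nn.ReLU(), nn.Linear(256, 10))
student = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 10))

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft term: KL divergence between student and teacher distributions at
    # temperature T; scaled by T*T so gradients stay comparable across T.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)  # usual supervised term
    return alpha * soft + (1 - alpha) * hard

x = torch.randn(8, 32)               # dummy batch of 8 feature vectors
y = torch.randint(0, 10, (8,))       # dummy class labels
with torch.no_grad():
    t_logits = teacher(x)            # teacher stays frozen during distillation
loss = distillation_loss(student(x), t_logits, y)
loss.backward()
```

The appeal for IoT, as the citing paper notes, is that only the small student runs on the device; the energy cost of the teacher is paid once at training time.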
“…This architecture is shown in Figure 1. The paper [7] demonstrates an approach that uses Knowledge Distillation (KD). In this approach, devices train local models on their own data and send to the server not only the model gradients but also soft targets, i.e., the predicted probabilities for each class.…”
Section: Centralized Architecture (unclassified)
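
A hedged sketch of the client step that statement describes, again assuming PyTorch; the function name client_update, the shared probe batch on which soft targets are computed, and all hyperparameters are assumptions for illustration, not the cited paper's actual protocol:

```python
# Federated-KD client sketch (hypothetical names, not the cited paper's API):
# each device trains its local model on private data, then uploads both its
# model update and per-class soft targets computed on a shared probe batch.
import torch
import torch.nn as nn
import torch.nn.functional as F

def client_update(model, data, labels, probe_batch, T=2.0, lr=0.01, epochs=1):
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(epochs):                          # local training step(s)
        opt.zero_grad()
        F.cross_entropy(model(data), labels).backward()
        opt.step()
    with torch.no_grad():
        # Soft targets: class-probability predictions on the probe batch,
        # sent to the server alongside the local model update.
        soft_targets = F.softmax(model(probe_batch) / T, dim=1)
    return model.state_dict(), soft_targets

model = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 10))
weights, soft = client_update(
    model,
    data=torch.randn(64, 32),                 # device's private samples
    labels=torch.randint(0, 10, (64,)),       # device's private labels
    probe_batch=torch.randn(16, 32),          # batch shared across clients
)
```

In such schemes the server can aggregate the soft targets (e.g., average them) and distill them into the global model, which keeps the per-round payload small compared with exchanging full model weights.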