2022
DOI: 10.1109/ojsscs.2022.3219034
Energy-Efficient DNN Training Processors on Micro-AI Systems

Abstract: Many edge/mobile devices can now run deep neural networks (DNNs) thanks to the development of mobile DNN accelerators. These accelerators overcame the constraints of limited computing resources and battery capacity by realizing energy-efficient inference. However, inference-only operation is passive, which makes it difficult for a DNN to actively adapt to individual users or its service environment. On-chip training is therefore becoming increasingly important to enable active interaction between the DNN process…

Cited by 5 publications
References 68 publications