Proceedings of the 5th International Workshop on Embedded and Mobile Deep Learning 2021
DOI: 10.1145/3469116.3470015
Enabling Binary Neural Network Training on the Edge

Cited by 14 publications (2 citation statements). References 19 publications.
“…Another training direction for low-memory BNNs suitable for on-edge devices is proposed in [36], which introduces low-memory, low-energy training. J. Laydevant et al. [38] train BNNs using Equilibrium Propagation (EP), which enables on-chip training.…”
Section: Other Training Methods
confidence: 99%
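The snippet above refers to low-memory BNN training, where weights are constrained to {-1, +1} in the forward pass while gradients update latent real-valued weights. A minimal NumPy sketch of that core mechanism follows; it is a generic illustration, not the cited papers' implementation, and the names `binarize` and `ste_grad` are hypothetical.

```python
import numpy as np

def binarize(w):
    """Sign binarization to {-1, +1}; zeros map to +1."""
    return np.where(w >= 0, 1.0, -1.0)

def ste_grad(w, upstream_grad, clip=1.0):
    """Straight-through estimator: the sign function has zero gradient
    almost everywhere, so the backward pass passes the upstream gradient
    through unchanged where the latent weight lies in [-clip, clip]."""
    return upstream_grad * (np.abs(w) <= clip)

# Latent real-valued weights, their binary forward-pass values,
# and the gradient routed back to the latent weights.
w = np.array([0.3, -0.7, 1.5, -0.1])
wb = binarize(w)                    # [ 1., -1.,  1., -1.]
gw = ste_grad(w, np.ones_like(w))   # [ 1.,  1.,  0.,  1.]
```

The memory saving comes from the forward pass only ever needing the 1-bit `wb`; only the optimizer state for the latent `w` remains full precision.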
“…In addition, this work does not consider data formats other than floating-point and fixed-point. Other research has shown good potential for block floating-point [26], [39]–[42], posits [43]–[45], logarithmic number systems [46]–[49], and even binary formats [50]. The investigation of DNN training with mixed data formats, including these types and especially block minifloats [41], is left for future work.…”
Section: Limitations
confidence: 99%
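Of the formats named in the excerpt above, block floating-point is the simplest to illustrate: a block of values shares one power-of-two exponent and each element keeps only a short fixed-point mantissa. The sketch below is a hypothetical illustration of that idea, not any cited paper's scheme; mantissa overflow at the block maximum is ignored for clarity.

```python
import numpy as np

def to_bfp(x, mantissa_bits=8):
    """Quantize a block to a shared-exponent (block floating-point) form:
    one exponent for the whole block, fixed-point mantissas per element."""
    max_abs = np.max(np.abs(x))
    if max_abs == 0:
        return x.copy()
    shared_exp = np.floor(np.log2(max_abs))          # one exponent per block
    scale = 2.0 ** (shared_exp - (mantissa_bits - 1))
    return np.round(x / scale) * scale               # snap to mantissa grid

# Values expressible on the shared grid survive exactly; values far
# below the block maximum lose precision or underflow to zero.
to_bfp(np.array([0.75, 0.5, 0.001]))
```

Because the exponent is shared, elements much smaller than the block maximum round toward zero, which is the usual accuracy/width trade-off for these formats.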