2021
DOI: 10.3390/electronics10161912
Best Practices for the Deployment of Edge Inference: The Conclusions to Start Designing

Abstract: The number of Artificial Intelligence (AI) and Machine Learning (ML) designs is rapidly increasing, raising questions about how to start an AI design for edge systems, which steps to follow, and which pieces are critical for optimal performance. The complete development flow undergoes two distinct phases: training and inference. During training, all the weights are calculated through optimization and backpropagation of the network. The training phase is executed with the use of…
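
As a concrete illustration of the two phases named in the abstract (a hypothetical sketch, not taken from the paper), the snippet below first fits the weights of a toy network via backpropagation and then runs inference with those weights frozen; PyTorch is used purely for illustration.

```python
# Minimal sketch of the two development phases: training (weights are
# optimized via backpropagation) and inference (frozen weights, forward
# passes only). Toy data and model; PyTorch used only for illustration.
import torch
import torch.nn as nn

model = nn.Linear(8, 1)                                  # toy one-layer network
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()
x, y = torch.randn(64, 8), torch.randn(64, 1)            # dummy training data

# Training phase: weights are calculated through optimization and backpropagation.
model.train()
for _ in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()                                      # backpropagate gradients
    optimizer.step()                                     # update the weights

# Inference phase: the trained weights are only read, never updated.
model.eval()
with torch.no_grad():
    prediction = model(torch.randn(1, 8))
```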

Cited by 6 publications (2 citation statements). References 88 publications.

Citation statements:
“…The larger the DL model the more parameters it will have, and consequently the more memory space (in RAM) required to host the model. Model size or memory footprint is computed having 'MB' as their unit of measurement [243,244,245,246,247]. For a specific image classification problem, if MobileNet V2 with 3.54 million parameters is selected, it will have 14 MB as model size whereas if InceptionV4 with 42.74 million parameters is selected for the same problem, it will have a 163 MB model size requirement [242].…”
Section: H. Memory Footprint / Model Size (mentioning)
Confidence: 99%
“…As an EDGE server usually has limited infrastructure resources, it becomes challenging to host a DNN model because of the computational requirements (the bigger the network, the more parameters it will have and each extra parameter increases the memory requirement (in RAM)). Model size or memory footprint is computed having 'MB' as their unit of measurement [35,61,139,167,260]. For the image classification problem, if MobileNet V2 with 3.54 million parameters is selected it will have 14 MB as model size whereas if InceptionV4 with 42.74 million parameters is selected for the same problem it will have a 163 MB model size requirement [133].…”
Section: Memory Footprint / Model Size (mentioning)
Confidence: 99%
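
Both quoted statements derive a model's memory footprint directly from its parameter count. The following is a minimal sketch of that back-of-the-envelope estimate (not from any of the cited papers), assuming parameters stored as float32 (4 bytes each); actual checkpoint sizes also depend on the serialization format, and the quoted 163 MB for InceptionV4 matches the binary (2^20-byte) megabyte convention.

```python
# Rough model-size estimate from parameter count, as discussed in the quoted
# citation statements. Assumption: every parameter is a 4-byte float32.

def model_size_mb(num_params: float, bytes_per_param: int = 4,
                  binary: bool = False) -> float:
    """Approximate model memory footprint in megabytes (10**6 or 2**20 bytes)."""
    divisor = 2**20 if binary else 10**6
    return num_params * bytes_per_param / divisor

# Parameter counts quoted above.
for name, params in [("MobileNet V2", 3.54e6), ("InceptionV4", 42.74e6)]:
    print(f"{name}: ~{model_size_mb(params):.0f} MB "
          f"(~{model_size_mb(params, binary=True):.0f} MiB) at float32")
# MobileNet V2: ~14 MB (~14 MiB); InceptionV4: ~171 MB (~163 MiB) -- in line
# with the 14 MB and 163 MB figures cited above.
```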