2020
DOI: 10.1007/978-981-15-0146-3_123
Implementing Machine Learning on Edge Devices with Limited Working Memory

Cited by 4 publications (2 citation statements)
References 7 publications
“…Edge computing introduces a new computation layer physically closer to end-users that tries to overcome the problems of cloud computing, such as communication latencies, network congestion, and security issues (Harish et al., 2020) in applications of computer vision, speech recognition, natural language processing, or weather forecasting. In such intelligent applications, communication and processing are the major components, but power consumption is the most limiting factor in edge implementations that support AI processing techniques (Ai et al., 2018; Gloria and Sebastiao, 2021).…”
Section: Introduction (mentioning)
confidence: 99%
“…This represents a great advance, since these new devices have low power consumption, are smaller, and are equipped with structures capable of supporting deep learning models. These new elements are known as Edge AI devices, and they are being used massively in applications such as wearable devices [5], cyber-threat detection [6], implementation of machine learning algorithms on low-memory devices [7], and assistant robots [8].…”
Section: Introduction (mentioning)
confidence: 99%
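
Neither the indexed paper nor the citing statements include code, but as a rough illustration of the low-memory deployment problem they refer to, the sketch below uses TensorFlow Lite post-training quantization to shrink a small Keras classifier so its weights fit more easily in the limited working memory of an edge device. The model architecture, layer sizes, and input shape are hypothetical placeholders, not taken from the cited work, and this is only one common approach rather than the method of the paper itself.

import tensorflow as tf

# Hypothetical tiny classifier standing in for a model that must fit on an
# edge device; the 32-feature input and layer sizes are illustrative only.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(32,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(4, activation="softmax"),
])

# Post-training dynamic-range quantization: weights are stored as 8-bit
# integers, which typically reduces the weight footprint by roughly 4x.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

# The resulting flat buffer can be flashed to a microcontroller and executed
# with a lightweight interpreter such as TensorFlow Lite Micro.
print(f"Quantized model size: {len(tflite_model)} bytes")

Whether this quantization-based route matches what the cited paper does on devices with limited working memory cannot be determined from the citation statements alone; it is shown here only to make the memory constraint concrete.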