Proceedings of the Twenty-Fourth International Conference on Architectural Support for Programming Languages and Operating Systems (ASPLOS), 2019
DOI: 10.1145/3297858.3304011
Intelligence Beyond the Edge

Abstract: Energy-harvesting technology provides a promising platform for future IoT applications. However, since communication is very expensive in these devices, applications will require inference "beyond the edge" to avoid wasting precious energy on pointless communication. We show that application performance is highly sensitive to inference accuracy. Unfortunately, accurate inference requires large amounts of computation and memory, and energy-harvesting systems are severely resource-constrained. Moreover, energy-h…

Cited by 151 publications (21 citation statements)
References 64 publications
“…2.1.2 Edge Inference Runtimes. Intermittent edge inference [21] describes how to optimize edge inference for energy use on edge devices, but focuses on compression and pruning of model layers with a specialized inference runtime. Jupiter [20] orchestrates execution of a task on geographically distributed compute nodes based on a given task graph.…”
Section: Edge Inference
confidence: 99%
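The snippet above mentions compression and pruning of model layers as the main levers for shrinking inference cost. A minimal sketch of one such technique, magnitude-based weight pruning, is shown below; the layer values and sparsity target are hypothetical illustration values, not taken from the cited work.

```python
# Illustrative sketch of magnitude-based weight pruning: zero out the
# smallest-magnitude fraction of a layer's weights so the layer stores
# (and multiplies by) fewer nonzero values. Values are hypothetical.

def prune_weights(weights, sparsity):
    """Return a copy of `weights` with the smallest `sparsity` fraction zeroed."""
    ranked = sorted(abs(w) for w in weights)
    cutoff_index = int(len(ranked) * sparsity)
    threshold = ranked[cutoff_index] if cutoff_index < len(ranked) else float("inf")
    return [w if abs(w) >= threshold else 0.0 for w in weights]

layer = [0.9, -0.05, 0.4, 0.01, -0.7, 0.2]
pruned = prune_weights(layer, 0.5)   # drop the smallest half of the weights
```

In practice the pruned model is retrained to recover accuracy, and a sparse storage format is needed before pruning actually saves memory on a microcontroller.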
“…After a system-specific amount of energy accumulates, hardware activates the system to begin executing, quickly consuming the energy (green segments). The executing system may collect sensor inputs, run computations (e.g., machine learning to process sensor data [16,17,38]) on an ultra-low-power CPU or microcontroller, and log or transmit results via a wireless radio link.…”
Section: The Basics of Intermittent Computing
confidence: 99%
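The charge-then-burst cycle described in this snippet can be sketched as a toy simulation; the turn-on threshold, harvest rate, and per-task energy cost below are hypothetical illustration values, not parameters from any cited system.

```python
# Minimal sketch of intermittent execution: the device is off while the
# capacitor charges, then wakes and runs tasks until the stored energy is
# exhausted. All energy values are hypothetical illustration constants.

TURN_ON = 10.0    # stored energy at which hardware wakes the processor
HARVEST = 1.5     # energy harvested per time step while powered off
TASK_COST = 3.0   # energy one task consumes (sense, compute, transmit)

def run_intermittent(tasks, steps):
    """Return how many of `tasks` complete within `steps` time steps."""
    energy, done = 0.0, 0
    for _ in range(steps):
        if energy < TURN_ON:
            energy += HARVEST            # off phase: accumulate energy
        else:
            while energy >= TASK_COST and done < tasks:
                energy -= TASK_COST      # on phase: quickly consume energy
                done += 1
    return done
```

Even this toy model shows the defining property of intermittent computing: progress arrives in short bursts separated by long charging gaps, which is why real systems must checkpoint state across power failures.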
“…To enable truly widespread integration of AI sensors for low- and ultra-low-power edge inference, there is a need for a focused design effort toward delivering energy-efficient and memory-frugal implementations [5], [6], [7]. The prevailing approach to edge inference is based on Neural Network (NN) models [5], [8]. However, for a given inference application, designers are forced to make intelligent trade-offs through hardware-software co-design, as deep neural network (DNN) models are resource-hungry in terms of storage, runtime memory (RAM), and computation.…”
Section: Introduction
confidence: 99%
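The storage and RAM pressure described in this snippet is easy to see with a back-of-envelope count. The sketch below tallies the parameters of a small, hypothetical fully connected network (the layer sizes are illustrative, not from the cited papers) and converts them to flash footprint at 32-bit precision.

```python
# Back-of-envelope sketch of why DNNs strain microcontroller-class devices.
# Layer sizes are hypothetical; an energy-harvesting MCU may offer only a
# few hundred KB of flash and tens of KB of RAM.

BYTES_PER_WEIGHT = 4  # 32-bit float; int8 quantization would divide by 4

def dense_params(in_dim, out_dim):
    """Parameter count of a fully connected layer: weights plus biases."""
    return in_dim * out_dim + out_dim

layers = [(784, 128), (128, 64), (64, 10)]   # e.g. a tiny image classifier
params = sum(dense_params(i, o) for i, o in layers)
flash_kb = params * BYTES_PER_WEIGHT / 1024
```

At float32 precision this tiny three-layer network already needs over 400 KB just for its weights, which is why quantization, pruning, and hardware-software co-design are central to edge inference on constrained devices.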