2020
DOI: 10.1109/jiot.2020.2979523
Energy-Efficient Processing and Robust Wireless Cooperative Transmission for Edge Inference

Abstract: Edge machine learning can deliver low-latency and private artificial intelligence (AI) services for mobile devices by leveraging computation and storage resources at the network edge. This paper presents an energy-efficient edge processing framework to execute deep learning inference tasks at edge computing nodes whose wireless connections to mobile devices are prone to channel uncertainties. Aimed at minimizing the sum of computation and transmission power consumption with probabilistic quality-of-service …


Cited by 42 publications (31 citation statements) · References 39 publications
“…Although (15) does not violate the group sparsity structure of v nk , it indicates that the receive beamforming vectors do not contribute to the task selection. Based on the uplink-downlink duality, in the following, we shall propose a virtual downlink formulation to overcome the scaling issue.…”
Section: Block-Structured Optimization Approach
confidence: 96%
“…As it is infeasible to run DNN models on resource-constrained MDs, we in this paper propose to perform inference tasks for the MDs at the BSs. We assume that all the BSs have downloaded the pre-trained DNN models from cloud servers in advance [15].…”
Section: A System Model
confidence: 99%
“…Another promising direction is to offload the training and inference processes to the edge (i.e., edge intelligence in the literature), where AI techniques are employed at the network edge to intelligently process radio signals. For example, in [184] the joint task allocation and downlink beamforming problem was optimized to minimize the total energy consumption. Applications of a 6G technology, namely the reconfigurable intelligent surface, to edge inference were reported in [185].…”
Section: Model Compression and Acceleration
confidence: 99%
“…Offloading can be used to optimize network performance, for example by minimizing communication power consumption in wireless networks [128]. Unlike most similar approaches, this framework uses statistical learning, specifically iteratively reweighted L1 minimization with difference-of-convex regularization.…”
Section: E. Networking
confidence: 99%
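The last statement names iteratively reweighted L1 minimization, a standard technique for inducing sparsity (e.g., in task selection or beamforming) by repeatedly solving a weighted L1 problem and updating the weights. The following is a minimal generic sketch of that idea on a sparse-recovery problem, not the paper's actual formulation; the function name `reweighted_l1`, the ISTA inner solver, and all parameter values are illustrative assumptions.

```python
import numpy as np

def reweighted_l1(A, b, lam=0.1, outer=5, inner=200, eps=1e-3):
    """Illustrative iteratively reweighted l1 minimization.

    Repeatedly solves  min_x 0.5*||Ax - b||^2 + lam * sum_i w_i * |x_i|
    with proximal gradient (ISTA), then updates w_i = 1/(|x_i| + eps).
    The reweighting step approximates a nonconvex log-sum sparsity
    penalty by successive convex (weighted l1) surrogates.
    """
    m, n = A.shape
    x = np.zeros(n)
    w = np.ones(n)                              # initial weights: plain l1
    step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1/L, L = Lipschitz const of grad
    for _ in range(outer):
        for _ in range(inner):
            grad = A.T @ (A @ x - b)
            z = x - step * grad
            # weighted soft-thresholding: prox of lam * sum_i w_i |x_i|
            x = np.sign(z) * np.maximum(np.abs(z) - step * lam * w, 0.0)
        w = 1.0 / (np.abs(x) + eps)             # small entries get large weights
    return x

# usage: recover a 3-sparse vector from 40 noisy linear measurements
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[3, 17, 60]] = [2.0, -1.5, 1.0]
b = A @ x_true + 0.01 * rng.standard_normal(40)
x_hat = reweighted_l1(A, b)
```

The reweighting is what gives the method its difference-of-convex flavor: each weight update linearizes the concave log-sum penalty, so entries already near zero are penalized more heavily and driven exactly to zero, which is the behavior exploited for sparse task selection.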