2019
DOI: 10.1109/jproc.2019.2941458
Wireless Network Intelligence at the Edge

Abstract: edge devices. The new breed of intelligent devices and high-stake applications (drones, augmented/virtual reality, autonomous systems, and so on) requires a novel paradigm change calling for distributed, low-latency and reliable ML at the wireless network edge (referred to as edge ML). In edge ML, training data are unevenly distributed over a large number of edge nodes, which have access to a tiny fraction of the data. Moreover, training and inference are carried out collectively over wireless links, where edg…

Cited by 546 publications (435 citation statements)
References 123 publications (142 reference statements)
“…Next-generation computing networks will encounter a paradigm shift from a conventional cloud computing setting, which aggregates computational resources in a data center, to edge computing systems, which largely deploy computational power to the network edges to meet the needs of applications that demand very high bandwidth and low latency, as well as supporting resource-constrained nodes reachable only over unreliable network connections [1]–[4]. Along with the burgeoning development of machine learning, it is expected that, by leveraging computing capability in the edge nodes, usually access points (APs), future networks will be able to utilize local data to conduct intelligent inference and control over many activities, e.g., learning activities of mobile phone users, predicting health events from wearable devices, or detecting burglaries within smart homes [5], [6]. Due to the sheer volume of data generated, the growing computational power of end-user devices, and the increasing concerns about sharing private data, it becomes more attractive to perform learning directly on user equipments (UEs) as opposed to sending raw data to an AP.…”
Section: Introduction
confidence: 99%
“…For Model Training: To the best of our knowledge, for model training, all proposed frameworks are distributed, except the knowledge-distillation-based ones. The distributed training frameworks can be divided into data split and model split [24]. Data split can be further divided into master-device split, helper-device split, and device-device split.…”
Section: A Recapitulation Of AIE
confidence: 99%
“…We assume that δ_k is dependent on w_k. It can be seen that u is an approximation of a for the aggregation in (3). To see the approximation error, we can consider the following conditional error norm:…”
Section: B Adaptive Access Probability Based On Local Update
confidence: 99%
“…Federated learning [1]–[3] has been extensively studied as a distributed machine learning approach with data privacy. In federated learning, mobile phones or devices keep their data sets locally and exchange only a parameter vector to be optimized in a certain learning problem.…”
Section: Introduction
confidence: 99%
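Several of the quoted statements describe federated learning, in which user equipments train on locally held data and exchange only a parameter vector with the access point. The sketch below illustrates one round of federated averaging on a toy least-squares problem; it is a minimal, assumed illustration (all function names, the learning rate, and the data sizes are hypothetical), not the exact algorithm of the cited papers.

```python
import numpy as np

def local_update(w, X, y, lr=0.1, epochs=5):
    """One client's local gradient descent on a least-squares loss (illustrative)."""
    w = w.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of (1/2n)||Xw - y||^2
        w -= lr * grad
    return w

def fedavg_round(w_global, clients):
    """One round of federated averaging: each client trains on its own data,
    then the server averages the resulting models weighted by sample count."""
    updates, sizes = [], []
    for X, y in clients:
        updates.append(local_update(w_global, X, y))
        sizes.append(len(y))
    weights = np.asarray(sizes, dtype=float)
    weights /= weights.sum()
    return sum(a * u for a, u in zip(weights, updates))

rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])
# Training data unevenly distributed across clients, as in the edge-ML setting.
clients = []
for n in (50, 10, 200):
    X = rng.normal(size=(n, 2))
    clients.append((X, X @ w_true + 0.01 * rng.normal(size=n)))

w = np.zeros(2)
for _ in range(50):
    w = fedavg_round(w, clients)
```

The raw feature/label pairs never leave the per-client tuples; only the model vector `w` crosses the (here simulated) wireless link each round, which is the privacy and bandwidth argument the quoted introductions make.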