2018
DOI: 10.1186/s13638-018-1218-y
Proactive edge computing in fog networks with latency and reliability guarantees

Abstract: This paper studies the problem of task distribution and proactive edge caching in fog networks with latency and reliability constraints. In the proposed approach, user nodes (UNs) offload their computing tasks to edge computing servers (cloudlets). Cloudlets leverage their computing and storage capabilities to proactively compute and store cacheable computing results. In this regard, task popularity estimation and caching policy schemes are proposed. Furthermore, the problem of UNs' task distribution to clo…
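The proactive caching idea described in the abstract can be sketched as follows. This is a minimal illustration only, assuming a simple request-count popularity estimator and a top-k caching policy; the class and method names are hypothetical and are not the paper's actual scheme.

```python
from collections import Counter

class ProactiveCache:
    """Sketch of a cloudlet that estimates task popularity from observed
    requests and proactively caches computing results for the top-k tasks.
    The counting-based estimator is an illustrative assumption."""

    def __init__(self, capacity):
        self.capacity = capacity      # max number of cached results
        self.popularity = Counter()   # request counts per task id
        self.cache = {}               # task id -> precomputed result

    def observe(self, task_id):
        """Record an incoming task request (popularity estimation)."""
        self.popularity[task_id] += 1

    def refresh(self, compute):
        """Proactively (re)compute and store results for the most popular
        tasks, evicting everything else."""
        top = [t for t, _ in self.popularity.most_common(self.capacity)]
        self.cache = {t: compute(t) for t in top}

    def serve(self, task_id, compute):
        """Serve from cache if available (low latency), otherwise fall
        back to on-demand computing."""
        if task_id in self.cache:
            return self.cache[task_id], True   # cache hit
        return compute(task_id), False         # on-demand compute
```

A cloudlet would periodically call `refresh` between request bursts, so that popular cacheable results are already available when UNs offload their tasks.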

Cited by 31 publications (34 citation statements)
References 27 publications
“…As a consequence, in the final step after full-text reading and majority decision, only 27 papers remained as Relevant papers for this study. Upon detailed reading, we found that [11] and [12] are similar to a large extent in terms of the underlying theme, model, and approach, which can introduce a slight bias in the study results. However, since [12] appears to be a journal version of [11] with improvements, we decided to keep both of them, to be consistent with our predefined selection rules.…”
Section: Selecting Papers
confidence: 87%
“…Upon detailed reading, we found that [11] and [12] are similar to a large extent in terms of the underlying theme, model, and approach, which can introduce a slight bias in the study results. However, since [12] appears to be a journal version of [11] with improvements, we decided to keep both of them, to be consistent with our predefined selection rules. IV. TECHNICAL DATA ANALYSIS: Data extracted from relevant papers are analyzed based on four main categories: 1) Dependability Attributes; 2) Source of Threats; 3) Dependability Means (Faults, Failures, and Errors); and 4) Threat Detection and Response Methods.…”
Section: Selecting Papers
confidence: 87%
“…Proposing an algorithm for energy efficiency (EE) in mobile devices by optimizing the queue complexity of the communication process [67]; reducing computation and latency for IoT devices using MEC. AI and 5G Networks Traffic Management: AI can be integrated with 5G networks to improve the efficiency of resource and network management. Given the network architecture and user requirements of 5G networks, traffic management will be a challenge [68].…”
Section: [42]
confidence: 99%
“…In addition, mmWave also enables wireless backhauling [15], [16], which facilitates edge servers' prefetching of popular content with low latency. At the processing level, proactive computing provides significant latency reduction while maximizing resource efficiency by avoiding repetitive and redundant on-demand computing [17]-[19]. Next, coded computing is effective in reducing parallel computing latency: it removes the dependence of the final result on any individual sub-task, thereby minimizing the worst-case latency caused by a straggling task.…”
Section: Low Latency Enablers
confidence: 99%
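The straggler-mitigation idea behind coded computing mentioned above can be illustrated with a minimal sketch: split a matrix into k row blocks and add one parity block (their sum), giving k+1 coded tasks of which any k suffice to recover the full matrix-vector product. The function names and the simple sum-parity code are illustrative assumptions, not the coding schemes used in the cited works.

```python
import numpy as np

def encode(A, k):
    """Split A into k row blocks and append one parity block (their
    elementwise sum): a simple (k+1, k) code tolerating one straggler."""
    blocks = np.split(A, k)
    return blocks + [sum(blocks)]

def decode(results, k):
    """results: dict mapping block index -> partial product computed by a
    worker. Any k of the k+1 entries suffice to recover the full product,
    so one straggling worker can be ignored."""
    missing = [i for i in range(k) if i not in results]
    if not missing:
        return np.concatenate([results[i] for i in range(k)])
    m = missing[0]  # at most one systematic block may be missing
    # Parity block minus the received blocks recovers the missing one.
    results[m] = results[k] - sum(results[i] for i in range(k) if i != m)
    return np.concatenate([results[i] for i in range(k)])
```

Here the worst-case latency is set by the (k)-th fastest worker rather than the slowest one, which is the essence of the latency benefit attributed to coded computing.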