2017 IEEE 42nd Conference on Local Computer Networks (LCN)
DOI: 10.1109/lcn.2017.112
Energy-Efficient Resource Allocation for Cache-Assisted Mobile Edge Computing

Abstract: In this paper, we jointly consider communication, caching, and computation in a multi-user cache-assisted mobile edge computing (MEC) system, consisting of one base station (BS) with caching and computing capabilities and multiple users with computation-intensive and latency-sensitive applications. We propose a joint caching and offloading mechanism which involves task uploading and executing for tasks with uncached computation results, as well as computation result downloading for all tasks at the BS, and efficie…

Cited by 75 publications (51 citation statements); references 13 publications.
“…Moreover, for MEC applications, in [27], the authors explored the fundamental tradeoffs between caching, computing, and communication for VR/AR applications. Finally, the work in [28] proposed a joint caching and offloading mechanism that considers task uploading and executing, computation output downloading, multi-user diversity, and multi-casting.…”
Section: Joint Caching and Communication (2C)
confidence: 99%
“…The use of caching to dynamically store the program and/or task data at the MEC system has been recently recognized as a cost-effective method to reduce computation delay, energy consumption, and bandwidth cost [3], [10]. Here, we refer to the techniques to cache the input and/or output of computation tasks at the server/user side as computation content caching (such as in [3]- [7]). On one hand, content caching reduces the data exchange between the edge servers and MUs if the required input data can be found in the cache.…”
Section: A. Motivations and Summary of Contributions
confidence: 99%
“…Integrating content caching into MEC system design can effectively reduce computation delay, energy consumption, and bandwidth cost. In particular, an edge server can cache task output data [3], task input data [4], and intermediate task computation results that are potentially useful for future task executions [5]. Meanwhile, content caching can also be implemented at the MU side to minimize the offloading (downloading) traffic to (from) the edge server [6].…”
Section: B. Related Work
confidence: 99%
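The caching idea summarized in the statements above can be made concrete with a small sketch. This is purely illustrative and not code from the paper or its citing works; the class and function names (`EdgeResultCache`, `offload`) and the LRU eviction policy are hypothetical modeling choices, assuming an edge server that caches task *output* data and returns it directly on a hit, executing the task only on a miss.

```python
# Illustrative sketch (hypothetical, not from the paper): computation content
# caching at an edge server. A cached task result is returned immediately;
# otherwise the task is executed and its result cached under an LRU policy.
from collections import OrderedDict


class EdgeResultCache:
    """Tiny LRU cache of task computation results, bounded by capacity in bits."""

    def __init__(self, capacity_bits: int):
        self.capacity = capacity_bits
        self.used = 0
        self.store = OrderedDict()  # task_id -> (result, size_bits)

    def get(self, task_id):
        if task_id in self.store:
            self.store.move_to_end(task_id)   # mark as most recently used
            return self.store[task_id][0]     # cache hit: no execution needed
        return None

    def put(self, task_id, result, size_bits):
        if size_bits > self.capacity:
            return                            # result too large to cache at all
        while self.used + size_bits > self.capacity:
            _, (_, evicted_bits) = self.store.popitem(last=False)  # evict LRU
            self.used -= evicted_bits
        self.store[task_id] = (result, size_bits)
        self.used += size_bits


def offload(cache, task_id, compute, size_bits):
    """Return the task result, executing `compute` only on a cache miss."""
    result = cache.get(task_id)
    if result is None:
        result = compute()                    # task uploading + executing
        cache.put(task_id, result, size_bits)
    return result
```

The point of the sketch is the tradeoff the cited works describe: a hit eliminates both the uplink transmission of task input data and the execution itself, at the cost of cache storage at the BS.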
“…$x_{f,j}^{k,n} = x_{f,j}^{k}$, $f \in \mathcal{F}$, $j \in \{1,2,3,4\}$, $k \in \mathcal{K}$, $n \in \mathcal{N}$, (32) where $x = (x_{f,j}^{k})_{f \in \mathcal{F},\, j \in \{1,2,3,4\},\, k \in \mathcal{K}}$ in constraints (21), (22) and (23) is replaced with $\{x_{n}\}_{n \in \mathcal{N}}$, $x_{n} = (x_{f,j}^{k,n})_{f \in \mathcal{F},\, j \in \{1,2,3,4\},\, k \in \mathcal{K}}$, in constraints (29), (30) and (31).…”
Section: B. Optimal Policy Design for α >
confidence: 99%