2017
DOI: 10.1109/tccn.2017.2725277

Online Learning for Offloading and Autoscaling in Energy Harvesting Mobile Edge Computing

Abstract: Mobile edge computing (a.k.a. fog computing) has recently emerged to enable in-situ processing of delay-sensitive applications at the edge of mobile networks. Providing grid power supply in support of mobile edge computing, however, is costly and even infeasible (in certain rugged or under-developed areas), thus mandating on-site renewable energy as a major or even sole power supply in increasingly many scenarios. Nonetheless, the high intermittency and unpredictability of renewable energy make it very challenging…

Citations: cited by 343 publications (207 citation statements)
References: 25 publications
“…3  while |a_1 − a_0| > ε do
4      if S(λ_m) < S(γ_m) then
5          a_0 = λ_m, a_1 = a_1, λ_{m+1} = γ_m;
6          γ_{m+1} = a_0 + σ · (a_1 − a_0);
7          Update S(λ_{m+1}) ← S(γ_m) and calculate S(γ_{m+1}).
8      else
9          a_0 = a_0, a_1 = γ_m, γ_{m+1} = λ_m;
10         λ_{m+1} = a_0 + (1 − σ) · (a_1 − a_0);
11         Update S(γ_{m+1}) ← S(λ_m) and calculate S(λ_{m+1}).
… for both devices.…”
Section: Simulation Results (mentioning)
confidence: 99%
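The quoted excerpt is a golden-section-style search over a scalar interval, with σ the golden-ratio constant and λ, γ the two interior probe points. As a rough, generic illustration only (assuming a unimodal scalar cost S to be minimized, not the citing paper's actual objective or its exact bracketing convention), a minimal Python sketch of the same search pattern could look like this:

```python
import math

def golden_section_search(S, a0, a1, eps=1e-6):
    """Minimize a unimodal scalar cost S on [a0, a1] by golden-section search.

    Loosely mirrors the quoted pseudocode: sigma is the golden-ratio constant,
    lam/gam are the two interior probe points, and the bracket [a0, a1]
    shrinks until it is narrower than eps.
    """
    sigma = (math.sqrt(5) - 1) / 2          # ~0.618
    lam = a0 + (1 - sigma) * (a1 - a0)      # left probe point
    gam = a0 + sigma * (a1 - a0)            # right probe point
    S_lam, S_gam = S(lam), S(gam)

    while abs(a1 - a0) > eps:
        if S_lam < S_gam:
            # Minimum lies in [a0, gam]: drop the right part of the bracket.
            a1, gam, S_gam = gam, lam, S_lam
            lam = a0 + (1 - sigma) * (a1 - a0)
            S_lam = S(lam)
        else:
            # Minimum lies in [lam, a1]: drop the left part of the bracket.
            a0, lam, S_lam = lam, gam, S_gam
            gam = a0 + sigma * (a1 - a0)
            S_gam = S(gam)

    return (a0 + a1) / 2

# Example: minimize a simple convex cost on [0, 5]; converges near x = 2.
x_star = golden_section_search(lambda x: (x - 2.0) ** 2 + 1.0, 0.0, 5.0)
```

Only one new cost evaluation is needed per iteration, which is why the pseudocode "updates" one of S(λ), S(γ) and only "calculates" the other.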
“…How to apply machine learning algorithms for user association or task offloading in MEC systems was also studied in some recent works [15][16][17][18]. Deep Q-learning was used to minimize the task execution cost by optimizing offloading decision according to channel state information, queue state information, and energy queue state of the energy harvesting system [15].…”
Section: Related Work (mentioning)
confidence: 99%
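The deep Q-learning offloading scheme summarized above ([15]) chooses an offloading decision from channel, task-queue, and energy-queue state so as to minimize execution cost. As a loose, hypothetical illustration only — a tabular Q-table with made-up state labels standing in for the deep Q-network of [15] — the decision/update loop might be sketched as:

```python
import random
from collections import defaultdict

# Hypothetical, heavily simplified stand-in for a deep Q-learning offloading agent:
# the state is a coarse (channel, queue, battery) triple and the action is the
# binary offloading decision. All labels and constants below are illustrative.

ACTIONS = ("local", "offload")       # execute on device vs. send to the edge server
ALPHA, GAMMA, EPSILON = 0.1, 0.95, 0.1

Q = defaultdict(float)               # Q[(state, action)] -> estimated long-run cost

def choose_action(state):
    """Epsilon-greedy selection over the two offloading decisions (cost-minimizing)."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return min(ACTIONS, key=lambda a: Q[(state, a)])

def update(state, action, cost, next_state):
    """One Q-learning step driven by the observed execution cost."""
    best_next = min(Q[(next_state, a)] for a in ACTIONS)
    target = cost + GAMMA * best_next
    Q[(state, action)] += ALPHA * (target - Q[(state, action)])

# Usage: each time slot, observe the state, act, measure the incurred cost, update.
state = ("good_channel", "short_queue", "high_battery")
action = choose_action(state)
next_state = ("bad_channel", "short_queue", "medium_battery")
update(state, action, cost=1.3, next_state=next_state)   # 1.3 is a made-up observed cost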
“…A similar method was also applied for energy harvesting of IoT devices in [16]. The authors of [17] proposed an efficient reinforcement learning-based resource management algorithm to incorporate renewable energy into MEC systems. More recently, a deep reinforcement learning framework for task offloading was studied in a single-AP scenario [18].…”
Section: Related Work (mentioning)
confidence: 99%
“…Our work is also related to online learning techniques which have been applied in solving decision making problems in social sensing applications [33,34,35,36,37,38]. In particular, online learning learns to make sequential decisions to achieve the desired quality-of-service of an application and dynamically adjust the learning process based on the streaming data [37]. To the best of our knowledge, the QCO-TA scheme is one of the first approaches to leverage online learning techniques to address the quality-cost-aware multi-attribute task allocation problem in social sensing.…”
Section: Online Learning (mentioning)
confidence: 99%
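The sequential decision-making loop described in the passage above can be pictured with an epsilon-greedy multi-armed bandit. The class, arm, and reward names below are illustrative placeholders, not the actual QCO-TA formulation from the citing work:

```python
import random

# Minimal epsilon-greedy bandit sketch of online sequential decision making,
# assuming each "arm" is a candidate task-allocation option and each observed
# reward reflects its quality/cost trade-off. All names are illustrative.

class EpsilonGreedyBandit:
    def __init__(self, n_arms, epsilon=0.1):
        self.epsilon = epsilon
        self.counts = [0] * n_arms          # times each arm was selected
        self.values = [0.0] * n_arms        # running mean reward per arm

    def select(self):
        if random.random() < self.epsilon:
            return random.randrange(len(self.counts))                          # explore
        return max(range(len(self.values)), key=self.values.__getitem__)       # exploit

    def update(self, arm, reward):
        self.counts[arm] += 1
        # Incremental mean update driven by the streaming feedback.
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]

# Usage: pick an allocation, observe its quality/cost outcome, feed it back.
bandit = EpsilonGreedyBandit(n_arms=3)
arm = bandit.select()
bandit.update(arm, reward=0.8)   # 0.8 is a made-up observed reward
```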