2020
DOI: 10.1016/j.simpat.2019.102007

Architecture and performance evaluation of distributed computation offloading in edge computing

Abstract: Edge computing has been proposed to cope with the challenging requirements of future applications, like mobile augmented reality, since it significantly shortens the distance, and hence the latency, between end users and processing servers. On the other hand, serverless computing is emerging among cloud technologies to respond to the need for highly scalable, event-driven execution of stateless tasks. In this paper, we first investigate the convergence of the two to enable very low-latency execution of short…
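
To make the serverless side of this convergence concrete, the following is a minimal sketch of the kind of stateless, event-driven task that a FaaS platform such as Apache OpenWhisk executes. It is an illustrative assumption rather than code from the paper; the hashing workload and the 'payload' parameter are hypothetical.

```python
# Minimal sketch of a stateless FaaS task (hypothetical example, not from the paper).
# OpenWhisk-style Python actions expose a main(args) entry point that receives the
# invocation parameters as a dict and returns a JSON-serializable dict. No state is
# kept between invocations, which is what makes such short tasks easy to offload to
# an edge executor close to the user.
import hashlib

def main(args):
    # 'payload' is a hypothetical parameter standing in for data sent by a client,
    # e.g. a frame descriptor produced by a mobile augmented reality application.
    payload = args.get('payload', '')
    digest = hashlib.sha256(payload.encode('utf-8')).hexdigest()
    return {'digest': digest, 'length': len(payload)}
```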

Cited by 19 publications (10 citation statements)
References: 43 publications
“…We have used our own performance evaluation framework with real applications running in Linux namespaces interconnected via a network emulated with Mininet. The interested reader may find full details in [10], which also describes the details of the implementation of the e-routers and e-controller. As executors we have used both OpenWhisk (as in [4]) running in Docker containers and emulated e-computers, which provide responses to lambda requests based on the simulation of processing of tasks in a multi-processing system, also illustrated in [10].…”
Section: A. Environment and Methodology (mentioning)
confidence: 99%
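
The statement above outlines the evaluation environment: real applications in Linux namespaces connected through a Mininet-emulated network, with OpenWhisk in Docker containers or emulated e-computers acting as executors. The script below is a minimal Mininet sketch of such a setup; it is an illustrative assumption rather than the authors' framework from [10], and the host names, bandwidths, and delays are hypothetical.

```python
# Minimal Mininet sketch of an edge-offloading testbed (illustrative assumption,
# not the framework described in [10]). Each Mininet host runs in its own Linux
# network namespace; the switch stands in for an e-router's data path, and the
# TCLink parameters emulate an access segment and an edge segment.
from mininet.net import Mininet
from mininet.topo import Topo
from mininet.link import TCLink
from mininet.node import OVSBridge

class EdgeTopo(Topo):
    def build(self):
        client = self.addHost('client')      # end device issuing lambda requests
        executor = self.addHost('executor')  # would host OpenWhisk or an emulated e-computer
        s1 = self.addSwitch('s1')            # stand-in for an e-router
        self.addLink(client, s1, cls=TCLink, bw=100, delay='2ms')       # access link (hypothetical)
        self.addLink(executor, s1, cls=TCLink, bw=1000, delay='0.5ms')  # edge link (hypothetical)

if __name__ == '__main__':
    net = Mininet(topo=EdgeTopo(), link=TCLink, switch=OVSBridge, controller=None)
    net.start()
    # Real applications are launched inside the hosts' namespaces; a plain HTTP
    # server is used here as a stand-in executor endpoint.
    net.get('executor').cmd('python3 -m http.server 8080 &')
    print(net.get('client').cmd('ping -c 3 %s' % net.get('executor').IP()))
    net.stop()
```

Such a script must be run with root privileges, since Mininet needs them to create the namespaces and virtual links.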
“…Offloading computation to the edge has been studied in [9] and [8], and edge computing and related paradigms have been surveyed by [10], [14], [28] among others. In [29] the authors propose a method for the dynamic management of service resources for cloud, fog and edge based CPPS.…”
Section: Related Work (mentioning)
confidence: 99%
“…The challenge is to analyze data efficiently when striving for data-driven operations [1], [7]. In cloud computing, transmission of data, allocation of processing resources, and finally making results available typically include delays and transfer costs [8], [9].…”
Section: Introduction (mentioning)
confidence: 99%
“…Serverless Offloading. Several research efforts have also explored how to leverage serverless computing to augment local devices' limitations [8,9,28,50]. Our design choice of using the FaaS paradigm was partly inspired by DIY [50], which explores executing private web services such as E-mail on serverless platforms.…”
Section: Related Work (mentioning)
confidence: 99%