2021
DOI: 10.3390/s21248416
Mitigating Cold Start Problem in Serverless Computing with Function Fusion

Abstract: As Artificial Intelligence (AI) is becoming ubiquitous in many applications, serverless computing is also emerging as a building block for developing cloud-based AI services. Serverless computing has received much interest because of its simplicity, scalability, and resource efficiency. However, due to the trade-off with resource efficiency, serverless computing suffers from the cold start problem, that is, a latency between a request arrival and function execution that is encountered due t…
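The core idea named in the title can be illustrated with a minimal sketch: two serverless handlers that would normally run as separate functions (each risking its own cold start) are fused into a single handler, so only one container needs to be warm. The handler names and event format below are illustrative, not taken from the paper.

```python
def preprocess(event):
    """First stage: normalize the input payload."""
    return {"text": event["text"].strip().lower()}

def infer(event):
    """Second stage: a stand-in for an AI inference step."""
    return {"label": "positive" if "good" in event["text"] else "negative"}

def fused_handler(event):
    """Fused function: runs both stages in one invocation,
    avoiding the cold start of a second function."""
    return infer(preprocess(event))

print(fused_handler({"text": "  This is GOOD  "}))  # → {'label': 'positive'}
```

With fusion, the intermediate result never leaves the process, which also removes the serialization and network hop between the two stages.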

Cited by 22 publications (3 citation statements)
References 22 publications
“…Specifically, a model can set functions' pre-warm time and keep-alive time based on dependency-predictable functions' invocation histograms, while using a fixed-timeout policy for dependency-unpredictable functions [19]. Function-level cold-start problems can also be managed by reducing application code loading latency by loading critical functions first and optional functions later [20], or by fusing and optimizing function codes for more efficient run-time execution [21], [22].…”
Section: A. Previous Work
confidence: 99%
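The histogram-based policy described above can be sketched as follows. This is a hedged illustration, not the cited model's actual algorithm: the keep-alive window is chosen to cover most observed inter-invocation gaps when the history is long enough to trust, with a fixed timeout as the fallback; the thresholds are assumptions.

```python
FIXED_TIMEOUT = 600  # seconds; assumed default for unpredictable functions

def keep_alive_seconds(inter_arrival_times, percentile=0.9):
    """Keep an instance warm long enough to cover `percentile` of the
    observed gaps between invocations; fall back to the fixed timeout
    when the invocation history is too short to be predictable."""
    if len(inter_arrival_times) < 5:  # minimum-history threshold is an assumption
        return FIXED_TIMEOUT
    gaps = sorted(inter_arrival_times)
    idx = min(int(len(gaps) * percentile), len(gaps) - 1)
    return gaps[idx]

# A function invoked roughly every 30-40 s gets a tight keep-alive window:
print(keep_alive_seconds([30, 32, 35, 31, 40, 33, 36]))  # → 40
```

The trade-off is the one the quoted text names: a tight window saves memory on predictable workloads, while the fixed timeout avoids mispredicting irregular ones.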
“…Finer-grain code dependencies can be inferred and passed to path A to assist with future demand inferences. Function codes can then be optimized per [21], [22]. Based on function instances' arrival time, fine-grain requirements, and optimized function code graphs, instances can be prioritized.…”
Section: B. The Ensemble Policy
confidence: 99%
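The prioritization step mentioned above can be sketched with a min-heap ordered by arrival time, earliest first. The instance records and names here are hypothetical, since the cited policy's actual scoring combines arrival time with requirements and code-graph information.

```python
import heapq

# Pending instances as (arrival_time, function_name); a min-heap pops the
# earliest arrival first. Real policies would use a richer priority key.
pending = []
for arrival, name in [(3.2, "resize"), (1.1, "decode"), (2.5, "infer")]:
    heapq.heappush(pending, (arrival, name))

order = [heapq.heappop(pending)[1] for _ in range(len(pending))]
print(order)  # → ['decode', 'infer', 'resize']
```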
“…Before each test, we called the function chain several times to ensure that all cloud functions had live instances, in order to avoid the impact of a cold start (i.e., the set-up time incurred when downloading the code, starting the instance, initializing the runtime, and so on, for the first time) [3,6], which has been studied extensively, with some work also focusing on optimization for the chain structure [24,25]. The call chain length was set from 1 to 4 and, for each run, we conducted 20 tests and calculated the average time.…”
Section: Introduction
confidence: 99%
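The measurement protocol described in that statement can be sketched as below: warm up the chain with a few throwaway calls, then time repeated invocations and average. `invoke_chain` is a hypothetical stand-in for the real cloud call, and the warm-up count is an assumption (the quote only says "several times").

```python
import time

def invoke_chain(length):
    """Placeholder for invoking a chain of `length` cloud functions."""
    time.sleep(0.001 * length)  # simulate per-hop latency

def mean_latency(length, warmups=3, runs=20):
    """Average warm-path latency for a chain of the given length."""
    for _ in range(warmups):          # warm-up calls: avoid cold starts
        invoke_chain(length)
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        invoke_chain(length)
        samples.append(time.perf_counter() - start)
    return sum(samples) / runs

for n in range(1, 5):                 # chain lengths 1 to 4, as in the text
    print(f"chain length {n}: {mean_latency(n) * 1000:.2f} ms")
```

Discarding the warm-up calls is what isolates the chain-structure latency from the cold-start latency the paper targets.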