2022 CPSSI 4th International Symposium on Real-Time and Embedded Systems and Technologies (RTEST) 2022
DOI: 10.1109/rtest56034.2022.9850011
The Effect of Fog Offloading on the Energy Consumption of Computational Nodes

Cited by 2 publications (4 citation statements)
References 14 publications
“…This paper significantly extends our previous work [9] by (1) incorporating patient mobility into our model, creating a dynamic fog computing environment with varying data generation rates, and (2) conducting extensive experiments for the comprehensiveness of our findings.…”
mentioning
confidence: 69%
“…Among these challenges, in this work, we focused on computation offloading. Computation offloading enables IoT devices to offload computation tasks to fog or cloud servers and receive the results after the servers execute the tasks [8]. In this way, we can free end devices from heavy computing tasks, reduce their energy consumption, and significantly advance the completion time of tasks.…”
Section: Introduction
mentioning
confidence: 99%
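The excerpt above describes the core offloading trade-off: an end device saves energy by shipping a compute-heavy task to a fog or cloud server instead of executing it locally. As a minimal sketch (not taken from the cited paper; all parameter values and function names here are illustrative assumptions), the decision can be modeled by comparing local execution energy against transmission energy:

```python
# Hypothetical sketch of an energy-based offloading decision rule.
# Parameter values (joules per cycle, transmit power, link rate) are
# illustrative assumptions, not figures from the cited paper.
from dataclasses import dataclass


@dataclass
class Task:
    cycles: float      # CPU cycles required to execute the task
    input_bits: float  # size of the input data to transmit if offloaded


def local_energy(task: Task, joules_per_cycle: float = 1e-9) -> float:
    """Energy (J) to execute the task on the end device itself."""
    return task.cycles * joules_per_cycle


def offload_energy(task: Task, tx_power_w: float = 0.5,
                   link_rate_bps: float = 1e6) -> float:
    """Energy (J) the device spends transmitting the input to a fog node."""
    return tx_power_w * task.input_bits / link_rate_bps


def should_offload(task: Task) -> bool:
    """Offload when transmission costs the device less energy than local execution."""
    return offload_energy(task) < local_energy(task)


# A compute-heavy task with a small input favors offloading;
# a light task with a large input favors local execution.
heavy = Task(cycles=5e9, input_bits=1e6)   # 5 J local vs. 0.5 J to transmit
light = Task(cycles=1e7, input_bits=8e6)   # 0.01 J local vs. 4 J to transmit
```

This captures only the device-side energy; a fuller model would also weigh completion time and the server-side costs that the paper's simulations measure.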
“…Two cameras are initially attached per fog device which is increased in each succeeding scenario. The sensors installed in our evaluations are according to the strategy of Sharifi, Hessabi & Rasaii (2022) . The simulation model of one of each setup created in iFogSim for the evaluation of the fog paradigm and cloud computing paradigm is shown in Figs.…”
Section: Results
mentioning
confidence: 99%
“…Each device in the system is accountable for the implementation of some application modules ( Hassan et al, 2020 ). Different amounts of resources are available at fog devices existing in heterogeneous fog computing environments ( Sharifi, Hessabi & Rasaii, 2022 ). These fog nodes provide a reduction in latency by processing information near the sensor nodes.…”
Section: Proposed Paradigm and Problem Formulation
mentioning
confidence: 99%