2023
DOI: 10.3390/app131810297
Memory Allocation Strategy in Edge Programmable Logic Controllers Based on Dynamic Programming and Fixed-Size Allocation

Guanghe Cheng,
Zhong Wan,
Wenkang Ding
et al.

Abstract: With the explosive growth of data at the edge in the Industrial Internet of Things (IIoT), edge devices are increasingly performing more data processing tasks to alleviate the load on cloud servers. To achieve this goal, Programmable Logic Controllers (PLCs) are gradually transitioning into edge PLCs. However, efficiently executing a large number of computational tasks in memory-limited edge PLCs is a significant challenge. Therefore, there is a need to design an efficient memory allocation strategy for edge P…
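The abstract names fixed-size allocation as one half of the proposed strategy. The paper's actual scheme is not reproduced on this page; the following is only a minimal sketch of the general fixed-size block-pool idea, assuming a static backing array and an intrusive free list (`BLOCK_SIZE`, `BLOCK_COUNT`, and all function names are illustrative, not taken from the paper):

```c
#include <stddef.h>
#include <stdint.h>

/* Illustrative pool parameters, not values from the paper. */
#define BLOCK_SIZE  64   /* bytes per block */
#define BLOCK_COUNT 32   /* blocks in the static pool */

typedef struct pool {
    uint8_t  mem[BLOCK_SIZE * BLOCK_COUNT]; /* static backing storage, no heap */
    void    *free_list;                     /* intrusive singly linked free list */
} pool_t;

/* Thread every block onto the free list. */
static void pool_init(pool_t *p) {
    p->free_list = NULL;
    for (size_t i = 0; i < BLOCK_COUNT; i++) {
        void *blk = &p->mem[i * BLOCK_SIZE];
        *(void **)blk = p->free_list;  /* next pointer stored inside the free block */
        p->free_list = blk;
    }
}

/* O(1) allocation: pop the head of the free list; NULL when exhausted. */
static void *pool_alloc(pool_t *p) {
    void *blk = p->free_list;
    if (blk != NULL)
        p->free_list = *(void **)blk;
    return blk;
}

/* O(1) deallocation: push the block back onto the free list. */
static void pool_free(pool_t *p, void *blk) {
    *(void **)blk = p->free_list;
    p->free_list = blk;
}
```

Constant-time allocation and release with zero external fragmentation is the usual motivation for fixed-size pools on memory-limited controllers, at the cost of internal fragmentation when requests are smaller than `BLOCK_SIZE`.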

Cited by 3 publications (2 citation statements) | References 27 publications
“…For this reason, the concept of edge computing has emerged to address these challenges by conducting computations at the edge of the network [2]. In edge computing, the computing process is performed at the edge of the network using devices ranging from smartphones to programmable logic controllers (PLCs) [3]. Since the data produced by IoT devices is processed at the edge of the network, it provides great advantages in real-time applications (e.g.…”
Section: Introduction
confidence: 99%
“…Neural network workloads consist of compute-bound tasks [8,9], but loading data from storage may degrade the performance of the workloads since accessing storage is 10^5 to 10^6 times slower than computing in processors [10], and huge amounts of data can be consistently referenced during the training phase of neural network workloads [7]. Because systems do not have sufficient memory space to accommodate the entire dataset [11], traditional systems typically take advantage of the buffer cache for improving the loading time of data from storage [12].…”
Section: Introduction
confidence: 99%