2022
DOI: 10.1155/2022/8374181

Dynamic Cooperative Cache Management Scheme Based on Social and Popular Data in Vehicular Named Data Network

Abstract: Vehicular Named Data Network (VNDN) is considered a strong paradigm for deployment in vehicular applications. In VNDN, each node has its own cache, but limited cache capacity directly degrades performance in a highly dynamic environment that requires massive and fast content delivery. To mitigate these issues, cooperative caching plays an efficient role in VNDN. Most studies regarding cooperative caching focus on content replacement and caching algorithms and implement these methods in a static environment rat…

Cited by 4 publications (5 citation statements)
References 50 publications
“…DCCMS introduces an innovative approach to a cache-management technique that prioritizes content based on popularity and social interactions among nodes. It incorporates a master node concept for hierarchical collaboration and content distribution, focusing on maximizing the use of cache resources and minimizing content delivery latency [17]. LFU is a caching algorithm that removes the least frequently used items from the cache first.…”
Section: Methods
confidence: 99%
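The LFU policy referenced in this statement is a standard eviction algorithm. As an illustration only (not the implementation from [17] or the citing paper), here is a minimal LFU cache sketch in Python; the class name LFUCache and the insertion-order tie-breaking rule are assumptions chosen for clarity:

```python
from collections import defaultdict

class LFUCache:
    """Minimal LFU sketch: evicts the least frequently used item first.
    Ties within the lowest-frequency bucket are broken by insertion
    order (one common convention; an assumption, not from the paper)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.values = {}                   # key -> value
        self.freq = {}                     # key -> access count
        self.buckets = defaultdict(dict)   # count -> keys (dict preserves order)
        self.min_freq = 0

    def _touch(self, key):
        # Promote key from its current frequency bucket to the next one.
        count = self.freq[key]
        del self.buckets[count][key]
        if not self.buckets[count]:
            del self.buckets[count]
            if self.min_freq == count:
                self.min_freq = count + 1
        self.freq[key] = count + 1
        self.buckets[count + 1][key] = None

    def get(self, key):
        if key not in self.values:
            return None
        self._touch(key)
        return self.values[key]

    def put(self, key, value):
        if self.capacity <= 0:
            return
        if key in self.values:
            self.values[key] = value
            self._touch(key)
            return
        if len(self.values) >= self.capacity:
            # Evict the least frequently used key (oldest in that bucket).
            victim = next(iter(self.buckets[self.min_freq]))
            del self.buckets[self.min_freq][victim]
            if not self.buckets[self.min_freq]:
                del self.buckets[self.min_freq]
            del self.values[victim]
            del self.freq[victim]
        self.values[key] = value
        self.freq[key] = 1
        self.buckets[1][key] = None
        self.min_freq = 1

# Usage: "b" is never re-read, so it is the LFU victim.
cache = LFUCache(2)
cache.put("a", 1); cache.put("b", 2)
cache.get("a")             # bump "a" to frequency 2
cache.put("c", 3)          # evicts "b"
print(cache.get("b"))      # None
print(cache.get("a"))      # 1
```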
“…Additionally, a dynamic cooperative cache management scheme [17] was suggested, relying on popular and social data and involving a master node that operates hierarchically with nearby nodes to retain frequently accessed contents. However, in this scheme, the master node may experience a bottleneck during high network activity.…”
Section: Related Work
confidence: 99%
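To make the master-node idea concrete, below is a purely illustrative sketch under assumed names (nothing here is drawn from [17]): nearby nodes report local request counts, and the master aggregates them to decide which contents to retain. The single aggregation point is also exactly where the bottleneck noted above can form under high network activity.

```python
from collections import Counter

def master_retention_plan(node_reports, cache_slots):
    """Hypothetical helper: aggregate per-content request counts
    reported by nearby nodes and keep the most popular contents.
    Every report must pass through this one function, mirroring
    the centralization that can bottleneck the master node."""
    totals = Counter()
    for report in node_reports:        # each report: {content_name: hits}
        totals.update(report)
    # Retain the globally most requested contents up to capacity.
    return [name for name, _ in totals.most_common(cache_slots)]

# Example: three neighbor nodes report their local hit counts.
reports = [{"v1": 5, "v2": 1}, {"v1": 2, "v3": 4}, {"v2": 3, "v3": 1}]
print(master_retention_plan(reports, 2))   # ['v1', 'v3']
```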
“…These stream applications exhibit high data parallelism, computational intensity, and data locality characteristics. 1,2 Compared to traditional desktop applications, stream applications perform intensive arithmetic operations on each piece of data retrieved from internal memory. Most computations in stream applications can be parallelized at the data, thread, and task levels.…”
Section: Introduction
confidence: 99%
“…This integration enables information gathering, input, storage, processing, and output on a single chip. 1,2 Modern embedded systems, including mobile phones and game consoles, place high demands on multimedia processor performance, particularly for graphics, images, and videos. As a result, Graphics Processing Units (GPUs) are often integrated into SoC chips.…”
Section: Introduction
confidence: 99%
“…CPU access requests are typically latency-sensitive, requiring quick service, while GPU access requests are bandwidth-sensitive, necessitating high-bandwidth service to ensure real-time image processing. 1,2 Consequently, the shared utilization mode of on-chip cache has a certain impact on the performance of both CPUs and GPUs, as it becomes challenging to meet the low latency demands of CPUs and the high bandwidth requirements of GPUs simultaneously. As the integration of CPUs and GPUs on SoC chips continues to increase, the issue of memory access contention between the two processing units becomes a pressing technical problem that needs urgent resolution.…”
Section: Introduction
confidence: 99%