2020
DOI: 10.1109/tcomm.2020.2970950
Device-to-Device Coded-Caching With Distinct Cache Sizes

Abstract: This paper considers a cache-aided device-to-device (D2D) system where the users are equipped with cache memories of different sizes. During low traffic hours, a server places content in the users' cache memories, knowing that the files requested by the users during peak traffic hours will have to be delivered by D2D transmissions only. The worst-case D2D delivery load is minimized by jointly designing the uncoded cache placement and linear coded D2D delivery. Next, a novel lower bound on the D2D delivery load …
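To make the placement/delivery structure in the abstract concrete, the following is a minimal toy sketch of coded D2D delivery in the *equal* cache-size special case (the classic subfile-splitting scheme the distinct-size design generalizes), not the paper's optimized placement for unequal caches. All names, file contents, and the 3-user/3-file sizing are illustrative assumptions.

```python
from itertools import combinations

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

K = 3                    # toy network: 3 users (hypothetical sizing)
t = 2                    # each user caches a t/K = 2/3 fraction of every file
users = list(range(K))
files = {0: b'aabbcc', 1: b'ddeeff', 2: b'gghhii'}   # 3 files, 6 bytes each

subsets = list(combinations(users, t))   # subfile labels T, |T| = t
sub_len = len(files[0]) // len(subsets)  # 2 bytes per subfile
piece_len = sub_len // t                 # each subfile split into t pieces

# Split each file into subfiles (n, T), and each subfile into
# pieces (n, T, i), one piece per member i of T.
piece = {}
for n, data in files.items():
    for si, T in enumerate(subsets):
        sub = data[si * sub_len:(si + 1) * sub_len]
        for pi, i in enumerate(T):
            piece[(n, T, i)] = sub[pi * piece_len:(pi + 1) * piece_len]

# Placement: user k caches every piece whose subfile label T contains k.
cache = {k: {key: v for key, v in piece.items() if k in key[1]}
         for k in users}

demand = {k: k for k in users}   # worst case: all users request distinct files

def label(j):
    # The one subfile of its requested file that user j does NOT cache.
    return tuple(sorted(set(users) - {j}))

# D2D delivery: each user k multicasts one XOR of pieces it holds.
tx = {}
for k in users:
    msg = bytes(piece_len)
    for j in users:
        if j != k:
            msg = xor(msg, piece[(demand[j], label(j), k)])
    tx[k] = msg

# Decoding: user j strips the interference terms it already caches.
recovered = {j: {} for j in users}
for j in users:
    for k in users:
        if k == j:
            continue
        val = tx[k]
        for jp in users:
            if jp not in (j, k):
                val = xor(val, cache[j][(demand[jp], label(jp), k)])
        recovered[j][(demand[j], label(j), k)] = val

# Reassemble each requested file from cached subfiles + recovered pieces.
reconstructed = {}
for j in users:
    out = b''
    for T in subsets:
        src = cache[j] if j in T else recovered[j]
        out += b''.join(src[(demand[j], T, i)] for i in T)
    reconstructed[j] = out
```

Every user recovers its full request while the total D2D load is half a file, matching the known equal-cache D2D trade-off R = N/M - 1 = 1/2 for this toy point; the paper's contribution is optimizing this jointly when the cache sizes differ.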

Cited by 37 publications (15 citation statements)
References 50 publications
“…Depending on the network topology, this coded variant can be a more powerful approach than traditional prefetching, because it employs caching not only to change the volume of the communication problem, but also to change the structure of the problem itself, simply by changing the interference patterns. Coded caching has been rightfully credited with being able to transform memory into data rates, and has hence sparked a flurry of research on a variety of topics such as on the interplay between caching and PHY [11]- [21], caching and privacy [22]- [24], on information-theoretic converses [25], [26], on the critical bottleneck of subpacketization [27]- [31], and a variety of other scenarios [32]- [37].…”
Section: Introduction
confidence: 99%
“…However, during the peak traffic period, the current best way is to cache the content in MDs, and users can mainly transmit the requested content through D2D. In a cache-enabled MDs system, Ibrahim et al. [78] proposed a coded caching scheme that minimizes the worst-case delivery load for D2D-based content delivery to users with unequal cache sizes. They considered the situation of users with different storage capabilities to minimize the D2D transmission load under poor network conditions.…”
Section: Caching Policies
confidence: 99%
“…For heterogeneous cache-aided D2D networks where users are equipped with cache memories of distinct sizes, ref. [39] minimized the delivery load by optimizing over the partition during the placement phase and the size and structure of D2D during the delivery phase. A highly dense wireless network with device mobility was investigated in [40].…”
Section: Introduction
confidence: 99%