Proceedings of the 2021 ACM-SIAM Symposium on Discrete Algorithms (SODA) 2021
DOI: 10.1137/1.9781611976465.180

Tight Bounds for Parallel Paging and Green Paging

Abstract: In the parallel paging problem, there are p processors that share a cache of size k. The goal is to partition the cache among the processors over time in order to minimize their average completion time. For this long-standing open problem, we give tight upper and lower bounds of Θ(log p) on the competitive ratio with O(1) resource augmentation. A key idea in both our algorithms and lower bounds is to relate the problem of parallel paging to the seemingly unrelated problem of green paging. In green paging, there…

Cited by 6 publications (2 citation statements)
References 30 publications (65 reference statements)
“…There are very few theoretical guarantees, however, for the performance of these algorithms. Furthermore, most existing guarantees on online multicore caching algorithms are negative [31,25], but resource augmentation may be helpful in some cases [1,2].…”
Section: Introduction
confidence: 99%
“…We use "multicore caching" because it more accurately reflects the problem studied in this paper. 2 For a cost-minimization problem, an online algorithm has a competitive ratio of c if its cost on any input never exceeds c times the cost of an optimal offline algorithm for the same input (up to an additive constant). 3 LRU is an online caching algorithm that evicts the leastrecently-requested page.…”
Section: Introductionmentioning
confidence: 99%
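The citation statement above mentions LRU, the classic eviction policy for a single shared cache. A minimal sketch of LRU eviction, assuming a cache of size k holding page identifiers (the class and method names here are illustrative, not from the paper):

```python
from collections import OrderedDict

class LRUCache:
    """Illustrative LRU cache: evicts the least-recently-requested page."""

    def __init__(self, k):
        self.k = k                   # cache size (number of pages it can hold)
        self.pages = OrderedDict()   # iteration order tracks recency of use

    def request(self, page):
        """Serve a page request; return True on a hit, False on a fault."""
        if page in self.pages:
            self.pages.move_to_end(page)       # mark as most recently used
            return True
        if len(self.pages) >= self.k:
            self.pages.popitem(last=False)     # evict least-recently-used page
        self.pages[page] = True
        return False
```

On a cache of size k, LRU is known to be k-competitive for sequential paging; the parallel setting studied in the paper concerns how to split such a cache among p processors over time.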