2017
DOI: 10.1145/3149371

TinyLFU: A Highly Efficient Cache Admission Policy

Abstract: This article proposes to use a frequency-based cache admission policy in order to boost the effectiveness of caches subject to skewed access distributions. Given a newly accessed item and an eviction candidate from the cache, our scheme decides, based on the recent access history, whether it is worth admitting the new item into the cache at the expense of the eviction candidate. This concept is enabled through a novel approximate LFU structure called …
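As a rough illustration of the admission idea described in the abstract (not the authors' exact implementation), the Python sketch below keeps a small count-min-style frequency table over recent accesses and admits a newly accessed item only when its estimated frequency exceeds that of the eviction candidate. The class names, table dimensions, and hashing scheme are assumptions chosen for brevity.

import random

class FrequencySketch:
    """Approximate frequency counter (a simplified count-min-style table).

    Illustrative stand-in for the approximate LFU structure the abstract
    refers to, not the paper's exact data structure.
    """

    def __init__(self, width=1024, depth=4):
        self.width = width
        self.depth = depth
        self.tables = [[0] * width for _ in range(depth)]
        # Independent seeds so each row hashes the key differently.
        self.seeds = [random.randrange(1 << 30) for _ in range(depth)]

    def _index(self, key, row):
        return hash((self.seeds[row], key)) % self.width

    def record_access(self, key):
        # Count every access, whether or not the item is cached.
        for row in range(self.depth):
            self.tables[row][self._index(key, row)] += 1

    def estimate(self, key):
        # Count-min estimate: the minimum over the rows.
        return min(self.tables[row][self._index(key, row)]
                   for row in range(self.depth))

def admit(sketch, new_key, victim_key):
    """Frequency-based admission: keep whichever item was accessed more
    often in the recent history captured by the sketch."""
    return sketch.estimate(new_key) > sketch.estimate(victim_key)

In this sketch, a cache would call record_access on every request and consult admit only when it is full and must choose between the incoming item and its eviction candidate; the paper's structure is considerably more compact and also ages its counters, which a later example touches on.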

Cited by 91 publications (6 citation statements) · References 43 publications
“…In addition to the traditionally explored access history and cache-related statistics in existing admission solutions [12,29,28], such as frequency and recency, our approach also considers caching metadata retrieved from the application during its execution, such as cost to retrieve, user sessions, cache size and data size. Such metadata are in fact information that developers use while designing and implementing application-level caching, and thus enrich the application model with valuable application-specific information regarding the applicability of caching.…”
Section: Discussion
confidence: 99%
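The statement above mentions admission decisions that weigh application-level metadata such as retrieval cost and data size alongside access statistics. The following sketch shows one hypothetical way to fold such metadata into a simple benefit score; the field names, score formula, and comparison rule are illustrative assumptions, not taken from the cited approach.

from dataclasses import dataclass

@dataclass
class CacheMetadata:
    # Illustrative application-level metadata; field names are hypothetical.
    retrieval_cost_ms: float   # how expensive the item is to recompute or refetch
    size_bytes: int            # how much cache space the item consumes
    recent_accesses: int       # access count over a recent window

def caching_benefit(meta: CacheMetadata) -> float:
    """Toy benefit score: expected time saved per byte of cache spent."""
    if meta.size_bytes == 0:
        return 0.0
    return meta.recent_accesses * meta.retrieval_cost_ms / meta.size_bytes

def admit_with_metadata(new_item: CacheMetadata, victim: CacheMetadata) -> bool:
    """Admit the new item only if its benefit score beats the victim's."""
    return caching_benefit(new_item) > caching_benefit(victim)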
“…Also focusing on filtering content, TinyLFU [28] uses an approximate LFU structure, which maintains a representation of the access frequency of recently accessed contents, to boost the admission effectiveness of caches. TinyLFU acts reactively when the cache is full and decides whether it is worthwhile admitting content, considering the cost of an eviction and the usefulness of the new content.…”
Section: Identification of Caching Opportunities
confidence: 99%
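The quote above notes that the approximate LFU structure tracks the frequency of recently accessed contents. One common way to keep such a structure focused on recent history is to periodically halve all counters so old accesses decay; the standalone sketch below illustrates that aging step (it uses an exact dictionary for brevity where the paper uses a compact approximate sketch, and the reset threshold is an assumption).

class AgingFrequencyCounter:
    """Illustrative recent-frequency counter: periodically halves all counts
    so that only recent access history dominates the estimates."""

    def __init__(self, sample_size=10_000):
        self.counts = {}                 # item -> approximate recent frequency
        self.sample_size = sample_size   # illustrative reset threshold
        self.events = 0

    def record_access(self, key):
        self.counts[key] = self.counts.get(key, 0) + 1
        self.events += 1
        if self.events >= self.sample_size:
            self._age()

    def _age(self):
        # Halve every counter; older accesses decay geometrically over time.
        self.counts = {k: v // 2 for k, v in self.counts.items() if v // 2 > 0}
        self.events //= 2

    def estimate(self, key):
        return self.counts.get(key, 0)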
“…Caching Algorithms. Caching algorithms distinguish the hotness of objects using recency [73,86], frequency [21] and other access information [11], or combining various information together [7,10-12] to get higher hit rates. Recently, there are many machine-learning-based adaptive caching algorithms [6,47,59,74].…”
Section: Related Work
confidence: 99%
“…The conventional approach to caching, the Most Popular Content (MPC) policy, fills each cache with the most popular items [4]. In the case when the popularity of content is unknown, policies such as Least Recently Used (LRU) [5], Least Frequently Used (LFU) [6] and Time-To-Live (TTL) [7] have been designed to achieve a placement that performs closely to that of MPC. Otherwise, if the popularity is known or predicted, there are policies [8]-[17] that outperform MPC in networks.…”
Section: A. Related Work
confidence: 99%
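The policies named in this quote differ mainly in what they assume about item popularity. A minimal sketch of the MPC idea (when popularities are known or predicted, keep the most popular items) might look as follows; the function name and example values are chosen purely for illustration.

def most_popular_content(popularity: dict, cache_size: int) -> set:
    """MPC-style placement sketch: given known (or predicted) popularities,
    keep the cache_size most popular items."""
    ranked = sorted(popularity, key=popularity.get, reverse=True)
    return set(ranked[:cache_size])

# Example: with popularities known, a cache of size 2 holds the two hottest items.
placement = most_popular_content({"a": 0.5, "b": 0.3, "c": 0.2}, cache_size=2)
assert placement == {"a", "b"}

LRU, LFU, and TTL approximate this placement without knowing the popularities up front, by inferring hotness from recency, frequency, or freshness of accesses respectively.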
“…The authors of [8] proved that P_ICP is a concave function of the caching probability vector {q_j}, j ∈ [n], and give the hit probability under ICP by solving a concave maximization problem with constraint (6). ICP: Maximize: …”
Section: A. Independent Caching Policy
confidence: 99%
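The quoted passage elides the objective of the ICP optimization, so the LaTeX below shows only one common illustrative formulation of a hit-probability maximization over independent caching probabilities, not necessarily the exact problem or constraint (6) of [8]; the number of caches m is a hypothetical parameter.

\[
\begin{aligned}
\max_{\{q_j\}} \quad & P_{\mathrm{ICP}} \;=\; \sum_{j \in [n]} p_j \left(1 - (1 - q_j)^m\right) \\
\text{s.t.} \quad & \sum_{j \in [n]} q_j \le C, \qquad 0 \le q_j \le 1 \quad \forall j \in [n],
\end{aligned}
\]

where p_j is the request probability of item j, q_j its independent caching probability, and C the cache budget. Each term 1 - (1 - q_j)^m is concave in q_j, so P_ICP is concave in {q_j}, which is consistent with the concavity claim in the quote.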