2021
DOI: 10.48550/arxiv.2111.08824
Preprint

The Case for Learned In-Memory Joins

Abstract: In-memory join is an essential operator in any database engine and has been extensively investigated in the database literature. In this paper, we study whether exploiting CDF-based learned models to boost join performance is practical. To the best of our knowledge, we are the first to fill this gap. We investigate the use of CDF-based partitioning and learned indexes (e.g., RMI and RadixSpline) in three join categories: indexed nested loop join (INLJ), sort-based joins (SJ), and hash-based…
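
As a rough illustration of the CDF-based partitioning idea from the abstract, the sketch below routes join keys to partitions via floor(CDF(key) * fanout). It is not the paper's implementation: a sorted key sample stands in for a learned model such as RMI or RadixSpline, and the names SampledCDF and cdf_partition are hypothetical.

```cpp
// Minimal sketch (not the paper's code): partition join keys with a shared
// approximation of the key CDF, so matching keys always land in the same
// partition and partitions stay balanced even under skew.
#include <algorithm>
#include <cstdint>
#include <vector>

struct SampledCDF {                 // hypothetical stand-in for a learned index
    std::vector<uint64_t> sample;   // sorted, non-empty sample of join keys

    explicit SampledCDF(std::vector<uint64_t> s) : sample(std::move(s)) {
        std::sort(sample.begin(), sample.end());
    }
    // Approximate CDF in [0, 1]: fraction of sampled keys below `key`.
    double operator()(uint64_t key) const {
        auto it = std::lower_bound(sample.begin(), sample.end(), key);
        return static_cast<double>(it - sample.begin()) / sample.size();
    }
};

// Route every key to partition floor(CDF(key) * fanout).
std::vector<std::vector<uint64_t>>
cdf_partition(const std::vector<uint64_t>& keys,
              const SampledCDF& cdf, size_t fanout) {
    std::vector<std::vector<uint64_t>> parts(fanout);
    for (uint64_t k : keys) {
        size_t p = std::min(fanout - 1, static_cast<size_t>(cdf(k) * fanout));
        parts[p].push_back(k);
    }
    return parts;
}
```

Applying the same model to both join inputs ensures equal keys meet in the same partition, so partition i of R only has to be joined against partition i of S, and a data-aware CDF keeps partition sizes balanced even for skewed key distributions.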

Cited by 2 publications (2 citation statements)
References 28 publications
“…Other works that are related to this thesis are the Learned In-Memory Joins [SK21], particularly Learned Sort-Join. Learned Sort-Join profits from the eCDF modeling approach discussed earlier to speed up both the partitioning and the chunked-join phases of the algorithm.…”
Section: Related Work
confidence: 99%
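
The "chunked-join phase" mentioned in this statement can be pictured as joining each pair of co-partitions independently once both inputs have been CDF-partitioned as sketched above. The code below is a generic sort-and-merge join of one chunk pair, not code from [SK21] or the citing thesis; merge_join_chunk is an illustrative name.

```cpp
// Illustrative sketch only: join one pair of co-partitions (chunks) by
// sorting each side and merge-joining them, handling duplicate keys.
#include <algorithm>
#include <cstdint>
#include <utility>
#include <vector>

std::vector<std::pair<uint64_t, uint64_t>>
merge_join_chunk(std::vector<uint64_t> r, std::vector<uint64_t> s) {
    std::sort(r.begin(), r.end());
    std::sort(s.begin(), s.end());
    std::vector<std::pair<uint64_t, uint64_t>> out;
    size_t i = 0, j = 0;
    while (i < r.size() && j < s.size()) {
        if (r[i] < s[j])      ++i;
        else if (s[j] < r[i]) ++j;
        else {
            // Emit the cross product of the equal-key runs on both sides.
            size_t i0 = i;
            while (i < r.size() && r[i] == s[j]) ++i;
            size_t jj = j;
            while (jj < s.size() && s[jj] == r[i0]) {
                for (size_t ii = i0; ii < i; ++ii) out.emplace_back(r[ii], s[jj]);
                ++jj;
            }
            j = jj;
        }
    }
    return out;
}
```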
“…In addition to the above, other lines of research have focused on learned data structures [6] used for efficiently querying large indices [10][11][12][13], automatic database configuration tuning, accurate query run time prediction and other problems [19].…”
Section: Introduction
confidence: 99%