Proceedings of the 11th International Workshop on Data Management on New Hardware (DaMoN 2015)
DOI: 10.1145/2771937.2771943
Efficient Lightweight Compression Alongside Fast Scans

Abstract: The increasing main-memory capacity has allowed query execution to occur primarily in main memory. Database systems employ compression, not only to fit the data in main memory, but also to address the memory bandwidth bottleneck. Lightweight compression schemes focus on efficiency over compression rate and allow query operators to process the data in compressed form. For instance, dictionary compression keeps the distinct column values in a sorted dictionary and stores the values as index codes with the minimu…
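As a rough illustration of the dictionary compression described above (a sketch of my own, not the paper's code; the names DictColumn and dictionary_encode are invented for the example), the following C++ snippet builds a sorted dictionary of the distinct column values, replaces each value with its index code, and derives the minimum number of bits per code:

// Sketch only: sorted dictionary plus index codes of minimal width.
#include <algorithm>
#include <cstdint>
#include <cstdio>
#include <vector>

struct DictColumn {
    std::vector<int64_t>  dict;   // sorted distinct values
    std::vector<uint32_t> codes;  // one index code per row
    unsigned bits;                // minimum bits needed per code
};

DictColumn dictionary_encode(const std::vector<int64_t>& column) {
    DictColumn out;
    out.dict = column;
    std::sort(out.dict.begin(), out.dict.end());
    out.dict.erase(std::unique(out.dict.begin(), out.dict.end()), out.dict.end());

    out.codes.reserve(column.size());
    for (int64_t v : column) {
        // A sorted dictionary makes the codes order-preserving, so range
        // predicates can be evaluated directly on the compressed codes.
        auto it = std::lower_bound(out.dict.begin(), out.dict.end(), v);
        out.codes.push_back(static_cast<uint32_t>(it - out.dict.begin()));
    }
    out.bits = 1;
    while ((1u << out.bits) < out.dict.size()) ++out.bits;
    return out;
}

int main() {
    DictColumn c = dictionary_encode({42, 7, 7, 1000, 42});
    std::printf("%zu distinct values, %u bits per code\n", c.dict.size(), c.bits);
    return 0;
}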

Cited by 23 publications (16 citation statements). References 25 publications (21 reference statements).
“…Let us now turn our attention to data-parallel execution using SIMD operations. There has been extensive research investigating SIMD for database operations [51,50,36,37,38,35,46,44]. It is not surprising that this research generally assumes a vectorized execution model.…”
Section: Data-parallel Execution (SIMD) (mentioning)
confidence: 99%
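To make the data-parallel pattern referred to here concrete, below is a minimal selection-scan sketch using AVX2 intrinsics. It is not taken from any of the cited works; the function name simd_less_than, the predicate, and the bitmap layout are assumptions made for the example. Eight 32-bit dictionary codes are compared per iteration and one result bit is produced per code:

// Sketch only (assumes AVX2): evaluate code < key on eight codes at a time.
#include <immintrin.h>
#include <cstdint>
#include <cstdio>
#include <vector>

std::vector<uint8_t> simd_less_than(const uint32_t* codes, size_t n, uint32_t key) {
    std::vector<uint8_t> bitmap((n + 7) / 8, 0);
    // Signed comparison is safe as long as codes and key fit in 31 bits,
    // which holds for dictionary codes of realistic column cardinalities.
    __m256i vkey = _mm256_set1_epi32(static_cast<int32_t>(key));
    size_t i = 0;
    for (; i + 8 <= n; i += 8) {
        __m256i v  = _mm256_loadu_si256(reinterpret_cast<const __m256i*>(codes + i));
        __m256i lt = _mm256_cmpgt_epi32(vkey, v);                   // key > code  <=>  code < key
        int mask   = _mm256_movemask_ps(_mm256_castsi256_ps(lt));   // one bit per lane
        bitmap[i / 8] = static_cast<uint8_t>(mask);
    }
    for (; i < n; ++i)                                              // scalar tail
        if (codes[i] < key) bitmap[i / 8] |= static_cast<uint8_t>(1u << (i % 8));
    return bitmap;
}

int main() {
    std::vector<uint32_t> codes = {3, 9, 1, 7, 12, 0, 5, 8, 2, 15};
    auto bm = simd_less_than(codes.data(), codes.size(), 8);
    for (size_t i = 0; i < codes.size(); ++i)
        std::printf("%u %s 8\n", codes[i], ((bm[i / 8] >> (i % 8)) & 1) ? "<" : ">=");
    return 0;
}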
“…We also note that recent work on advanced bit-packing schemes [22,13] and their SIMD implementation [27] focus on the benefits of early filtering and either ignore the high per-tuple cost of bit-unpacking, or just position these bit-packed formats as a secondary storage structure. Our choice for byte-aligned storage mostly avoids this penalty and makes early filtering beneficial in a broader range of query workloads (see Section 5.4).…”
Section: Related Work (mentioning)
confidence: 99%
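For context on the bit-unpacking cost this quote refers to, here is a small scalar sketch of my own (not code from [22,13] or [27]): it unpacks n codes of a given bit width from a contiguous stream of 64-bit words, lowest bits first. The per-tuple shift/mask work, and the occasional word-boundary straddle, are exactly what byte-aligned codes avoid, since an 8- or 16-bit code is a plain load:

// Sketch only: unpack tightly bit-packed codes from 64-bit words.
#include <cstdint>
#include <cstdio>
#include <vector>

std::vector<uint32_t> unpack(const uint64_t* packed, size_t n, unsigned bits) {
    std::vector<uint32_t> out(n);
    const uint64_t mask = (bits == 64) ? ~0ull : ((1ull << bits) - 1);
    size_t bitpos = 0;
    for (size_t i = 0; i < n; ++i, bitpos += bits) {
        size_t word = bitpos / 64, off = bitpos % 64;
        uint64_t v = packed[word] >> off;
        if (off + bits > 64)                        // code straddles a word boundary
            v |= packed[word + 1] << (64 - off);
        out[i] = static_cast<uint32_t>(v & mask);
    }
    return out;
}

int main() {
    // Three 5-bit codes 10110, 00111, 11001 packed from bit 0 of one word.
    uint64_t packed[1] = { 0b11001'00111'10110ull };
    for (uint32_t c : unpack(packed, 3, 5)) std::printf("%u\n", c);  // prints 22, 7, 25
    return 0;
}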
“…Vectorization enables the use of SIMD and facilitates algorithms that access data belonging to multiple tuples in parallel. In the database context, SIMD instructions have been used for example, to speed up selection scans [37,35,26], for bit unpacking [35,27], bulk loading [24], sorting [6], and breadth-first search on graphs [32].…”
Section: Related Work (mentioning)
confidence: 99%