SC20: International Conference for High Performance Computing, Networking, Storage and Analysis 2020
DOI: 10.1109/sc41405.2020.00087

Foresight: Analysis That Matters for Data Reduction

Cited by 21 publications (15 citation statements)
References 31 publications
“…To control the accuracy in the reduced data, error-bounded lossy compression algorithms such as SZ [3] and ZFP [4] provide order-of-magnitude larger reductions than lossless compression while meeting a user-specified level of data fidelity. However, setting the compressor's error bound to ensure data fidelity is an open question [6], [18], [19].…”
Section: A Lossy Data Reduction (mentioning)
Confidence: 99%
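The error-bound guarantee described in the statement above can be illustrated with a minimal sketch. Real compressors such as SZ and ZFP add prediction and transform stages before encoding; the uniform scalar quantizer below is a hypothetical stand-in (not either tool's algorithm) that shows how an absolute error bound is enforced:

```python
import numpy as np

def lossy_compress(data, eb):
    """Illustrative error-bounded 'compressor': uniform scalar quantization.
    A bin width of 2*eb guarantees |x - x'| <= eb for every value.
    (SZ and ZFP layer prediction/transform stages on top of quantization.)"""
    return np.round(data / (2 * eb)).astype(np.int64)  # small integer codes

def decompress(codes, eb):
    return codes * (2 * eb)  # reconstruct each value at its bin center

# The absolute error bound holds for any input:
data = np.random.default_rng(0).normal(size=1000)
recon = decompress(lossy_compress(data, eb=1e-2), eb=1e-2)
assert np.max(np.abs(data - recon)) <= 1e-2
```

The integer codes are what a real compressor would then entropy-encode; the point here is only that the user-specified bound `eb` is respected pointwise.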
“…We conduct our evaluation with Foresight [16], an open-source toolkit used to evaluate, analyze, and visualize lossy compressors for extreme-scale cosmological simulations. We modified the toolkit so that we can gather the parameters our framework needs to deploy adaptive lossy compression configurations to the various data partitions in the Nyx cosmological simulation.…”
Section: Experimental Setup and Dataset (mentioning)
Confidence: 99%
“…Apart from fine-grained adaptive compression, we must also be able to precisely control the compression error for domain-specific post-hoc analysis. Research has shown that general-purpose data distortion metrics, such as peak signal-to-noise ratio (PSNR), normalized root-mean-square error, mean relative error (MRE), and mean square error (MSE), cannot on their own satisfy the quality requirements of cosmological-simulation post-hoc analysis [16,21]. For example, PSNR does not tell us how the mass of a halo would be affected by compression.…”
Section: Introduction (mentioning)
Confidence: 99%
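The general-purpose distortion metrics named in the statement above have compact standard definitions. The sketch below uses common conventions (PSNR and NRMSE normalized by the value range of the original data); exact definitions vary slightly across papers, which is part of why such metrics alone under-specify domain fidelity:

```python
import numpy as np

def distortion_metrics(orig, recon):
    """General-purpose distortion metrics commonly reported for lossy
    compression. PSNR and NRMSE are normalized here by the value range
    of the original data; other normalizations appear in the literature."""
    diff = orig - recon
    mse = np.mean(diff ** 2)                         # mean square error
    rng = orig.max() - orig.min()                    # value range of original
    nrmse = np.sqrt(mse) / rng                       # normalized RMSE
    psnr = 20 * np.log10(rng) - 10 * np.log10(mse)   # PSNR in dB
    mre = np.mean(np.abs(diff) / np.maximum(np.abs(orig), 1e-12))  # mean rel. error
    return {"MSE": mse, "NRMSE": nrmse, "PSNR": psnr, "MRE": mre}
```

None of these aggregates carries domain meaning such as halo mass, which is the gap the quoted passage points out.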
“…Compression of scientific data has been identified as a major data reduction technique to address this issue. More specifically, the new generation of error-bounded lossy compression techniques, such as SZ [5]–[7] and ZFP [8], have been widely used in the scientific community [4]–[13]. Compared to lossless compression, which typically achieves only a 2× compression ratio [14] on scientific data, error-bounded lossy compressors provide much higher compression ratios with controllable loss of accuracy.…”
Section: Introduction (mentioning)
Confidence: 99%
“…However, for HDF5 to take advantage of lossy compressors, it is essential for users to identify the optimal trade-off between compression ratio and compressed data quality, which is fairly complex. Since no analytical model is available to foresee/estimate the compression quality accurately, the configuration settings (such as error bound types and values) of error-bounded lossy compressors for scientific applications rely on empirical validations/studies based on domain scientists' trial-and-error experiments [11]–[13]. The trial-and-error method suffers from two significant drawbacks, which lead to significant issues in practice.…”
Section: Introduction (mentioning)
Confidence: 99%
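The trial-and-error workflow criticized in the statement above amounts to a search loop over candidate error bounds. A minimal sketch, assuming a simple uniform quantizer as a stand-in compressor and a PSNR floor as the quality target (both illustrative choices, not the paper's actual setup):

```python
import numpy as np

def quantize(data, eb):
    # Illustrative error-bounded "compressor": uniform quantization, |error| <= eb.
    return np.round(data / (2 * eb)) * (2 * eb)

def psnr(orig, recon):
    mse = np.mean((orig - recon) ** 2)
    rng = orig.max() - orig.min()
    return 20 * np.log10(rng) - 10 * np.log10(mse)

def trial_and_error(data, psnr_target, eb=1.0, shrink=0.5, max_trials=50):
    """Empirically tighten the error bound until the quality target is met --
    each iteration is a full compress/decompress/analyze cycle, which is the
    cost an analytical quality model would avoid."""
    for _ in range(max_trials):
        if psnr(data, quantize(data, eb)) >= psnr_target:
            return eb
        eb *= shrink
    raise RuntimeError("no tried error bound met the quality target")
```

Each loop iteration here is cheap, but in practice one "trial" means recompressing and re-analyzing a full simulation snapshot, which is why the passage calls the approach impractical.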