2017 IEEE 19th International Conference on High Performance Computing and Communications; IEEE 15th International Conference on Smart City; IEEE 3rd International Conference on Data Science and Systems (HPCC/SmartCity/DSS)
DOI: 10.1109/hpcc-smartcity-dss.2017.1
Analysis and Modeling of the End-to-End I/O Performance on OLCF's Titan Supercomputer

Cited by 11 publications (2 citation statements)
References 26 publications
“…Such a large amount of data is often generated in a parallel manner from a scaling number of ranks, on which each holds a proportion of the data and must introduce an extra collective communication to dump the entire snapshot to the file system. This process takes an unprecedented challenge to I/O bandwidths and storage systems on today's HPC systems [7,30,54,55]. Therefore, it is urgent to develop effective data reduction methods to reduce the size of data movement between memories and storage systems such as parallel file systems.…”
Section: Introduction (mentioning; confidence: 99%)
“…With the increase in scale of such simulations, saving all the raw data generated to disk becomes impractical due to: 1) limited storage capacity, and 2) the I/O bandwidth required to save this data to disk can create bottlenecks in the simulation [4,38,39] . For example, one Nyx simulation with a resolution of 4096 × 4096 × 4096 cells can generate up to 2.8 TB of data for a single snapshot; a total of 2.8 PB of disk storage is needed assuming running the simulation 5 times with 200 snapshots dumped per simulation.…”
Section: Introduction (mentioning; confidence: 99%)
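The storage figures in the excerpt above follow from simple arithmetic; the sketch below only re-derives them using the values quoted in the excerpt (2.8 TB per snapshot, 200 snapshots per run, 5 simulation runs — no other assumptions):

```python
# Back-of-the-envelope check of the storage total quoted in the excerpt.
# All input values come from the quoted text itself.
tb_per_snapshot = 2.8    # one 4096^3 Nyx snapshot, in TB
snapshots_per_run = 200  # snapshots dumped per simulation
runs = 5                 # simulation repetitions

total_tb = tb_per_snapshot * snapshots_per_run * runs
total_pb = total_tb / 1000  # decimal units: 1 PB = 1000 TB

print(f"{total_pb:.1f} PB")  # → 2.8 PB, matching the excerpt's total
```

The agreement (2.8 TB × 200 × 5 = 2800 TB = 2.8 PB) confirms the excerpt's figure uses decimal (SI) units rather than binary (PiB) units.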