2021
DOI: 10.1109/tpds.2021.3132417
Lossy Compression of Communication Traces Using Recurrent Neural Networks

Cited by 6 publications (7 citation statements)
References 38 publications
“…This definition is based on two points. (1) Existing studies [13,15,24,31,34,43] prefer to use the interval between two communication events to indicate a portion of computation time cost. Our definition aligns with the convention.…”
Section: Tracing Computation Events
confidence: 99%
“…The major difficulty is the huge storage consumption of behavior record, which is often called trace, from multiple cluster nodes. For instance, tracing a single execution of a mini-app LULESH [18] on less than 1,000 processors can produce hundreds of gigabytes trace data [31]. As a compromise, existing methods mainly focus on communication trace [13,15,24,31,34,43], since communication events constitute the skeleton of an HPC application.…”
Section: Introduction
confidence: 99%