Proceedings of the Conference on High Performance Computing Networking, Storage and Analysis 2009
DOI: 10.1145/1654059.1654115
Evaluating similarity-based trace reduction techniques for scalable performance analysis

Abstract: Event traces are required to correctly diagnose a number of performance problems that arise on today's highly parallel systems. Unfortunately, the collection of event traces can produce a large volume of data that is difficult, or even impossible, to store and analyze. One approach for compressing a trace is to identify repeating trace patterns and retain only one representative of each pattern. However, determining the similarity of sections of traces, i.e., identifying patterns, is not straightforward. In th…

Cited by 28 publications (15 citation statements). References 41 publications.
“…Trace reduction is the compression of traces in some manner (either lossless or lossy) so that they can be stored and processed efficiently (Kaplan et al. 1999; Mohror and Karavanic 2009). The process of collapsing the commit graph can be seen as a particular case of trace reduction.…”
Section: Trace Reduction Methods and Automatic Collapsing
confidence: 99%
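The statement above describes pattern-based trace reduction: find repeating sections of an event trace and keep one representative of each. A minimal, hedged sketch of that idea follows; the function name, the fixed-window matching scheme, and the `(pattern, count)` output format are all illustrative assumptions, not the algorithm of any cited paper (real tools match patterns far more flexibly).

```python
def reduce_trace(events, window=3):
    """Lossy trace-reduction sketch: collapse immediately repeating
    windows of events into (representative_pattern, repeat_count) pairs.
    Hypothetical scheme for illustration only."""
    reduced = []
    i = 0
    n = len(events)
    while i < n:
        block = events[i:i + window]
        count = 1
        j = i + window
        # count verbatim repetitions of this window
        while events[j:j + window] == block and j + len(block) <= n:
            count += 1
            j += window
        reduced.append((block, count))
        i = j
    return reduced


# Example: a loop of three MPI-like events repeated four times, then a barrier.
trace = ["send", "recv", "wait"] * 4 + ["barrier"]
summary = reduce_trace(trace, window=3)
```

With the example trace, thirteen events reduce to two entries: the three-event loop body with a repeat count of 4, and the final barrier. The compression is lossy in the sense the quote describes: only one representative of each repeated pattern survives.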
“…Our method is based on an idea by Mohror and Karavanic, who introduce a method that uses wavelets for direct similarity checking using distance metrics [29]. However, we generalise the idea under the assumption that comparing time and frequency coefficients between the complementary wavelet components of the two signals provides a more realistic comparison than comparing their distance directly.…”
Section: Methods
confidence: 99%
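To make the contrast in this statement concrete, here is a hedged sketch of comparing two signals level by level in the wavelet domain rather than taking the raw Euclidean distance of their samples. The Haar transform and the per-level distance sum are illustrative choices of mine (the cited works use their own wavelets and metrics), and the sketch assumes signal lengths that are powers of two.

```python
import math


def haar(signal):
    """Full Haar wavelet decomposition (assumes len(signal) is a power of two).
    Returns (approximation coefficients, list of detail-coefficient levels)."""
    coeffs = list(signal)
    details = []
    while len(coeffs) > 1:
        avg = [(coeffs[i] + coeffs[i + 1]) / math.sqrt(2)
               for i in range(0, len(coeffs), 2)]
        diff = [(coeffs[i] - coeffs[i + 1]) / math.sqrt(2)
                for i in range(0, len(coeffs), 2)]
        details.append(diff)
        coeffs = avg
    return coeffs, details


def euclid(a, b):
    """Euclidean distance between two equal-length coefficient vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def wavelet_distance(x, y):
    """Compare two signals component by component in the wavelet domain:
    distance between approximations plus distance at each detail level."""
    ax, dx = haar(x)
    ay, dy = haar(y)
    dist = euclid(ax, ay)
    for lx, ly in zip(dx, dy):
        dist += euclid(lx, ly)
    return dist
```

Summing distances per level weights coarse (approximation) and fine (detail) disagreements separately, which is one simple way to realise "comparing time and frequency coefficients between the complementary wavelet components" instead of a single raw-sample distance.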
“…Low memory per core ratios require an efficient management of trace data, which has been previously addressed by OTF2 [14], cCCG [15], a study of reduction techniques [16], and ScalaTrace [17]. While the latter three approaches are capable of reducing the trace data to a nearly constant trace size (depending on the granularity of the aggregation), they may be very time consuming.¹…”
Section: Related Work
confidence: 99%
“…In the section thereafter, we describe our event data compression and reduction management within a fixed-size memory footprint.…”
¹ The approaches in [16] and [17] do not give results for runtime overheads.
Section: Infrastructure
confidence: 99%
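The last two statements concern keeping trace data within a fixed-size memory footprint by aggregating at a coarser granularity as data grows. A hedged toy sketch of that trade-off follows; the bucket layout, the doubling policy, and the function name are hypothetical and not the scheme of any cited tool.

```python
def record_with_budget(events, budget=4):
    """Sketch: keep a trace summary inside a fixed-size buffer by doubling
    the aggregation granularity whenever the buffer fills. Each bucket is
    (start_index, event_count). Illustrative assumption, not a real tool's
    algorithm."""
    granularity = 1
    buckets = []  # at most `budget` (start_index, count) buckets
    for idx, _event in enumerate(events):
        if buckets and idx < buckets[-1][0] + granularity:
            # event falls into the current bucket's index range
            start, count = buckets[-1]
            buckets[-1] = (start, count + 1)
        else:
            if len(buckets) == budget:
                # buffer full: merge adjacent buckets, double granularity
                granularity *= 2
                merged = []
                for k in range(0, len(buckets), 2):
                    pair = buckets[k:k + 2]
                    merged.append((pair[0][0], sum(c for _, c in pair)))
                buckets = merged
            buckets.append((idx, 1))
    return buckets
```

The memory use stays bounded by `budget` buckets regardless of trace length, while each coarsening halves the temporal resolution, mirroring the quote's point that nearly constant trace size comes at the cost of aggregation granularity.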