2020
DOI: 10.1007/978-3-030-48340-1_23

Exploiting Historical Data: Pruning Autotuning Spaces and Estimating the Number of Tuning Steps

Abstract: Autotuning, the practice of automatic tuning of applications to provide performance portability, has received increased attention in the research community, especially in high performance computing. Ensuring high performance on a variety of hardware usually means modifications to the code, often via different values of a selected set of parameters, such as tiling size, loop unrolling factor or data layout. However, the search space of all possible combinations of these parameters can be large, which can result…
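To make the combinatorial growth concrete, the sketch below builds a tuning space as the Cartesian product of a few hypothetical parameters of the kind the abstract mentions (tile size, unroll factor, data layout). The parameter names and value sets are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch (not the paper's implementation): a tuning space as the
# Cartesian product of a few hypothetical parameters.
from itertools import product

# Hypothetical tuning parameters and candidate values (illustrative only).
tuning_parameters = {
    "TILE_SIZE": [8, 16, 32, 64],
    "UNROLL_FACTOR": [1, 2, 4, 8],
    "DATA_LAYOUT": ["row_major", "col_major"],
    "VECTOR_WIDTH": [1, 2, 4],
}

# Every combination of values is one candidate configuration to benchmark.
names = list(tuning_parameters)
configurations = [dict(zip(names, values))
                  for values in product(*tuning_parameters.values())]

print(len(configurations))  # 4 * 4 * 2 * 3 = 96 configurations
```

Adding one more parameter, or a few more values per parameter, multiplies this count, which is why the pruning methods discussed in the citation statements below matter.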

Cited by 2 publications (5 citation statements); references 20 publications.
“…As we can see in the tables, both methods are able to significantly reduce the size of the state space, while sacrificing only a few percent of performance in most cases. On GPU, the two pruning strategies usually result in very different search spaces—in fact, the results of the conservative pruning strategy are very similar to the results of the naive pruning method from our previous work [16], since in most cases, the parameters were pruned based on low mutual information rather than performance degradation.…”
Section: Evaluation Of Pruning Space Methods (supporting)
confidence: 69%
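The pruning criterion this statement refers to, low mutual information between a parameter and the measured performance, can be sketched as follows. This is a hedged illustration rather than the authors' implementation: it assumes historical results are available as (configuration, runtime) pairs, discretizes runtimes into a few bins, and drops parameters whose estimated mutual information with the binned runtime falls below an arbitrary threshold.

```python
# Minimal sketch (not the authors' code): drop tuning parameters whose values
# share little mutual information with the observed runtime.
import math
from collections import Counter

def mutual_information(xs, ys):
    """Mutual information (in bits) between two discrete sequences."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        mi += p_joint * math.log2(p_joint / ((px[x] / n) * (py[y] / n)))
    return mi

def prune_low_information_parameters(history, runtimes, n_bins=4, threshold=0.05):
    """history: list of configuration dicts; runtimes: measured times.
    Returns the parameters worth keeping (threshold is hypothetical)."""
    lo, hi = min(runtimes), max(runtimes)
    width = (hi - lo) / n_bins or 1.0
    bins = [min(int((t - lo) / width), n_bins - 1) for t in runtimes]
    kept = []
    for param in history[0]:          # assumes a non-empty, uniform history
        values = [cfg[param] for cfg in history]
        if mutual_information(values, bins) >= threshold:
            kept.append(param)
    return kept
```

A parameter whose values barely change the distribution of binned runtimes contributes little information and becomes a pruning candidate, which is what the quoted passage describes happening in most GPU cases under the conservative strategy.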
“…On GPU, the two pruning strategies usually result in very different search spaces—in fact, the results of the conservative pruning strategy are very similar to the results of the naive pruning method from our previous work [16], since in most cases, the parameters were pruned based on low mutual information rather than performance degradation.…”
Section: Tuning Space Pruning (supporting)
confidence: 68%
“…On the other hand, our model does not need to be trained on multiple inputs and GPUs. In our previous work [26], we used data measured on one hardware to prune dimensions on different hardware. While this approach works well for speeding up exhaustive search, it brings no advantage when coupled with a searcher based on mathematical optimization.…”
Section: Methods Using Historical Data (mentioning)
confidence: 99%
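As a rough illustration of the cross-hardware idea described in this statement (measurements from one device guiding pruning for another), the sketch below keeps, for every parameter, only the values that performed best on the source hardware before the search space for the target hardware is enumerated. The function names, the keep_fraction parameter, and the ranking heuristic are assumptions for illustration; the paper's actual pruning criterion is not reproduced here.

```python
# Minimal sketch (an assumption about the general idea, not the paper's method):
# use tuning results measured on a source GPU to shrink each parameter's value
# set before exhaustively searching on a different target GPU.
from itertools import product

def prune_with_source_data(tuning_parameters, source_history, keep_fraction=0.5):
    """source_history: list of (config_dict, runtime) pairs from the source GPU.
    Keeps, per parameter, the values with the lowest best runtime at the source."""
    pruned = {}
    for param, values in tuning_parameters.items():
        best = {v: min((t for cfg, t in source_history if cfg[param] == v),
                       default=float("inf"))
                for v in values}
        ranked = sorted(values, key=lambda v: best[v])
        keep = max(1, int(len(values) * keep_fraction))
        pruned[param] = ranked[:keep]
    return pruned

def enumerate_space(tuning_parameters):
    names = list(tuning_parameters)
    return [dict(zip(names, vals)) for vals in product(*tuning_parameters.values())]
```

The exhaustive search then runs over enumerate_space(pruned) instead of the full product, which is where the quoted speed-up comes from; as the statement notes, this brings no advantage once the searcher is based on mathematical optimization rather than enumeration.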