2021
DOI: 10.1145/3427093

Efficient Auto-Tuning of Parallel Programs with Interdependent Tuning Parameters via Auto-Tuning Framework (ATF)

Abstract: Auto-tuning is a popular approach to program optimization: it automatically finds good configurations of a program's so-called tuning parameters, whose values are crucial for achieving high performance on a particular parallel architecture and for particular characteristics of input/output data. We present three new contributions of the Auto-Tuning Framework (ATF), which enable a key advantage in general-purpose auto-tuning: efficiently optimizing programs whose tuning parameters have interdependencies among them.
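
To make the paper's central idea concrete, here is a minimal Python sketch (a hypothetical search space, not ATF's actual API) of interdependent tuning parameters: the set of valid values for one parameter depends on the value chosen for another, so enumerating each parameter's range independently would yield mostly invalid configurations.

```python
# Hypothetical example: two tile-size parameters with an interdependence --
# the inner tile must evenly divide the outer tile, which in turn must
# divide the (assumed) input size N. None of these names come from ATF.
N = 1024  # assumed input size, for illustration only

valid_configs = [
    (outer, inner)
    for outer in range(1, N + 1) if N % outer == 0          # outer must divide N
    for inner in range(1, outer + 1) if outer % inner == 0  # inner depends on outer
]

print(f"{len(valid_configs)} valid configurations (vs. {N * N} unconstrained pairs)")
```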

Cited by 18 publications (7 citation statements) · References 53 publications
Citation types: 0 supporting, 7 mentioning, 0 contrasting
Years of citing publications: 2021, 2024

Citation statements:

“…MetaTune [35] improves auto-tuning parameter selection for systems like AutoTVM. In ATF [31], the authors look at more efficient auto-tuning techniques, with a focus on modeling when parameters have inter-dependencies among them.…”
Section: Related Work (citation type: mentioning)
confidence: 99%

“…The last row is the results using autotuning. The best configuration in 200 evaluations has the tile size (50, 128, 256) for the large dataset and the tile size (64, 50, 256) for the extra-large dataset using GBRT. We observe that autotuning outperforms the other compiling methods to provide the smallest execution time for both datasets.…”
Section: Case Study: Syr2k With Multiple Loop Transformations (Loop T… (citation type: mentioning)
confidence: 99%
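
A GBRT-guided search like the one described in this statement can be sketched with off-the-shelf tooling. Below is a minimal example using scikit-optimize's gbrt_minimize; the tile-size candidates are illustrative, and a synthetic cost function stands in for actually compiling and timing syr2k, so only the overall structure mirrors the cited study.

```python
from skopt import gbrt_minimize
from skopt.space import Categorical

# Candidate tile sizes for the three loop dimensions (illustrative values only).
space = [Categorical([16, 32, 50, 64, 128, 256], name=f"tile_{d}") for d in "ijk"]

def run_kernel(tiles):
    """Synthetic stand-in for the real objective, which would compile and
    time syr2k with the given tile sizes and return the execution time."""
    ti, tj, tk = tiles
    return abs(ti - 50) + abs(tj - 128) + abs(tk - 256)

# Gradient-boosted-regression-tree search with a 200-evaluation budget,
# matching the evaluation budget mentioned in the citation.
result = gbrt_minimize(run_kernel, space, n_calls=200, random_state=0)
print("best tile sizes:", result.x, "cost:", result.fun)
```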
“…There exist two kinds of expressions of search space: vector space and tree space. Most autotuning frameworks present the search space as a vector space, that is, a fixed number of parameter knobs, such as OpenTuner [42], CLTune [46], HalideTuner [44], Orio [41], KernelTuner [48], ATF [45,50], ytopt [13,24], and so forth. The successor of HalideTuner [51] uses tree search to avoid the limitation of a vector search space, but uses beam search to explore the space.…”
Section: Related Work (citation type: mentioning)
confidence: 99%
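
The vector-space versus tree-space distinction in this statement can be illustrated schematically; the parameter names below are invented for illustration and do not come from any of the cited frameworks.

```python
# Vector space: a fixed number of parameter knobs, each with its own domain.
# Every configuration is a fixed-length vector, e.g. (64, 4, True).
vector_space = {
    "tile_size": [16, 32, 64, 128],
    "unroll":    [1, 2, 4, 8],
    "vectorize": [True, False],
}

# Tree space: which parameters exist depends on earlier choices, so a
# configuration is a path through a tree rather than a fixed-length vector.
tree_space = {
    "schedule": {
        "tiled":   {"tile_size": [16, 32, 64, 128]},  # only exists when tiled
        "untiled": {"unroll": [1, 2, 4, 8]},          # only exists when untiled
    }
}
```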
“…We classify the existing autotuning frameworks into four categories: 1) enumerate all possible parameter configurations, reject invalid ones, and evaluate the valid ones [32]; 2) enumerate only valid configurations [45,50]; 3) sample from the set of possible configurations and reject invalid ones [41,43,10] during the search; 4) sample only valid configurations and search over them [24]. Our ytopt autotuning framework belongs to Category 4, which overcomes the ineffectiveness of Category 3 by generating valid samples, and addresses the limitations of Categories 1 and 2, where enumerating all possible configurations can be computationally expensive for a large number of parameters.…”
Section: Related Work (citation type: mentioning)
confidence: 99%
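
As a concrete (and hypothetical) illustration of the difference between Category 3 and Category 4, the sketch below reuses the divisibility constraint from the earlier example; it is not code from ytopt or any of the cited frameworks.

```python
import random

N = 1024  # assumed input size, as in the earlier sketch

def is_valid(outer, inner):
    # Hypothetical interdependence: outer divides N, inner divides outer.
    return N % outer == 0 and outer % inner == 0

# Category 3: sample from all possible configurations and reject invalid
# ones; most draws are wasted when valid configurations are rare.
def sample_and_reject():
    draws = 0
    while True:
        draws += 1
        outer, inner = random.randint(1, N), random.randint(1, N)
        if is_valid(outer, inner):
            return (outer, inner), draws

# Category 4: construct only valid configurations, so no draw is rejected.
def sample_valid():
    outer = random.choice([d for d in range(1, N + 1) if N % d == 0])
    inner = random.choice([d for d in range(1, outer + 1) if outer % d == 0])
    return outer, inner

config, draws = sample_and_reject()
print(f"Category 3: found {config} after {draws} draws")
print(f"Category 4: found {sample_valid()} in a single draw")
```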