Proceedings of the 36th ACM SIGPLAN Conference on Programming Language Design and Implementation 2015
DOI: 10.1145/2737924.2737999
Celebrating diversity: a mixture of experts approach for runtime mapping in dynamic environments

Abstract: Matching program parallelism to platform parallelism using thread selection is difficult when the environment and available resources change dynamically. Existing compiler or runtime approaches are typically based on a one-size-fits-all policy, with little ability to either evaluate or adapt the policy when encountering new external workloads or hardware resources. This paper focuses on selecting the best number of threads for a parallel application in dynamic environments. It develops a new scheme based o…

Cited by 25 publications (11 citation statements)
References 29 publications (35 reference statements)
“…This is because a one-size-fits-all model is unlikely to precisely capture the behaviours of diverse applications, and no matter how heavily the model is parameterized, it is highly unlikely that a model developed today will always be suited for tomorrow. To allow the model to adapt to changes in the computing environment and workloads, ensemble learning was exploited in prior works [73], [119], [120]. The idea of ensemble learning is to use multiple learning algorithms, where each algorithm is effective for particular problems, to obtain better predictive performance than could be obtained from any of the constituent learning algorithms alone [121], [122].…”
Section: Discussion
confidence: 99%
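The ensemble idea described above — several specialized predictors combined by learned trust weights — can be sketched as follows. This is a minimal illustration, not the method of the cited paper; the expert functions, feature names, and the multiplicative weight update are all illustrative assumptions.

```python
# Minimal mixture-of-experts sketch for thread-count selection.
# Each "expert" maps a workload feature dict to a predicted best thread
# count; the mixture combines them by weights that track recent accuracy.
# All names here are hypothetical, not taken from the cited works.

class MixtureOfExperts:
    def __init__(self, experts):
        self.experts = experts               # callables: features -> thread count
        self.weights = [1.0] * len(experts)  # start with uniform trust

    def predict(self, features):
        # Weighted average of the experts' proposals, rounded to a
        # whole thread count.
        total = sum(self.weights)
        score = sum(w * e(features) for w, e in zip(self.weights, self.experts))
        return round(score / total)

    def update(self, features, observed_best):
        # Multiplicative update: experts far from the observed best
        # configuration lose weight; accurate experts keep theirs.
        for i, expert in enumerate(self.experts):
            err = abs(expert(features) - observed_best)
            self.weights[i] *= 0.5 ** err

# Two toy experts: one assumes the program scales to all cores,
# one assumes external load steals cores from the application.
optimist = lambda f: f["cores"]
pessimist = lambda f: max(1, f["cores"] - f["external_load"])

moe = MixtureOfExperts([optimist, pessimist])
features = {"cores": 8, "external_load": 4}
print(moe.predict(features))                 # uniform weights: (8 + 4) / 2 = 6
moe.update(features, observed_best=4)
print(moe.predict(features))                 # pessimist now dominates: 4
```

After the update, the pessimist (which predicted the observed optimum exactly) dominates the vote — this per-environment reweighting is what lets an ensemble track a changing system where any single fixed model would drift out of date.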
“…Various forms of program features have been used in compiler-based machine learning. These include static code structures [123] and runtime information such as system load [119], [124] and performance counters [53].…”
Section: A Feature Representation
confidence: 99%
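Concretely, such a feature representation concatenates compile-time code properties with runtime signals sampled at the mapping decision point. The sketch below is a hypothetical example; the specific feature names are illustrative, not taken from [119], [124], or [53].

```python
# Sketch of a combined feature vector for a compiler-based ML model:
# static code structure (known at compile time) plus runtime signals
# (system load, hardware performance counters). Feature names are
# illustrative assumptions.

def build_features(static, runtime):
    return [
        static["loop_count"],       # static code structure
        static["mem_ops_ratio"],    # fraction of memory operations
        runtime["system_load"],     # e.g. 1-minute load average
        runtime["llc_miss_rate"],   # from hardware performance counters
    ]

vec = build_features(
    {"loop_count": 12, "mem_ops_ratio": 0.35},
    {"system_load": 2.5, "llc_miss_rate": 0.08},
)
print(vec)  # [12, 0.35, 2.5, 0.08]
```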
“…Existing approaches are limited in accuracy and usability. Offline profiling approaches [12,18] can miss contention effects that arise during co-execution. Moreover, they require extensive and lengthy runs with different configurations to build scalability models.…”
confidence: 99%