ACM/IEEE SC 2005 Conference (SC'05)
DOI: 10.1109/sc.2005.52
Parallel Parameter Tuning for Applications with Performance Variability

Cited by 30 publications (26 citation statements)
References 13 publications
“…Empirical Search Using DL/ML Model. This section demonstrates the integration of the analytical bounds with existing search optimization algorithms, the Nelder-Mead Simplex method [22] and the Parallel Rank Ordering (PRO) method [30]. In order to handle boundary constraints due to the DL/ML model, we used the extended version of the PRO algorithm introduced in the Active Harmony framework [32].…”
Section: Search Space Reduction by DL/ML Model (mentioning)
confidence: 99%
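The statement above describes extending a simplex-based empirical search to respect analytical parameter bounds derived from a model. A minimal sketch of the boundary-handling idea, assuming illustrative bounds and a hypothetical `clamp_to_bounds` helper (this is not the Active Harmony implementation):

```python
# Sketch: project candidate configurations back into the feasible region
# defined by per-parameter (low, high) bounds from an analytical model.
# The bounds below (e.g. tile sizes capped by cache capacity) are assumptions.

def clamp_to_bounds(point, bounds):
    """Clamp each coordinate of a candidate point to its (low, high) bound."""
    return tuple(min(max(x, lo), hi) for x, (lo, hi) in zip(point, bounds))

bounds = [(1, 64), (1, 64)]        # hypothetical per-parameter bounds

candidate = (128, -3)              # a reflection step that left the region
feasible = clamp_to_bounds(candidate, bounds)
print(feasible)                    # (64, 1)
```

Clamping each out-of-bounds candidate before it is measured lets the search reuse an unmodified simplex update rule while never evaluating an infeasible configuration.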
“…The algorithm that we use is based on the Parallel Rank Order (PRO) algorithm proposed by Tabatabaee et al [22]. For a function of N variables, PRO maintains a set of K (where K is at least N + 1 and is usually set to the number of cores the harmonized application is run on) points forming the vertices of a simplex in an N -dimensional space.…”
Section: Parameter Tuning Algorithm (mentioning)
confidence: 99%
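The statement above outlines the PRO structure: a simplex of K >= N + 1 vertices in an N-dimensional parameter space, updated by reflecting the non-best vertices in parallel. A minimal sketch of one PRO-style step, assuming a synchronous setting and a toy objective standing in for measured performance (the shrink fallback is a common simplex device added here for completeness, not a claim about the exact PRO variant):

```python
# Sketch of a PRO-style simplex step: reflect every non-best vertex through
# the best vertex; if no reflection improves, shrink the simplex toward the
# best point. All K candidate evaluations could run in parallel in practice.

def pro_step(simplex, objective):
    best = min(simplex, key=objective)
    new_simplex, improved = [best], False
    for v in simplex:
        if v == best:
            continue
        reflected = tuple(2 * b - x for b, x in zip(best, v))
        if objective(reflected) < objective(v):
            new_simplex.append(reflected)
            improved = True
        else:
            new_simplex.append(v)
    if not improved:                       # shrink halfway toward the best
        new_simplex = [best] + [tuple((x + b) / 2 for x, b in zip(v, best))
                                for v in new_simplex[1:]]
    return new_simplex

def objective(p):                          # toy stand-in for execution time
    return (p[0] - 3) ** 2 + (p[1] - 5) ** 2

# N = 2 parameters, K = 4 >= N + 1 vertices.
simplex = [(0, 0), (8, 0), (0, 8), (8, 8)]
for _ in range(30):
    simplex = pro_step(simplex, objective)
best = min(simplex, key=objective)
```

Because the best vertex is always retained, the best measured value is non-increasing across iterations.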
“…the performance of two consecutive timesteps is recorded) and the minimum of the two samples is sent to the Active Harmony server. We showed in our previous work [22] that even in the presence of 5% variability due to background noise, taking the minimum of two samples is enough to ensure the convergence of the search algorithm.…”
mentioning
confidence: 99%
“…Average is the most widely used operator to aggregate and estimate "real" performance from multiple samples. As an alternative, we showed, in our earlier work [83], taking a minimum of multiple performance measurements is an effective way for performance estimation even in the presence of heavy-tail component in the performance distribution. The minimum has finite mean and variance and is not heavy-tailed.…”
Section: Case Study: GS2 (mentioning)
confidence: 99%
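The two statements above describe the same sampling strategy: estimate a configuration's "real" performance as the minimum of repeated measurements, so rare large interference spikes are discarded rather than averaged in. A minimal sketch under an assumed noise model (the spike probability and magnitudes are illustrative, not data from the paper):

```python
import random

def measure(true_time, rng):
    """One noisy timing sample: the true cost plus occasional large
    interference (a stand-in for background OS noise)."""
    if rng.random() < 0.2:                        # rare heavy spike
        return true_time + rng.expovariate(1 / 0.05)
    return true_time + rng.uniform(0.0, 0.01)     # small measurement jitter

rng = random.Random(0)
times = [measure(1.0, rng) for _ in range(100)]

avg_estimate = sum(times) / len(times)            # pulled up by spikes
min_estimate = min(times)                         # discards the spikes
```

Since timing noise is (approximately) one-sided, the minimum has finite mean and variance even when individual samples are heavy-tailed, which is why it remains a stable estimator here while the average does not.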