2019
DOI: 10.48550/arxiv.1902.01894
Preprint

A Generalized Framework for Population Based Training

Cited by 1 publication (2 citation statements)
References 6 publications
“…To mitigate the sensitivity of experimental results to empirical, and perhaps arbitrary, choices of hyperparameters, we present additional results that leverage Population Based Training (PBT) [26,37], which is a simple asynchronous optimisation algorithm that jointly optimizes a population of models and their hyperparameters. In particular, PBT discovers a per-epoch schedule of hyperparameter settings rather than a static fixed configuration used over the entirety of training.…”
Section: Methods (mentioning)
confidence: 99%
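The statement above describes the core PBT loop: a population of models trains in parallel, and members periodically copy the weights and hyperparameters of better performers (exploit) and then perturb those hyperparameters (explore), which yields a hyperparameter schedule over training rather than one fixed setting. A minimal synchronous toy sketch of that loop is below; the objective, member representation, and perturbation factors are illustrative assumptions, not the paper's actual (asynchronous, generalized) framework.

```python
import random

random.seed(0)

def train_step(score, lr):
    # Hypothetical stand-in for one epoch of training: progress is
    # fastest when the learning rate is near 0.1 (an assumed optimum).
    return score + lr * (1.0 - abs(lr - 0.1))

def pbt(pop_size=4, steps=20):
    # Each member is [score, lr]: PBT jointly evolves model state
    # (here just a scalar "score") and its hyperparameter.
    population = [[0.0, random.uniform(0.001, 1.0)] for _ in range(pop_size)]
    for _ in range(steps):
        # Train: every member takes one step with its current hyperparameters.
        for member in population:
            member[0] = train_step(member[0], member[1])
        # Exploit: the worst member copies the best member's state and lr.
        population.sort(key=lambda m: m[0], reverse=True)
        population[-1] = list(population[0])
        # Explore: perturb the copied hyperparameter up or down.
        population[-1][1] *= random.choice([0.8, 1.2])
    return max(population, key=lambda m: m[0])

best_score, best_lr = pbt()
```

Because the explore step mutates hyperparameters at every interval, the learning rate each lineage experiences changes over time, which is exactly the "per-epoch schedule" the quoted passage contrasts with a static configuration.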
“…We find, as in other studies, that this joint optimization of hyperparameter schedules typically results in faster wall-clock convergence and higher final performance. [25,37] Baselines. To best interpret the effectiveness of RTE, we compare our results to many techniques for learning with noise (Table 1).…”
Section: Methods (mentioning)
confidence: 99%