2023
DOI: 10.1007/978-3-031-26504-4_26

Hyper-parameter Optimization Using Continuation Algorithms

Cited by 1 publication (1 citation statement). References 6 publications.
“…Promising recent approaches include score-function methods with control variates [27], variational optimization (VO) [28], and scale-space-based continuation approaches [29]. Such approaches have been applied in the context of reinforcement learning [27], graph matching [30], hyperparameter optimization [31], and differentiable deep neural network training [28], but there has been limited application of such techniques specifically to GNN architectures. In this work, we introduce an efficient VO approach, based on optimization of a smoothed objective function using a Gaussian scale-space [32].…”

Section: Introduction (mentioning)
Confidence: 99%
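
The quoted excerpt compresses the core continuation idea: replace the raw objective with a Gaussian-smoothed version, optimize that, and anneal the smoothing scale toward zero so the iterate tracks the coarse structure of the landscape before its fine detail returns. The sketch below illustrates that idea in Python; it assumes a hypothetical one-dimensional toy objective `f` and uses the standard score-function estimator for the gradient of the Gaussian-smoothed surface. None of the names, schedules, or settings here come from the cited paper.

```python
# Minimal sketch of scale-space continuation on a toy multimodal objective.
# Everything here (f, the sigma schedule, step sizes) is illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Toy multimodal "loss" surface: quadratic trend plus oscillation,
    # with its global minimum near x = 2.2.
    return 0.1 * (x - 2.0) ** 2 + np.sin(5.0 * x)

def smoothed_grad(x, sigma, n_samples=256):
    # Monte Carlo estimate of d/dx F_sigma(x), where
    #   F_sigma(x) = E_{e ~ N(0,1)}[ f(x + sigma * e) ]
    # via the score-function identity
    #   dF_sigma/dx = E[ f(x + sigma * e) * e ] / sigma.
    # Subtracting f(x) as a baseline leaves the estimate unbiased
    # (since E[e] = 0) but reduces its variance.
    eps = rng.standard_normal(n_samples)
    return np.mean((f(x + sigma * eps) - f(x)) * eps) / sigma

def continuation_descent(x0, sigmas=(2.0, 1.0, 0.5, 0.1), lr=0.05, steps=300):
    # Continuation: descend the heavily smoothed surface first, then reuse
    # each stage's solution as the start of the next while sigma shrinks.
    x = x0
    for sigma in sigmas:
        for _ in range(steps):
            x -= lr * smoothed_grad(x, sigma)
    return x

x_star = continuation_descent(x0=-4.0)
print(f"continuation solution: x = {x_star:.3f}, f(x) = {f(x_star):.3f}")
```

Annealing `sigmas` from large to small is the continuation schedule: at the first stage the oscillatory term is smoothed almost entirely away, leaving a nearly convex surface, while later stages progressively restore the detail of the true objective so the final iterate settles into a good basin.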