2018
DOI: 10.1371/journal.pone.0192944

Optimization by Adaptive Stochastic Descent

Abstract: When standard optimization methods fail to find a satisfactory solution for a parameter fitting problem, a tempting recourse is to adjust parameters manually. While tedious, this approach can be surprisingly powerful in terms of achieving optimal or near-optimal solutions. This paper outlines an optimization algorithm, Adaptive Stochastic Descent (ASD), that has been designed to replicate the essential aspects of manual parameter fitting in an automated way. Specifically, ASD uses simple principles to form pro…
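Based only on the description above, a minimal sketch of the adaptive idea might look as follows in Python: one parameter is perturbed at a time, and a successful move makes that (parameter, direction) choice larger and more probable, while a failed move makes it smaller and less probable. This is not the authors' published implementation (available in their Sciris library); the adaptation constants and the toy objective are illustrative assumptions.

```python
import numpy as np

def asd_sketch(objective, x0, maxiters=500, stepsize=0.1,
               sinc=2.0, sdec=2.0, pinc=2.0, pdec=2.0, seed=None):
    """Minimal sketch of the Adaptive Stochastic Descent idea."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    n = len(x)
    steps = np.full(2 * n, stepsize)   # per-(parameter, direction) step sizes
    probs = np.ones(2 * n)             # unnormalized selection probabilities
    fval = objective(x)
    for _ in range(maxiters):
        k = rng.choice(2 * n, p=probs / probs.sum())
        i, sign = k % n, (1.0 if k < n else -1.0)
        trial = x.copy()
        trial[i] += sign * steps[k]
        ftrial = objective(trial)
        if ftrial < fval:              # improvement: accept, grow step and probability
            x, fval = trial, ftrial
            steps[k] *= sinc
            probs[k] *= pinc
        else:                          # no improvement: shrink step and probability
            steps[k] /= sdec
            probs[k] /= pdec
    return x, fval

# Toy usage: fit two parameters of a quadratic bowl
xbest, fbest = asd_sketch(lambda x: (x[0] - 3) ** 2 + (x[1] + 1) ** 2,
                          x0=[0.0, 0.0], seed=1)
```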

Cited by 24 publications (26 citation statements)
References 47 publications
“…This second step is repeated (on subsequent iterations, the choice of program is based on probability distributions learned from previous iterations) until the solution converges. Further details are provided in [32].…”
Section: Methods (mentioning)
confidence: 99%
“…This process is repeated multiple times using a Monte Carlo initialization to increase the likelihood of locating the global minimum. To perform the optimization, we employ a Bayesian adaptive locally linear stochastic descent algorithm [17].…”
Section: Methods (mentioning)
confidence: 99%
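The Monte Carlo initialization described in this statement is a standard multi-start pattern. The sketch below illustrates it with SciPy's Nelder-Mead as a stand-in local optimizer, not the Bayesian adaptive locally linear stochastic descent algorithm the citing authors actually employ; the objective and bounds are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def multistart_minimize(objective, bounds, n_starts=20, seed=None):
    """Run a local optimizer from many random starting points and keep
    the best result, raising the chance of finding the global minimum."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    best = None
    for _ in range(n_starts):
        x0 = rng.uniform(lo, hi)          # Monte Carlo initialization
        res = minimize(objective, x0, method="Nelder-Mead")
        if best is None or res.fun < best.fun:
            best = res
    return best

# Toy usage: a 1-D multimodal function with several local minima
best = multistart_minimize(lambda x: np.sin(5 * x[0]) + 0.1 * x[0] ** 2,
                           bounds=[(-5.0, 5.0)], seed=1)
```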
“…However, since a single model run returns a scalar log-likelihood value, these runs can be easily integrated into standardized calibration frameworks. An example implementation using Weights & Biases (wandb.com) is included in the codebase, but any standard optimization library, such as the optimization module of SciPy, can be easily adapted, as can more advanced methods such as the adaptive stochastic descent method of the Sciris library (Kerr et al., 2018), or Bayesian approaches such as history matching (Andrianakis et al., 2015).…”
Section: Calibration (mentioning)
confidence: 99%
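To make the integration point concrete, here is a minimal sketch of plugging a scalar log-likelihood into SciPy's optimization module, as the passage suggests. The Gaussian toy model, data, and starting point are assumptions for illustration, not the citing authors' codebase.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical stand-in for a single model run that returns a scalar
# log-likelihood; a real calibration would run the simulator here.
def run_model(params, data):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)             # keeps sigma positive
    return np.sum(-0.5 * ((data - mu) / sigma) ** 2
                  - np.log(sigma * np.sqrt(2.0 * np.pi)))

data = np.array([1.1, 0.9, 1.3, 0.7])     # hypothetical observations

# Because each run reduces to one scalar, any off-the-shelf optimizer
# applies: minimize the negative log-likelihood over the parameters.
result = minimize(lambda p: -run_model(p, data),
                  x0=[0.0, 0.0], method="Nelder-Mead")
mu_hat, sigma_hat = result.x[0], np.exp(result.x[1])
```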