2005
DOI: 10.1016/j.camwa.2004.12.014

One-dimensional global optimization for observations with noise

Cited by 34 publications (21 citation statements)
References 7 publications
“…On the other pole are problems where information about properties of the objective functions is very scarce and their derivatives are not available. Especially difficult are problems with "noisy" objective functions [11], whose values are computed, e.g., by Monte Carlo methods. In such a so-called black-box optimization case, methods based on statistical models of objective functions seem most appropriate [12].…”
Section: Statement of the Relevant Optimization Problem
confidence: 99%
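The statement above points to statistical-model-based methods for black-box objectives whose values are Monte Carlo estimates. As a minimal sketch of that idea (not taken from the cited paper; the test objective, RBF kernel, noise level, and grid are illustrative assumptions), the following snippet fits a simple Gaussian-process surrogate to noisy Monte Carlo evaluations of a 1-D objective and reads off the surrogate's minimizer:

```python
# Illustrative sketch only: a 1-D "noisy" black-box objective whose values are
# Monte Carlo estimates, and a Gaussian-process surrogate used as a statistical
# model of the objective. Kernel, noise level, and grid are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def noisy_objective(x, n_samples=200):
    """Monte Carlo estimate of E[(x - Z)^2] with Z ~ N(0.3, 0.1^2).
    Each call returns the true value corrupted by sampling noise."""
    z = rng.normal(0.3, 0.1, size=n_samples)
    return np.mean((x - z) ** 2)

def rbf_kernel(a, b, length=0.2):
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

# Evaluate the noisy objective at a few design points.
x_train = np.linspace(0.0, 1.0, 8)
y_train = np.array([noisy_objective(x) for x in x_train])

# GP posterior mean on a fine grid; the noise variance regularizes the fit.
noise_var = 1e-3
K = rbf_kernel(x_train, x_train) + noise_var * np.eye(len(x_train))
x_grid = np.linspace(0.0, 1.0, 201)
k_star = rbf_kernel(x_grid, x_train)
posterior_mean = k_star @ np.linalg.solve(K, y_train)

x_best = x_grid[np.argmin(posterior_mean)]
print(f"surrogate minimizer ~ {x_best:.3f} (true minimizer is 0.3)")
```

Because each evaluation is itself noisy, the surrogate's smoothing (through the noise term on the diagonal) is what makes a sensible estimate of the minimizer possible at all; interpolating the raw Monte Carlo values would chase the sampling noise.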
“…The one-step optimality criterion prevails in the development of optimal global optimization algorithms; see, e.g., [2,8,11,18,19,23]. The concept of worst-case optimality, which is a standard in the theory of algorithms [1], is implemented, e.g.…”
Section: Introduction
confidence: 99%
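To make the one-step (myopic) optimality idea concrete, the sketch below selects each new evaluation point by maximizing a criterion computed from a statistical model's posterior at the current step only, ignoring the effect on later steps. The probability-of-improvement criterion, RBF kernel, and test function are illustrative assumptions, not the specific algorithms of the cited works:

```python
# Illustrative sketch of a one-step (myopic) selection rule in statistical-model-based
# global optimization: each new point maximizes a criterion derived from the model's
# posterior at the current step. Criterion, kernel, and test function are assumptions.
import numpy as np
from math import erf, sqrt

def objective(x):
    return np.sin(3.0 * x) + 0.5 * x  # placeholder test function

def rbf(a, b, length=0.3):
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

def posterior(x_obs, y_obs, x_grid, noise=1e-4):
    """GP posterior mean and variance on a grid, given observations."""
    K = rbf(x_obs, x_obs) + noise * np.eye(len(x_obs))
    k = rbf(x_grid, x_obs)
    K_inv = np.linalg.inv(K)
    mean = k @ K_inv @ y_obs
    var = 1.0 - np.sum((k @ K_inv) * k, axis=1)
    return mean, np.maximum(var, 1e-12)

x_obs = np.array([0.0, 1.0, 2.0])
y_obs = objective(x_obs)
x_grid = np.linspace(0.0, 2.0, 401)

for step in range(10):
    mean, var = posterior(x_obs, y_obs, x_grid)
    target = y_obs.min() - 0.05          # aim slightly below the best value so far
    # One-step criterion: probability that the model value falls below the target.
    z = (target - mean) / np.sqrt(var)
    prob_improve = np.array([0.5 * (1.0 + erf(v / sqrt(2.0))) for v in z])
    x_next = x_grid[np.argmax(prob_improve)]
    x_obs = np.append(x_obs, x_next)
    y_obs = np.append(y_obs, objective(x_next))

print(f"best value found: {y_obs.min():.4f} at x = {x_obs[np.argmin(y_obs)]:.3f}")
```

The rule is "one-step" because each iteration optimizes only the immediate criterion; worst-case optimal methods, by contrast, are designed against the least favorable objective over the whole budget of evaluations.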
“…Although the objective functions are often Lipschitz-continuous (see, e.g., [7,10,16,17,19]), they have very high Lipschitz constants, which increase with N, the number of observations. Adding noise to the observed data increases the complexity of the objective function (see, e.g., [3,20]) and moves the global minimizer away from the vector of true parameters. Thus, efficient global optimization techniques should be used to tackle the stated problem.…”
Section: Statement of the Problem
confidence: 99%
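For the Lipschitz-continuous case mentioned above, the classical Piyavskii-Shubert scheme is a standard 1-D example of a Lipschitz-bound-based global method. The sketch below (an illustration, not the cited paper's algorithm; the test function and the constant L are assumptions) also shows why a very large Lipschitz constant is costly: the sawtooth lower bound it builds becomes loose, so many more evaluations are needed before intervals can be discarded:

```python
# Illustrative sketch: the Piyavskii-Shubert scheme for 1-D Lipschitz global
# minimization. A very large L makes the interval lower bounds loose, so the
# method needs many evaluations before it can prune around the global minimizer.
import heapq
import math

def piyavskii_shubert(f, a, b, L, n_evals=50):
    """Minimize f on [a, b] assuming |f(x) - f(y)| <= L * |x - y|."""
    fa, fb = f(a), f(b)
    best_x, best_f = (a, fa) if fa <= fb else (b, fb)

    def lower_bound(x1, f1, x2, f2):
        # Minimum of the two Lipschitz cones over [x1, x2].
        return 0.5 * (f1 + f2 - L * (x2 - x1))

    # Heap entries: (interval lower bound, left point, f(left), right point, f(right)).
    heap = [(lower_bound(a, fa, b, fb), a, fa, b, fb)]
    for _ in range(n_evals):
        lb, x1, f1, x2, f2 = heapq.heappop(heap)
        if lb >= best_f:               # no remaining interval can beat the incumbent
            break
        # Point where the two Lipschitz cones intersect (the interval's bound minimizer).
        x_new = 0.5 * (x1 + x2) + (f1 - f2) / (2.0 * L)
        f_new = f(x_new)
        if f_new < best_f:
            best_x, best_f = x_new, f_new
        heapq.heappush(heap, (lower_bound(x1, f1, x_new, f_new), x1, f1, x_new, f_new))
        heapq.heappush(heap, (lower_bound(x_new, f_new, x2, f2), x_new, f_new, x2, f2))
    return best_x, best_f

# Example: a multimodal test function; L = 12 is a (deliberately loose) bound on [0, 4].
f = lambda x: math.sin(3.0 * x) + 0.3 * (x - 2.0) ** 2
print(piyavskii_shubert(f, 0.0, 4.0, L=12.0, n_evals=60))
```

Noise compounds the difficulty noted in the excerpt: a noisy evaluation can invalidate the deterministic Lipschitz bound, which is one reason the statistical-model-based approaches discussed earlier are preferred in that setting.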