2012
DOI: 10.1016/j.amc.2011.07.051

On strong homogeneity of two global optimization algorithms based on statistical models of multimodal objective functions

Abstract: The implementation of global optimization algorithms using the arithmetic of infinity is considered. A relatively simple implementation is proposed for algorithms that possess the introduced property of strong homogeneity. It is shown that the P-algorithm and the one-step Bayesian algorithm are strongly homogeneous.
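In this context, strong homogeneity means that a method generates the same sequence of trial points for f(x) and for any scaled function αf(x) + β with α > 0. Below is a minimal Python sketch of an empirical check of this property; the `algorithm` interface and the `random_search` example are hypothetical illustrations introduced here, not methods from the paper.

import numpy as np

def trial_points(algorithm, objective, n_trials, bounds, seed=0):
    # `algorithm` is assumed to be a callable
    # algorithm(objective, n_trials, bounds, seed) -> list of trial points.
    # This interface is hypothetical and serves only this illustration.
    return algorithm(objective, n_trials, bounds, seed)

def is_strongly_homogeneous(algorithm, f, alpha, beta,
                            n_trials=30, bounds=(0.0, 1.0), tol=1e-9):
    # Empirical check of strong homogeneity: the trial points produced for
    # f(x) and for g(x) = alpha * f(x) + beta (alpha > 0) must coincide.
    g = lambda x: alpha * f(x) + beta
    x_f = np.asarray(trial_points(algorithm, f, n_trials, bounds))
    x_g = np.asarray(trial_points(algorithm, g, n_trials, bounds))
    return bool(np.allclose(x_f, x_g, atol=tol))

# Trivial example: uniform random search ignores observed function values,
# so it is strongly homogeneous by construction.
def random_search(objective, n_trials, bounds, seed):
    rng = np.random.default_rng(seed)
    return [float(rng.uniform(*bounds)) for _ in range(n_trials)]

if __name__ == "__main__":
    f = lambda x: np.sin(10.0 * x) + 0.1 * x   # simple multimodal test function
    print(is_strongly_homogeneous(random_search, f, alpha=5.0, beta=-3.0))  # True

The random-search example passes trivially because it never uses observed function values; the paper's contribution is showing that the value-dependent P-algorithm and one-step Bayesian algorithm also possess this property.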

Cited by 42 publications (33 citation statements)
References 15 publications
“…(see [8,9,14,22,30,32,34,35,36,37,39,40,41,44,48,49]). It is important to emphasize that the new numeral system avoids situations of the type (5)-(7) providing results ensuring that if a is a numeral written in this system then for any a (i.e., a can be finite, infinite, or infinitesimal) it follows a + 1 > a.…”
Section: The Grossone Methodology
Mentioning confidence: 99%
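As a purely illustrative toy sketch of this ordering (not the Infinity Computer arithmetic from the cited works), a numeral of the form c_1·grossone^p_1 + … + c_m·grossone^p_m can be stored as a map from exponents to coefficients; comparing two numerals through the leading term of their difference then yields a + 1 > a for finite, infinite, and infinitesimal a alike.

from collections import defaultdict

# Toy "grossone-style" numeral: a dict {exponent: coefficient} standing for
# sum_i coefficient_i * grossone**exponent_i.  This is only a sketch of the
# idea, not an implementation of the methodology itself.

def add(a, b):
    result = defaultdict(float, a)
    for exp, coeff in b.items():
        result[exp] += coeff
    return {e: c for e, c in result.items() if c != 0}

def greater(a, b):
    # a > b iff the highest-exponent nonzero coefficient of a - b is positive.
    diff = add(a, {e: -c for e, c in b.items()})
    return bool(diff) and diff[max(diff)] > 0

ONE = {0: 1.0}                                   # the finite number 1
examples = {
    "finite a = 3": {0: 3.0},
    "infinite a = 2*grossone": {1: 2.0},
    "infinitesimal a = 1/grossone": {-1: 1.0},
}
for name, a in examples.items():
    print(name, "->", greater(add(a, ONE), a))   # True in every case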
“…The new methodology has been successfully applied for studying a number of applications: percolation (see [14,44]), Euclidean and hyperbolic geometry (see [22,30]), fractals (see [32,34,41,44]), numerical differentiation and optimization (see [8,35,39,49]), infinite series (see [36,40,48]), the first Hilbert problem (see [37]), and cellular automata (see [9]).…”
Section: Introduction
Mentioning confidence: 99%
“…As shown in [41], DIRECT is not a strongly homogeneous algorithm. In [22] the properties of DIRECT related to the scaling of the objective function are investigated once again.…”
Section: Modifications and Applications of the DIRECT Algorithm
Mentioning confidence: 99%
“…One of the desirable properties of global optimization methods (see [7,35,41]) is their strong homogeneity meaning that a method produces the same sequences of trial points (i.e., points where the objective function f(x) is evaluated) independently of both shifting f(x) vertically and its multiplication by a scaling constant. In other words, it can be useful to optimize a scaled function g(x) = g(x; α, β) = αf(x) + β, α > 0,…”
Section: Introduction
Mentioning confidence: 99%
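One schematic way to see why a probability-of-improvement criterion built on a Gaussian statistical model can be insensitive to this scaling is the following sketch. It assumes that, when the observations are scaled to αy_i + β, the conditional mean and standard deviation of the model transform as μ_g(x) = αμ(x) + β and σ_g(x) = ασ(x); here Φ is the standard normal distribution function, y_min the current record value, and ε > 0 an improvement threshold. The precise conditions for the P-algorithm and the one-step Bayesian algorithm are those established in the paper itself.

\[
\Phi\!\left(\frac{y^{g}_{\min}-\varepsilon_{g}-\mu_{g}(x)}{\sigma_{g}(x)}\right)
=\Phi\!\left(\frac{(\alpha y_{\min}+\beta)-\varepsilon_{g}-(\alpha\mu(x)+\beta)}{\alpha\,\sigma(x)}\right)
=\Phi\!\left(\frac{y_{\min}-\varepsilon_{g}/\alpha-\mu(x)}{\sigma(x)}\right),
\]

so the maximizer of the criterion, and hence the next trial point, coincides with that for f(x) whenever the improvement threshold is scaled consistently, ε_g = αε.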
“…Due to the presence of multiple local minima and non-differentiability of the objective function, classical local optimization techniques cannot be used for solving these problems and global optimization methods should be developed (see, e.g., [8,14,18,19,30,35,36,37,39,41]).…”
Section: Introduction
Mentioning confidence: 99%