Solving a set of global optimization problems by the parallel technique with uniform convergence
2017, DOI: 10.1007/s10898-017-0555-4

Cited by 21 publications (9 citation statements). References: 21 publications.

“…There exist a lot of algorithms for solving global optimization problems (see, e.g., [1,2,14] for parallel optimization, [9,13] for dimensionality reduction schemes, [10,11,25] for numerical solution of real-life optimization problems, [21] for simplicial optimization methods, [15,35,36] for stochastic optimization methods, [16,17,22,32] for univariate Lipschitz global optimization, [23] for interval branch-and-bound methods, etc.). Among them, there can be distinguished two groups of algorithms: nature-inspired metaheuristic algorithms (as, for instance, genetic algorithm, firefly algorithm, particle swarm optimization, etc.…”
Section: Statement of the Optimization Problem (mentioning)
confidence: 99%
“…a currently available solution obtained by engineers using practical reasons, global optimization problems are considered (see [1,10,13,17,18,28,34,40,42,43,44]). Unfortunately, very often a practical global optimization process is performed under a limited budget, i.e., the number of allowed evaluations of the objective function f (x) is fixed a priori and is not very high (see a detailed discussion in [37]) requiring so an accurate development of fast global optimization methods (see, e.g., [2,3,15,23,27,34,40,44]).…”
Section: Introduction (mentioning)
confidence: 99%
“…Since Lipschitz continuity is a quite natural assumption for applied problems (specifically for technical systems, see, e.g., [14,17,21,28,34,40]), we consider here objective functions that satisfy the Lipschitz condition over the search domain D. Thus, our problem becomes Lipschitz global optimization problem broadly studied in the literature (see, e.g., [2,12,14,15,18,21,28,33,34,40] and references given therein). In this paper, we propose a new Lipschitz-based safe global optimization algorithm, specifically designed to work in the noisy setting.…”
Section: Introduction (mentioning)
confidence: 99%
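For reference, the Lipschitz condition invoked in this excerpt is the standard one: with f the objective, D the search domain, and L a finite Lipschitz constant (all as in the quoted paper),

|f(x') - f(x'')| \le L \, \| x' - x'' \| \quad \text{for all } x', x'' \in D.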
“…• In the article [2], the authors consider solving a set of global optimization problems in parallel. It is shown that the algorithm they propose provides a uniform convergence to the set of solutions for all problems treated simultaneously.…”
(mentioning)
confidence: 99%
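To make the quoted description more concrete, below is a minimal, purely illustrative Python sketch of the interleaving idea behind solving a set of problems with uniform convergence: several univariate Lipschitz global minimization problems are kept in a common pool, and at each step the problem that is currently furthest from its prescribed accuracy is advanced by one iteration, so all problems approach their tolerances together. The per-problem step used here is a basic Piyavskii-Shubert scheme with an assumed known Lipschitz constant; this is an assumption made for illustration only and is not the algorithm analyzed in [2].

# Illustrative sketch only: interleaved solution of several univariate Lipschitz
# global minimization problems, advancing at each step the problem that is
# currently furthest from its target accuracy.  It mimics the idea of uniform
# convergence over a set of problems; it is NOT the method from the cited paper.
import math

class LipschitzProblem:
    def __init__(self, f, a, b, L, eps):
        self.f, self.L, self.eps = f, L, eps
        self.points = [(a, f(a)), (b, f(b))]        # sorted trial points (x, f(x))
        self.best = min(self.points, key=lambda p: p[1])

    def lower_bound(self):
        # Smallest value of the piecewise-linear minorant built from L.
        return min(0.5 * (f1 + f2) - 0.5 * self.L * (x2 - x1)
                   for (x1, f1), (x2, f2) in zip(self.points, self.points[1:]))

    def gap(self):
        # Residual accuracy: record value minus the global minorant value.
        return self.best[1] - self.lower_bound()

    def step(self):
        # One Piyavskii-Shubert iteration: refine the interval with the lowest
        # minorant value by evaluating f at the minorant's minimizer.
        idx, best_r = 0, math.inf
        for i, ((x1, f1), (x2, f2)) in enumerate(zip(self.points, self.points[1:])):
            r = 0.5 * (f1 + f2) - 0.5 * self.L * (x2 - x1)
            if r < best_r:
                idx, best_r = i, r
        (x1, f1), (x2, f2) = self.points[idx], self.points[idx + 1]
        x_new = 0.5 * (x1 + x2) + (f1 - f2) / (2.0 * self.L)
        y_new = self.f(x_new)
        self.points.insert(idx + 1, (x_new, y_new))
        if y_new < self.best[1]:
            self.best = (x_new, y_new)

def solve_set(problems, max_iters=10_000):
    # Serve the problem whose relative residual gap/eps is currently the worst,
    # so that all problems approach their prescribed accuracies together.
    for _ in range(max_iters):
        worst = max(problems, key=lambda p: p.gap() / p.eps)
        if worst.gap() <= worst.eps:
            break                                   # every problem has converged
        worst.step()
    return [p.best for p in problems]

if __name__ == "__main__":
    probs = [
        LipschitzProblem(lambda x: math.sin(x) + math.sin(10 * x / 3), 2.7, 7.5, 4.5, 1e-3),
        LipschitzProblem(lambda x: (x - 2.0) ** 2, 0.0, 4.0, 8.0, 1e-3),
    ]
    for x, y in solve_set(probs):
        print(f"x* ~= {x:.4f}, f* ~= {y:.4f}")

The max over gap()/eps is what enforces the "uniform" behaviour in this sketch: no single problem is refined far beyond its tolerance while another lags behind.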