2017
DOI: 10.1007/978-3-319-62398-6_9
Combining Filter Method and Dynamically Dimensioned Search for Constrained Global Optimization

Abstract: In this work we present an algorithm that combines the filter technique and the dynamically dimensioned search (DDS) for solving nonlinear and nonconvex constrained global optimization problems. The DDS is a stochastic global algorithm for solving bound constrained problems that, in each iteration, generates a random trial point by perturbing some coordinates of the current best point. The filter technique controls the progress related to optimality and feasibility, defining a forbidden region of points refused by…
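The filter technique mentioned in the abstract can be pictured with a small sketch. In filter methods of the Fletcher and Leyffer type, each candidate point is summarized by its objective value f and a constraint-violation measure h, and a trial point is refused when some stored pair is at least as good in both measures. The Python snippet below is a minimal illustration under that assumption; the names and the exact acceptance rule are illustrative and are not taken from the paper.

# Minimal sketch of a filter acceptance test (illustrative, not the paper's exact rule).
# Each filter entry is a pair (f, h): objective value and constraint violation.

def is_acceptable(f, h, filter_entries):
    # A trial point (f, h) is refused if some stored entry dominates it,
    # i.e. has objective and violation both no worse.
    return all(not (fe <= f and he <= h) for fe, he in filter_entries)

def add_to_filter(f, h, filter_entries):
    # Insert (f, h) and drop any entries it dominates.
    kept = [(fe, he) for fe, he in filter_entries if not (f <= fe and h <= he)]
    kept.append((f, h))
    return kept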

Cited by 4 publications (15 citation statements)
References 33 publications (51 reference statements)
“…DDS is a point-to-point stochastic heuristic global search algorithm that requires no parameter tuning; the search is scaled to a user-specified maximum number of function evaluations (MaxIter) [43]. Because it is simple to program and searches globally, it has attracted considerable attention from researchers.…”
Section: DDS Algorithm (mentioning)
confidence: 99%
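For readers unfamiliar with DDS, the following is a minimal Python sketch of the bound-constrained loop the excerpt describes: perturb a randomly chosen subset of coordinates of the current best point and keep the trial only if it improves the objective. The normally distributed step with reflection at the bounds and the decaying inclusion probability follow Tolson and Shoemaker's original description of DDS; all names are illustrative and not code from the cited works.

import math
import random

def dds(objective, lower, upper, max_iter, r=0.2, seed=None):
    # Minimal DDS sketch: perturb a random subset of coordinates of the
    # current best point; keep the trial only if it improves the objective.
    rng = random.Random(seed)
    n = len(lower)
    best = [rng.uniform(lower[j], upper[j]) for j in range(n)]
    best_val = objective(best)
    for i in range(1, max_iter):
        # Inclusion probability decays with the iteration count, so the
        # search shifts from global to local (see the next excerpt).
        p = 1.0 - math.log(i) / math.log(max_iter)
        dims = [j for j in range(n) if rng.random() < p] or [rng.randrange(n)]
        trial = best[:]
        for j in dims:
            step = r * (upper[j] - lower[j]) * rng.gauss(0.0, 1.0)
            x = trial[j] + step
            # Reflect the step back inside the bounds.
            if x < lower[j]:
                x = lower[j] + (lower[j] - x)
                if x > upper[j]:
                    x = lower[j]
            elif x > upper[j]:
                x = upper[j] - (x - upper[j])
                if x < lower[j]:
                    x = upper[j]
            trial[j] = x
        trial_val = objective(trial)
        if trial_val < best_val:
            best, best_val = trial, trial_val
    return best, best_val

For example, dds(lambda x: sum(v * v for v in x), [-5.0] * 3, [5.0] * 3, 1000) minimizes a simple quadratic over a box.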
“…As the number of iterations approaches the maximum, the algorithm evolves into a local search. The key idea that lets the DDS algorithm transition from a global search to a local search is to dynamically and probabilistically reduce the number of dimensions to be perturbed in the neighborhood of the current best solution [11,43]. This dimension reduction can be summarized as follows: in each iteration, the jth variable is randomly selected with probability P_t from the m decision variables for inclusion in the perturbation set I_perturb.…”
Section: DDS Algorithm (mentioning)
confidence: 99%
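The probability P_t in this excerpt decreases with the iteration counter. Assuming the schedule P_t = 1 - ln(t)/ln(MaxIter) used in the original DDS, the short snippet below shows how the expected size of the perturbation set I_perturb shrinks over the run, which is the global-to-local transition the excerpt describes; m and MaxIter are illustrative values.

import math

def inclusion_probability(t, max_iter):
    # Assumed DDS schedule: probability that any given decision variable
    # is included in the perturbation set I_perturb at iteration t.
    return 1.0 - math.log(t) / math.log(max_iter)

m, max_iter = 10, 1000  # m decision variables, MaxIter function evaluations
for t in (1, 10, 100, 500, 999):
    p = inclusion_probability(t, max_iter)
    print(f"t={t:4d}  P_t={p:.3f}  expected |I_perturb| = {m * p:.1f}")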
“…[line-numbered pseudocode excerpt from the citing paper's algorithm listing; the listing did not survive text extraction]…”
Section: The Golden Section (GS) Can Guide the Solutions To Search For (unclassified)
“…This special search mechanism of the DDS algorithm is achieved by dynamically and probabilistically reducing the number of dimensions perturbed in the neighborhood [7]. Different versions of DDS have been proposed and successfully applied to practical engineering optimization problems, such as the hybrid discrete dynamically dimensioned search (HD-DDS), used to solve discrete, single-objective, constrained water distribution system (WDS) design problems [8]; the modified dynamically dimensioned search (MDDS), presented to optimize the parameters of a distributed hydrological model [16]; the DDS algorithm used to automate the calibration of an unsteady river flow model [17]; the Pareto archived dynamically dimensioned search (PA-DDS), applied to multi-objective optimization [18]; and the combination of the filter method and dynamically dimensioned search designed for constrained global optimization problems [19]. Although the DDS algorithm partly overcomes the common drawback of single-solution-based search algorithms, it still does not fully resolve the slow convergence to the global optimum in the best case, or to an acceptable local optimum in the worst case.…”
Section: Introduction (mentioning)
confidence: 99%