2021
DOI: 10.1002/int.22658
Gaussian bare‐bones gradient‐based optimization: Towards mitigating the performance concerns

Abstract: The gradient‐based optimizer (GBO) is a metaphor‐free, mathematics‐based algorithm proposed in recent years. Inspired by the gradient‐based Newton's method, the algorithm is combined with population‐based evolutionary methods. The disadvantage of the traditional GBO algorithm is that its global search ability is too strong while its local search ability is too weak; accordingly, it is difficult to obtain the global optimal solution efficiently. Therefore, a new improved GBO algorithm (GOMGBO) is developed…
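The abstract's idea, a Newton-inspired step applied inside a population-based search, can be illustrated with a minimal sketch. This is an assumption-laden simplification, not the published GBO update rules (the actual Gradient Search Rule and Local Escaping Operator are more involved); all function and variable names here are illustrative.

```python
import numpy as np

def newton_like_population_step(pop, fitness, lb, ub, rng):
    """One illustrative, Newton-inspired update for a population-based
    optimizer in the spirit of GBO. NOT the published GBO update rule.

    pop     : (n, d) array of candidate solutions
    fitness : (n,) objective values (lower is better)
    lb, ub  : scalar or per-dimension bounds
    """
    best = pop[np.argmin(fitness)]
    worst = pop[np.argmax(fitness)]
    new_pop = np.empty_like(pop)
    for i, x in enumerate(pop):
        # Surrogate "gradient": direction from worst toward best member,
        # standing in for the true derivative of Newton's method, which
        # is unavailable in black-box search.
        grad_like = (worst - best) / (np.abs(worst - best).mean() + 1e-12)
        step = rng.random() * rng.normal(size=x.shape) * grad_like
        # Newton-like move plus an attraction toward the current best.
        new_pop[i] = np.clip(x - step + rng.random() * (best - x), lb, ub)
    return new_pop
```

Under this sketch, the population-wide best/worst information plays the role that the derivative plays in classical Newton iteration, which is the key hybridization the abstract describes.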

Cited by 9 publications (5 citation statements) | References 174 publications
“…In other words, the GBO, like many other optimization algorithms, cannot maintain its efficacy on all problems (i.e., the no-free-lunch theorem [137]). Thus, researchers have made modifications to fix the weaknesses of GBO, such as speeding up convergence [44], avoiding getting stuck in local optima [45], improving performance [49], and balancing exploration and exploitation [138]. These new versions can be used to solve various problems.…”
Section: Results
confidence: 99%
“…In [49], the authors addressed the weakness of the GBO (i.e., its local search mechanism) by introducing a new improved version of GBO, called GOMGBO, which includes an opposition-based learning strategy, a Gaussian bare-bones strategy, and a moth spiral strategy. Thirty benchmark functions were applied to evaluate the performance of GOMGBO.…”
Section: Modification
confidence: 99%
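Two of the three strategies named above have standard, well-known forms that can be sketched briefly. The snippet below shows the textbook versions of opposition-based learning and Gaussian bare-bones sampling (the latter from bare-bones PSO); how GOMGBO actually combines them with the moth spiral operator follows the cited paper's specifics, which this sketch does not reproduce.

```python
import numpy as np

def opposition_based_learning(pop, lb, ub):
    """Standard opposition-based learning: for each candidate x in
    [lb, ub], form its 'opposite' point lb + ub - x. The caller then
    keeps the better half of the doubled candidate pool."""
    return lb + ub - pop

def gaussian_bare_bones_step(personal_best_i, global_best, rng):
    """Textbook Gaussian bare-bones sampling: draw the new position
    from a normal distribution centred midway between the personal
    and global best, with deviation equal to their distance."""
    mu = 0.5 * (personal_best_i + global_best)
    sigma = np.abs(personal_best_i - global_best)
    return rng.normal(mu, sigma)
```

Opposition-based learning widens exploration cheaply (each evaluation probes two symmetric regions of the box), while the bare-bones step concentrates sampling between attractors, which is one plausible way to strengthen the local search ability the quoted critique says GBO lacks.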
“…To further validate IRIME's performance, this study compared it with some advanced algorithms using the IEEE CEC 2017 benchmark test suite. These algorithms include EBOwithCMAR [80], LSHADE_cnEpSi [73], ALCPSO [81], CLPSO [82], LSHADE [83], SADE [84], JADE [85], RCBA [86], EPSO [87], CBA [88], and LWOA [89]. The specific experimental data is detailed in Table A6 (Appendix). The results in the table highlight that IRIME, alongside some advanced algorithms, achieves performance near the theoretical optimum on functions such as F1, F3, F6, and F9.…”
Section: Comparison With Advanced Algorithms
confidence: 99%
“…A case in point is GBO, which relies on population and gradient information. Despite being a new member of MAs, GBO has been applied to many problems [112][113][114] since its introduction.…”
Section: Gradient-Based Optimizer (GBO)
confidence: 99%