2022
DOI: 10.1007/s11831-022-09872-y

Gradient-Based Optimizer (GBO): A Review, Theory, Variants, and Applications

Abstract: This paper presents a comprehensive survey of a recent population-based algorithm, the gradient-based optimizer (GBO), and analyzes its major features. GBO is regarded as one of the most effective optimization algorithms and has been applied successfully to a wide range of problems and domains. This review organizes the related work on GBO into GBO variants and GBO applications, and evaluates the efficiency of GBO compared with other metaheuristic algorithms. Finally, the conclusions concentrate…
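For orientation, the sketch below illustrates the population-based, gradient-inspired search loop that the surveyed algorithm builds on. It is a deliberately simplified illustration, not the published GBO operator: the real algorithm combines a Gradient Search Rule (GSR) with a Local Escaping Operator (LEO), whereas here only a GSR-like pull toward the best member and push away from the worst is sketched, and all parameter choices are assumptions of this demo.

```python
import numpy as np

def sphere(x):
    """Smooth convex test function: f(x) = sum(x_i^2)."""
    return np.sum(x ** 2)

def gbo_sketch(f, dim=5, pop=30, iters=200, seed=0):
    """Illustrative, simplified GBO-style loop (not the published rule).

    Each member takes a GSR-like step toward the current best solution
    and away from the current worst, plus a small random perturbation
    standing in for the stochastic gradient estimate of the real GSR.
    """
    rng = np.random.default_rng(seed)
    X = rng.uniform(-5.0, 5.0, size=(pop, dim))
    fit = np.apply_along_axis(f, 1, X)
    for _ in range(iters):
        best, worst = X[fit.argmin()], X[fit.argmax()]
        for i in range(pop):
            r1, r2 = rng.random(dim), rng.random(dim)
            step = r1 * (best - X[i]) + r2 * (X[i] - worst)
            cand = X[i] + step + 0.01 * rng.standard_normal(dim)
            fc = f(cand)
            if fc < fit[i]:          # greedy replacement
                X[i], fit[i] = cand, fc
    return X[fit.argmin()], fit.min()

if __name__ == "__main__":
    x_best, f_best = gbo_sketch(sphere)
    print(f_best)  # should be near 0 on the sphere function
```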

Citations: cited by 17 publications (5 citation statements)
References: 132 publications
“…Nevertheless, the current work demonstrates that the emergent metastability in an interneuron population depends on their connectivity with excitatory neurons and the frequency of the excitatory drive. Their relationships are revealed in the curves that show robust gradients (Daoud et al, 2023) along the dimensions excitatory-inhibitory connection probability (Figure 4A) and the frequency of the excitatory drive (Figures 3, 4B). Thus, a numerical optimization using [metastability] as an objective function can, in principle, estimate optimal intraregional connectivity configurations among groups of neurons with heterogeneous frequency profiles.…”
Section: Discussion
confidence: 94%
“…Our results suggest that the interneuron population’s emergent attractors depend primarily on the E-I connectivity and the frequency of the driving excitatory oscillations. There are robust gradients ( Daoud et al, 2023 ) towards maximally stable interneuron phase cluster arrangements along the dimensions of pyramidal frequency and E-I connection probability ( Figures 3 , 4 ).…”
Section: Results
confidence: 99%
“…Gradient-Based Optimization methods present several merits in optimization. They exhibit notable advantages, particularly their rapid convergence, especially when dealing with smooth and convex objective functions (Daoud et al, 2023). Moreover, their suitability for high-dimensional problems, coupled with their amenable parallelization, renders them highly efficient in resource-rich computational settings.…”
Section: Gradient-Based Algorithms vs. Metaheuristic Algorithms in Opt...
confidence: 99%
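The rapid-convergence claim in the excerpt above is easy to demonstrate with plain gradient descent on a smooth, convex objective. The sketch below is illustrative only and does not come from the cited paper; the quadratic objective, the matrix A, and the step size are assumptions chosen for clarity.

```python
import numpy as np

# Gradient descent on f(x) = 0.5 * x^T A x - b^T x, a smooth convex
# quadratic. A, b, and the step size eta are assumptions of this demo.
A = np.array([[3.0, 0.5],
              [0.5, 2.0]])    # symmetric positive definite -> convex f
b = np.array([1.0, -1.0])

def grad(x):
    return A @ x - b          # gradient of the quadratic

x = np.zeros(2)
eta = 0.25                    # stable since eta < 2 / lambda_max(A)
for _ in range(50):
    x = x - eta * grad(x)     # classic gradient-descent update

x_star = np.linalg.solve(A, b)      # closed-form minimizer for comparison
print(np.linalg.norm(x - x_star))   # near 0 after only 50 iterations
```

On problems like this, the error shrinks geometrically at every step, which is the convergence behavior the excerpt attributes to gradient-based methods on smooth convex functions.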
“…The main feature of this class of algorithms is the use of specific mathematical methods as inspiration for building the algorithm. Some of these algorithms are more competitive, but their search mechanisms are still essentially linear combinations or weight-based combinations, such as the gradient-based optimizer (GBO) 51 , generalized normal distribution optimization (GNDO) 52 , geometric mean optimizer (GMO) 53 , arithmetic optimization algorithm (AOA) 54 , and subtraction-average-based optimizer (SABO) 55 . Other algorithms contribute more unique search mechanisms.…”
Section: Introduction
confidence: 99%
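To make the "linear combinations or weight-based combinations" characterization in the excerpt above concrete, here is a generic candidate-generation step of that family. It is not the published rule of any single algorithm listed; the weight ranges and donor-selection scheme are illustrative assumptions.

```python
import numpy as np

def weighted_combination_step(X, fitness, i, rng):
    """Generic weight-based combination move of the kind the excerpt
    attributes (in varying forms) to GBO-, GNDO-, GMO-, AOA-, and
    SABO-style algorithms. Weights are illustrative assumptions.

    The new candidate blends the current member, the best member,
    and the difference of a random donor pair.
    """
    pop = len(X)
    best = X[np.argmin(fitness)]
    a, c = rng.choice([j for j in range(pop) if j != i], 2, replace=False)
    w1, w2 = rng.random(), rng.random()
    # Candidate = current + pull toward best + scaled donor difference.
    return X[i] + w1 * (best - X[i]) + w2 * (X[a] - X[c])

# Usage sketch on a small random population:
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(10, 4))
fitness = np.sum(X ** 2, axis=1)          # sphere objective
cand = weighted_combination_step(X, fitness, i=0, rng=rng)
print(cand.shape)  # (4,) -- one new candidate solution
```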