2021
DOI: 10.3390/sym13091706

An Improved Equilibrium Optimizer Algorithm and Its Application in LSTM Neural Network

Abstract: An improved equilibrium optimizer (EO) algorithm is proposed in this paper to address premature and slow convergence. Firstly, a highly stochastic chaotic mechanism is adopted to initialize the population and expand its range. Secondly, the capability of the global search to jump out of local optima is enhanced by assigning adaptive weights and setting adaptive convergence factors. In addition, 25 classical benchmark functions are used to validate the algorithm. As revealed by the analysis of the accuracy, s…
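The chaotic initialization the abstract describes can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: the excerpt does not name the chaotic map, so a logistic map (a common choice in chaos-based initializers) is assumed, and the function name `chaotic_init` and its parameters are invented for the example.

```python
import numpy as np

def chaotic_init(pop_size, dim, lb, ub, seed=0.7):
    """Spread an initial population over [lb, ub] using the logistic map.

    Assumption: the logistic map x_{k+1} = 4*x_k*(1 - x_k) stands in for
    the (unspecified) chaotic mechanism from the paper.
    """
    n = pop_size * dim
    seq = np.empty(n)
    x = seed                      # seed in (0, 1), avoiding fixed points
    for k in range(n):
        x = 4.0 * x * (1.0 - x)  # one logistic-map iteration
        seq[k] = x
    # Scale the chaotic sequence from (0, 1) into the search range.
    return lb + seq.reshape(pop_size, dim) * (ub - lb)

pop = chaotic_init(pop_size=30, dim=5, lb=-10.0, ub=10.0)
print(pop.shape)  # (30, 5)
```

Compared with uniform random initialization, a chaotic sequence is deterministic yet non-repeating, which is the property the abstract credits with expanding the initial population's coverage of the range.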

Cited by 14 publications (4 citation statements)
References 36 publications
“…Optimization and meta-heuristic algorithms are currently two of the hottest topics in computer science due to their presence in several domains, such as feature selection problems [10][11][12], facial recognition [13,14], opinion mining [15,16], the identification of parameters in photovoltaic applications [17,18], economic load dispatch problems [19,20], bin packing problems [21,22], software cost estimations [23], traveling salesman problems [24], constrained engineering problems [25], and continuous optimization problems [26,27]. According to the no free lunch (NFL) theory [28], no algorithm can discover the optimal solution to all problems; hence, numerous optimization approaches exist in the literature. In other words, if an algorithm can determine the optimal answer for a particular problem, it will fail for other types.…”
Section: Introduction
confidence: 99%
“…LSTM mainly improves the hidden layer of the RNN: the LSTM network model adds a cell state to the RNN's hidden layer for the long-term preservation of information [24][25][26].…”
Section: Neural Network of LSTM
confidence: 99%
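The cell state mentioned in the excerpt above is the key difference between an LSTM and a plain RNN hidden layer. A minimal single-step sketch, assuming the standard gate equations and an illustrative stacked weight layout (the paper's own notation is not shown in this excerpt):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step.

    The cell state c carries long-term information alongside the hidden
    state h. W has shape (4H, d), U has shape (4H, H), b has shape (4H,),
    stacking the forget, input, output, and candidate blocks.
    """
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b        # joint pre-activation for all gates
    f = sigmoid(z[0:H])               # forget gate: what to keep from c_prev
    i = sigmoid(z[H:2*H])             # input gate: how much new content to admit
    o = sigmoid(z[2*H:3*H])           # output gate: how much of c to expose
    g = np.tanh(z[3*H:4*H])           # candidate cell update
    c = f * c_prev + i * g            # new cell state (long-term memory)
    h = o * np.tanh(c)                # new hidden state (short-term output)
    return h, c

rng = np.random.default_rng(0)
d, H = 3, 4
W = rng.normal(size=(4 * H, d))
U = rng.normal(size=(4 * H, H))
b = np.zeros(4 * H)
h, c = lstm_step(rng.normal(size=d), np.zeros(H), np.zeros(H), W, U, b)
print(h.shape, c.shape)  # (4,) (4,)
```

The additive update `c = f * c_prev + i * g` is what lets gradients and information persist across many steps, which a vanilla RNN's purely multiplicative hidden-state update cannot do reliably.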
“…Further, experimental results clearly highlight that the proposed E²O method provides superior performance compared with EO on engineering problems. The Improved Equilibrium Optimizer (IEO), a modified version of EO introduced by Lan et al. in 2021 [149], uses a chaotic mechanism for population range expansion, along with adaptive weights and an adaptive convergence factor to avoid becoming trapped in local optima, thereby improving on standard EO. Additionally, experimental results demonstrate that the proposed IEO approach for the LSTM neural network outperforms PSO, GWO, SOA, WOA, CSA, MPA, COA, CPA, TSA, and GA.…”
Section: Chaotic Equilibrium Optimizer
confidence: 99%
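The adaptive convergence factor and adaptive weights credited to IEO in the excerpt above can be sketched generically. The exact schedules are not given in this excerpt, so both functions below are illustrative assumptions: a nonlinearly decreasing factor (common in EO-style algorithms for shifting from exploration to exploitation) and a fitness-normalized weight.

```python
import numpy as np

def adaptive_factor(t, T, a_initial=2.0, a_final=0.0, k=2.0):
    """Nonlinearly decreasing convergence factor over iterations t = 0..T.

    Assumption: a polynomial decay from a_initial to a_final; large early
    values favor exploration, small late values favor exploitation.
    """
    return a_final + (a_initial - a_final) * (1.0 - t / T) ** k

def adaptive_weight(fit, best_fit, worst_fit, eps=1e-12):
    """Weight a candidate by normalized fitness (minimization).

    Assumption: better (lower) fitness maps to a weight near 1, worse
    fitness to a weight near 0, so stronger candidates pull the update more.
    """
    return (worst_fit - fit + eps) / (worst_fit - best_fit + eps)

T = 100
factors = [adaptive_factor(t, T) for t in range(T + 1)]
print(factors[0], factors[-1])  # 2.0 0.0
```

Because `k > 1`, the factor stays high for more of the run than a linear decay would, which is one common way such schedules delay the switch to local exploitation.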