2022
DOI: 10.3390/s22103836

An Evolutionary Field Theorem: Evolutionary Field Optimization in Training of Power-Weighted Multiplicative Neurons for Nitrogen Oxides-Sensitive Electronic Nose Applications

Abstract: Neuroevolutionary machine learning is an emerging topic in the evolutionary computation field and enables practical modeling solutions for data-driven engineering applications. Contributions of this study to the neuroevolutionary machine learning area are twofold: firstly, this study presents an evolutionary field theorem of search agents and suggests an algorithm for Evolutionary Field Optimization with Geometric Strategies (EFO-GS) on the basis of the evolutionary field theorem. The proposed EFO-GS algorithm…

Cited by 8 publications (4 citation statements)
References 82 publications
“…Developing a metaheuristic method typically involves two steps: exploration and exploitation. As mentioned in [9], exploring metaheuristic methods entails studying and evaluating various techniques to assess their suitability for solving a specific problem. Another approach to exploring metaheuristic methods involves experimenting with different variations of a particular algorithm.…”
Section: Literature Review and Problem Statement
mentioning
confidence: 99%
“…Previous studies have identified that one of the reasons for the slow training process of neural networks is the inappropriate number of hidden layers [7]. However, currently, the average standard number of neurons that are set to determine the optimal number of hidden layers in a model still varies [4,8–10].…”
Section: Introduction
mentioning
confidence: 99%
“…These algorithms are mainly based on the evolutionary process of organisms in the natural world and adopt the "survival of the fittest" theory to realize the optimization of the search space. Cooperative co-evolutionary algorithms (CCEA) [22], evolutionary mating algorithms (EMA) [23], evolutionary field optimization algorithms (EFO) [24], and the quantum-based avian navigation optimizer algorithm (QANA) [25] belong to this type of algorithm. The main search mechanisms of such algorithms are crossover and mutation.…”
Section: Introduction
mentioning
confidence: 99%
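The crossover and mutation operators referred to in the statement above are the generic variation mechanisms of evolutionary algorithms. As a point of reference only, the following is a minimal sketch of real-coded uniform crossover and Gaussian mutation; the operator choices, rates, and scale are illustrative assumptions and do not reproduce the specific mechanisms of CCEA, EMA, EFO-GS, or QANA.

```python
import random

def crossover(parent_a, parent_b, rate=0.9):
    """Uniform crossover on two real-valued genomes (illustrative rate)."""
    if random.random() > rate:
        # No crossover: return copies of the parents unchanged.
        return parent_a[:], parent_b[:]
    child_a, child_b = [], []
    for gene_a, gene_b in zip(parent_a, parent_b):
        # Each gene position is inherited straight or swapped with probability 0.5.
        if random.random() < 0.5:
            child_a.append(gene_a)
            child_b.append(gene_b)
        else:
            child_a.append(gene_b)
            child_b.append(gene_a)
    return child_a, child_b

def mutate(genome, rate=0.1, scale=0.05):
    """Gaussian perturbation mutation (illustrative rate and scale)."""
    return [g + random.gauss(0.0, scale) if random.random() < rate else g
            for g in genome]

# Example usage with two arbitrary 4-gene parents.
a, b = [0.1, 0.2, 0.3, 0.4], [0.9, 0.8, 0.7, 0.6]
c, d = crossover(a, b)
print(mutate(c), mutate(d))
```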
“…The study [12] introduces the evolutionary field theorem of search agents and proposes the Evolutionary Field Optimization with Geometric Strategies (EFO-GS) algorithm to improve the quality of evolutionary searches. The study also modifies the multiplicative neuron model to develop Power-Weighted Multiplicative (PWM) neural models, which can better represent polynomial nonlinearity and operate in different modes.…”
mentioning
confidence: 99%
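The Power-Weighted Multiplicative (PWM) neuron is only described above as a modification of the multiplicative neuron model that better represents polynomial nonlinearity. A minimal sketch of one plausible reading, namely a classic multiplicative neuron net = prod_i(w_i*x_i + b_i) extended with trainable per-input exponents p_i, is given below; the exact functional form, parameter names, and activation are assumptions rather than the authors' formulation.

```python
import numpy as np

def pwm_neuron(x, w, b, p):
    """Hypothetical power-weighted multiplicative neuron (assumed form).

    Extends the classic multiplicative neuron net = prod_i(w_i*x_i + b_i)
    with per-input power weights p_i, so the monomial degrees of the
    polynomial nonlinearity become trainable parameters.
    """
    net = np.prod((w * x + b) ** p)
    # Logistic activation, a common choice for multiplicative neuron models.
    return 1.0 / (1.0 + np.exp(-net))

# Example: a 3-input neuron with arbitrary (positive-base) parameters.
x = np.array([0.2, 0.5, 0.1])
w = np.array([1.0, 0.8, 1.2])
b = np.array([0.1, 0.0, 0.2])
p = np.array([2.0, 1.0, 0.5])
print(pwm_neuron(x, w, b, p))
```

In a neuroevolutionary setting such as EFO-GS training, the concatenated parameter vector (w, b, p) would presumably form the genome refined by the evolutionary search rather than by gradient descent.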