2018
DOI: 10.14311/nnw.2018.28.007
Genetic Programming With Either Stochastic or Deterministic Constant Evaluation

Abstract: Constant evaluation is a key problem in symbolic regression solved by means of genetic programming. Other evolutionary methods are often used for constant evaluation; typical examples are variants of genetic programming or evolutionary systems, all of which are stochastic. This article compares these methods with a deterministic approach based on exponentiated gradient descent. All methods were tested on a single sample function to maintain identical conditions, and the results are presented in graphs. Fin…
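The deterministic approach named in the abstract, exponentiated gradient descent, replaces the usual additive gradient step with a multiplicative one, which keeps a positive constant positive throughout the search. The sketch below is a hypothetical illustration of that update for a single constant c in the model f(x) = c·sin(x); the paper's actual model, loss, and step size are not given in this excerpt and may differ.

```python
import math

def eg_fit(xs, ys, c=1.0, eta=0.05, steps=200):
    """Tune a positive constant c in f(x) = c*sin(x) with an
    exponentiated gradient update: c <- c * exp(-eta * dL/dc)."""
    for _ in range(steps):
        # gradient of the mean squared error with respect to c
        grad = sum(2 * (c * math.sin(x) - y) * math.sin(x)
                   for x, y in zip(xs, ys)) / len(xs)
        # multiplicative (exponentiated) step keeps c > 0
        c *= math.exp(-eta * grad)
    return c

xs = [i * 0.1 for i in range(60)]
ys = [2.5 * math.sin(x) for x in xs]  # data generated with the constant 2.5
print(round(eg_fit(xs, ys), 2))  # → 2.5
```

Unlike the stochastic alternatives surveyed in the paper, this update is fully deterministic: the same data and starting value always produce the same constant.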


Cited by 3 publications (1 citation statement)
References 8 publications
“…Raidl and Gunther in [11] introduced HGP (hybrid genetic programming), which adds weights to the top-level tree members and optimizes them using a robust least-squares method. There are many modern approaches for optimizing the constants in GP: for example, gradient descent [12,13], simulated annealing combined with the simplex method [14], particle swarm optimization (PSO) [15], multiple regression in the STROGANOFF method [16,17], evolution strategies [13,18,19], genetic algorithms [13], the self-organizing migrating algorithm (SOMA) [13,20], the Bison Seeker algorithm [21], and non-linear optimization using the Levenberg-Marquardt algorithm [22,23].…”
Section: Optimization of Genetic Programming and Symbolic Regression