Proceedings of the Genetic and Evolutionary Computation Conference Companion 2017
DOI: 10.1145/3067695.3076055
Dynamic GP fitness cases in static and dynamic optimisation problems

Abstract: In Genetic Programming (GP), the fitness of individuals is normally computed by using a set of fitness cases (FCs). In this work, we use the whole FC set, but rather than adopting the commonly used GP approach of presenting the entire set to the system from the beginning of the search, referred to as static FCs, we allow the GP system to build it by aggregation over time, referred to as dynamic FCs, in the hope of making the search more amenable. Results on eight symbolic regression functions indicate that the proposed ap…
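The abstract contrasts static FCs (the full set available from generation zero) with dynamic FCs (the set built by aggregation over time). The excerpt does not specify the aggregation schedule, so the sketch below assumes a simple linear schedule in which an equal share of the cases is revealed each generation; the function name and schedule are illustrative, not the authors' implementation.

```python
import math

def dynamic_fitness_cases(all_cases, n_generations):
    """Yield the growing fitness-case subset used at each generation.

    Assumes a linear reveal schedule: roughly len(all_cases)/n_generations
    new cases are aggregated per generation, so the final generation
    evaluates on the whole set (matching the paper's use of all FCs).
    """
    per_gen = math.ceil(len(all_cases) / n_generations)
    revealed = []
    for gen in range(n_generations):
        # Aggregate the next slice of cases onto those already revealed.
        revealed.extend(all_cases[gen * per_gen:(gen + 1) * per_gen])
        yield list(revealed)
```

For example, with 8 fitness cases and 4 generations the subsets grow as 2, 4, 6, 8 cases, and a static-FC run corresponds to `n_generations=1`.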

Cited by 7 publications (3 citation statements); references 1 publication (2 reference statements).
“…The range of problem domains for GP is wide, and this form of EA has been found to be beneficial for problems with multiple local optima and for problems with a varying degree of complexity [9], making EAs ideal for highly complex problems including the automatic configuration of deep neural networks' architectures and their training (an in-depth recent literature review in this emerging research area can be found in [8]). However, despite the well-documented effectiveness of canonical GP, there are well-known limitations of these methods, revealed through the study of properties of encodings [17], [18], and research is ongoing into finding and developing approaches to improve their overall performance, including promoting neutrality in deceptive and challenging landscapes [16], [20], [38], [39], dynamic fitness cases [23], [24], reuse of code [21], variants of GP [13], and the use of surrogate models [27], [28], to mention a few examples. Successful examples of these improvements can be found in applicable areas including energy-based problems [15], [22].…”
Section: Related Work
confidence: 99%
“…In this work, we use a distance, studied in the first author's works [1,3], that accounts for pairwise disagreements between two lists of ranked fitness values. We hope that these disagreements can inform us on whether an evolved population is useful in the face of a change.…”
Section: Kendall Tau Distance
confidence: 99%
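The citing work above uses a distance that counts pairwise disagreements between two lists of ranked fitness values, i.e. the Kendall tau distance. As a standalone illustration (the function name and a brute-force pairwise loop are my own; the citing papers may compute it differently), a minimal sketch:

```python
def kendall_tau_distance(rank_a, rank_b):
    """Count pairwise disagreements between two rankings of the same items.

    A pair of items counts as a disagreement when the two rankings
    order it differently. Identical rankings give 0; fully reversed
    rankings give n*(n-1)/2.
    """
    # Position of each item in the second ranking, for O(1) lookup.
    pos_b = {item: i for i, item in enumerate(rank_b)}
    disagreements = 0
    n = len(rank_a)
    for i in range(n):
        for j in range(i + 1, n):
            # rank_a places rank_a[i] before rank_a[j];
            # it is a disagreement if rank_b reverses that order.
            if pos_b[rank_a[i]] > pos_b[rank_a[j]]:
                disagreements += 1
    return disagreements
```

Applied as in the statement above, the two inputs would be the rankings induced by a population's fitness values before and after a change; a large distance suggests the change has substantially reshuffled which individuals look good.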
“…including promoting neutrality in deceptive landscapes [25], [26], dynamic fitness cases [16], [17], to mention a few examples.…”
Section: Introduction
confidence: 99%