Dynamic population variation in genetic programming (2009)
DOI: 10.1016/j.ins.2008.12.009

Cited by 20 publications (7 citation statements)
References 53 publications
“…We must emphasize that we use a variable-size population, which allows us to obtain a small set of rules; studies on the use of variable-size populations in GP are reported in [51,52]. The algorithm uses a two-level hierarchical inference process because it learns two different types of rules: primary rules, which are strong, general rules generated by the genetic operators, and secondary rules, which are weaker, more specific rules generated after the token competition procedure in order to increase the diversity of the population.…”
Section: GP-COACH Algorithm
confidence: 99%
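The variable-size population idea quoted above can be sketched as a loop whose population shrinks while the search is making progress and grows when it stagnates. This is an illustrative assumption about one possible resize rule, not the exact scheme from the cited paper or from GP-COACH; the individuals here are simple real values rather than GP trees, to keep the sketch self-contained.

```python
import random

def evolve(fitness, init_size=50, min_size=20, max_size=100, generations=40):
    """Minimal sketch of an evolutionary loop with a dynamically varying
    population size. The resize heuristic (shrink on improvement, grow on
    stagnation) is an illustrative assumption, not the authors' method."""
    pop = [random.uniform(-5, 5) for _ in range(init_size)]
    best = min(fitness(x) for x in pop)
    for _ in range(generations):
        # Variation: mutate parents chosen by 3-way tournament selection.
        def tournament():
            return min(random.sample(pop, 3), key=fitness)
        pop = [tournament() + random.gauss(0, 0.5) for _ in pop]
        new_best = min(fitness(x) for x in pop)
        if new_best < best:
            # Progress was made: a slightly smaller population suffices.
            target = max(min_size, len(pop) - 2)
        else:
            # Stagnation: add fresh random individuals to restore diversity.
            target = min(max_size, len(pop) + 4)
        if target < len(pop):
            pop = sorted(pop, key=fitness)[:target]   # keep the fittest
        else:
            pop += [random.uniform(-5, 5) for _ in range(target - len(pop))]
        best = min(best, new_best)
    return best, len(pop)
```

The point of the sketch is only the feedback loop between search progress and population size; the token-competition step that generates secondary rules in the quoted algorithm has no counterpart here.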
“…These characteristics make such algorithms a paradigm of growing interest, both for obtaining classification rules [36,56] and for other prediction-related tasks such as feature selection [19,41] and the generation of discriminant functions [18,31]. Other algorithms use the GP paradigm to evolve rule sets for both two-class [52,64] and multiple-class [42,71] classification problems, showing that GP is a mature field that efficiently achieves low error rates in supervised learning and is still introducing improvements into its methods [32]. These results suggest that it would be interesting to adapt this paradigm to multiple instance learning and check its performance.…”
Section: G3P-MI
confidence: 99%
“…Increasing the mutation probability when the population is tightly grouped is a way of avoiding premature convergence to a local extremum. The standard GA operators (crossover and a fixed mutation rate) might have sufficed to sample the search space, but new strategies for varying the population and managing its evolution are being explored in the literature to improve the behaviour of the algorithm [13,14]. In this particular application, we found that a variable mutation rate provides faster convergence than a fixed mutation rate and arrives at a better solution.…”
Section: GA Operators
confidence: 99%
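The rule described above — raise the mutation probability when the population is tightly grouped — can be sketched with a simple diversity proxy. The function name, the use of standard deviation as the grouping measure, and the thresholds are all illustrative assumptions, not the cited authors' formula.

```python
import statistics

def adaptive_mutation_rate(population, p_min=0.01, p_max=0.25, spread_ref=1.0):
    """Map population diversity to a mutation probability.

    Heuristic sketch (all parameters are illustrative): the population's
    standard deviation serves as a diversity proxy. A fully converged
    population (spread 0) gets p_max; a spread at or above spread_ref
    gets p_min.
    """
    spread = statistics.pstdev(population)
    t = min(spread / spread_ref, 1.0)   # 0 = tightly grouped, 1 = diverse
    return p_max - t * (p_max - p_min)
```

A GA loop would call this once per generation and use the returned probability when deciding whether to mutate each offspring, so mutation pressure rises exactly when the search risks premature convergence.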
“…Fig. 3 (for a constant mutation rate) and Fig. 4 (for a variable mutation rate) depict the evolution of the fitness function distribution (13) in successive generations. The variable mutation rate yields faster convergence and produces a better solution.…”
Section: Low Earth Orbit
confidence: 99%