Proceedings of the 2002 Congress on Evolutionary Computation. CEC'02 (Cat. No.02TH8600)
DOI: 10.1109/cec.2002.1004435

Solving the symbolic regression problem with tree-adjunct grammar guided genetic programming: the comparative results

Abstract: In this paper, we show some experimental results of tree-adjunct grammar guided genetic programming [6] (TAG3P) on the symbolic regression problem, a benchmark problem in genetic programming. We compare the results with genetic programming [9] (GP) and grammar guided genetic programming [14] (GGGP). The results show that TAG3P significantly outperforms GP and GGGP on the target functions attempted in terms of probability of success. Moreover, TAG3P still performed well when the structural complexity of the tar…
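The benchmark setup is not spelled out on this page. As a rough illustration only, the sketch below assumes the classic quartic target F(x) = x^4 + x^3 + x^2 + x sampled on [-1, 1] and scores a candidate expression by its summed absolute error, the usual fitness measure for GP symbolic regression; the function names, sample count, and interval are illustrative assumptions, not details taken from the paper.

import random

def target(x):
    # Assumed quartic benchmark target: x^4 + x^3 + x^2 + x
    return x**4 + x**3 + x**2 + x

def fitness(candidate, n_points=20, lo=-1.0, hi=1.0):
    # Summed absolute error of `candidate` against the target on
    # n_points uniform random sample points; 0.0 means a perfect fit.
    error = 0.0
    for _ in range(n_points):
        x = random.uniform(lo, hi)
        error += abs(candidate(x) - target(x))
    return error

# Example: score a deliberately crude candidate, x^2 + x
print(fitness(lambda x: x**2 + x))

A run is typically counted as a success when this error falls below a small threshold on every sample point, which is how a "probability of success" over many independent runs is obtained.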

Cited by 48 publications (32 citation statements)
References 4 publications

“…TAGs have shown promise in the field of Genetic Programming (GP) [5,6,7,17] as well as other fields in natural computing [1]. This promise carried over when TAGs were incorporated into GE, i.e., Tree-Adjunct Grammatical Evolution (TAGE), in the form of an increased ability to find fitter solutions in fewer generations and an increased success rate [16].…”
Section: Introduction (mentioning)
confidence: 99%
“…In order to compare the performance of the proposed BBP with other algorithms, a set of 10 real-valued symbolic regression problems, described in [7,15,34–36], was used. These problems are grouped into three categories: polynomial functions, trigonometric, logarithm and square root functions and bivariate functions which are shown in Table 1.…”
Section: Results (mentioning)
confidence: 99%
“…Let us state the brief formulation of the symbolic regression problem [5]. It is required to find such a superposition which would provide the maximum (minimum) of the given functional.…”
Section: Problem Formulation (mentioning)
confidence: 99%
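Stated as an optimization problem, and in notation chosen here rather than recovered from the cited paper, that formulation reads roughly as
\[
f^{*} \;=\; \arg\min_{f \in \mathcal{F}} \; J(f),
\qquad
J(f) \;=\; \sum_{i=1}^{N} \bigl| f(x_i) - y_i \bigr|,
\]
where $\mathcal{F}$ is the set of superpositions (expression trees) built from a fixed set of elementary functions and terminals, and $(x_i, y_i)$, $i = 1, \dots, N$, are sample points of the unknown target function.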