2013 IEEE Congress on Evolutionary Computation
DOI: 10.1109/cec.2013.6557774

Improving genetic programming based symbolic regression using deterministic machine learning

Abstract: Symbolic regression (SR) is a well-studied method in genetic programming (GP) for discovering free-form mathematical models from observed data. However, it has not been widely accepted as a standard data science tool. The reluctance is in part due to the hard-to-analyze random nature of GP and to scalability issues. On the other hand, most popular deterministic regression algorithms were designed to generate linear models and therefore lack the flexibility of GP-based SR (GP-SR). Our hypothesis is that h…

Cited by 60 publications (34 citation statements) | References 22 publications
“…By taking advantage of the state-of-the-art deterministic regression methods, our algorithm aims to ease the burden of feature extraction on GP-SR and help it excel in model building, especially when higher order interactions between the variables exist. We have shown on a synthetic data suite with up to fourth order variable interactions that our technique significantly improves the performance of GP-SR as the data dimensionality increases (Icke and Bongard (2013)). …”
Section: Related Work
confidence: 91%
“…The authors verified FFX on a broad set of real-world problems with different numbers of variables ranging from 13 to 1468. Later, Icke and Bongard [10] hybridised FFX and GP to create an improved learner for symbolic regression problems. In their work, the authors showed that a hybrid deterministic/GP approach to symbolic regression outperforms GP alone and several state-of-the-art deterministic regression techniques alone on a set of multivariate polynomial symbolic regression tasks.…”
Section: Related Work
confidence: 99%
“…Later, Icke and Bongard [5] hybridised FFX and GP to create an improved learner for symbolic regression problems. In this work, the authors showed that a hybrid deterministic/GP approach to symbolic regression outperforms GP alone and several state-of-the-art deterministic regression techniques alone on a set of multivariate polynomial symbolic regression tasks.…”
Section: Related Work
confidence: 99%
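The citing papers above describe the hybrid in outline only. As a minimal sketch of that idea, and not the authors' exact pipeline, the snippet below shows one plausible wiring: an FFX-style deterministic pass proposes simple basis features (raw variables, squares, pairwise products) via sparse linear regression, and the surviving features would then be handed to a GP-SR system as extra terminals. The feature construction, the LassoCV selection step, and the final GP hand-off are all assumptions made for illustration.

# Sketch of a hybrid deterministic/GP-SR front end (assumed, not the published method).
import numpy as np
from itertools import combinations_with_replacement
from sklearn.linear_model import LassoCV

def ffx_style_features(X):
    """Build candidate basis functions: raw variables plus pairwise products (incl. squares)."""
    n, d = X.shape
    names, cols = [], []
    for i in range(d):
        names.append(f"x{i}")
        cols.append(X[:, i])
    for i, j in combinations_with_replacement(range(d), 2):
        names.append(f"x{i}*x{j}")
        cols.append(X[:, i] * X[:, j])
    return names, np.column_stack(cols)

def select_features(X, y):
    """Keep only the basis functions that survive an L1-regularised linear fit."""
    names, Phi = ffx_style_features(X)
    model = LassoCV(cv=5).fit(Phi, y)
    return [n for n, c in zip(names, model.coef_) if abs(c) > 1e-6]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-5, 5, size=(100, 4))
    y = 3 * X[:, 0] * X[:, 1] + X[:, 2] ** 2          # toy target with a variable interaction
    print("Features to add to the GP terminal set:", select_features(X, y))
    # A GP-SR run (e.g. with a GP library such as DEAP or gplearn) would then
    # search over these extracted features alongside the raw variables; that
    # step is omitted here and is an assumption about how the hybrid is wired.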
“…The total number of test problems is 20 (i.e., 5 test functions × 4 different variable sizes). For all test problems, we randomly generated three disjoint sets: a training set of 100 points, a validation set of 50 points, and a testing set of 150 points from the interval [-5, 5]. All techniques have been compared based on the average of absolute errors on the testing set.…”
Section: Experimental Settings
confidence: 99%
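The data protocol quoted above is simple enough to restate as code. The following is only a sketch under the stated figures (100/50/150 points sampled from [-5, 5], ranking by mean absolute error on the test set); the test function and the baseline predictor are hypothetical stand-ins, since the citing paper's actual benchmark functions are not reproduced here.

# Sketch of the quoted experimental protocol (test function and predictor are placeholders).
import numpy as np

def make_splits(f, n_vars, rng):
    """Draw disjoint train/validation/test samples for one test problem."""
    sizes = {"train": 100, "val": 50, "test": 150}
    splits = {}
    for name, n in sizes.items():
        X = rng.uniform(-5, 5, size=(n, n_vars))
        splits[name] = (X, f(X))
    return splits

def mean_absolute_error(y_true, y_pred):
    """Average of absolute errors, the comparison metric named in the excerpt."""
    return float(np.mean(np.abs(y_true - y_pred)))

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    # Hypothetical stand-in for one multivariate polynomial test function.
    f = lambda X: X[:, 0] * X[:, 1] + X[:, 2] ** 2
    data = make_splits(f, n_vars=4, rng=rng)
    X_test, y_test = data["test"]
    y_pred = np.full_like(y_test, y_test.mean())       # trivial baseline predictor
    print("Test MAE of the baseline:", mean_absolute_error(y_test, y_pred))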