2019
DOI: 10.1007/s10710-019-09371-3

Parameter identification for symbolic regression using nonlinear least squares

Abstract: In this paper we analyze the effects of using nonlinear least squares for parameter identification of symbolic regression models and integrate it as a local search mechanism in tree-based genetic programming. We employ the Levenberg-Marquardt algorithm for parameter optimization and calculate gradients via automatic differentiation. We provide examples where the parameter identification succeeds and fails and highlight its computational overhead. Using an extensive suite of symbolic regression benchmark problems…
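The approach described in the abstract can be illustrated compactly. The following is a minimal sketch, not the authors' implementation: it fits the numeric parameters of one fixed model structure with SciPy's Levenberg-Marquardt solver, using SymPy differentiation as a stand-in for the automatic differentiation used in the paper. The model f(x) = t0·sin(t1·x) + t2, the synthetic data, and the starting point are invented for the example.

```python
import numpy as np
import sympy as sp
from scipy.optimize import least_squares

# Example model structure as a GP tree might produce it:
# f(x) = t0 * sin(t1 * x) + t2, with numeric parameters t0, t1, t2.
x = sp.Symbol("x")
params = sp.symbols("t0 t1 t2")
model = params[0] * sp.sin(params[1] * x) + params[2]

# Compile the model and its gradient with respect to the parameters.
f = sp.lambdify((x, *params), model, "numpy")
grads = [sp.lambdify((x, *params), sp.diff(model, p), "numpy") for p in params]

# Synthetic data generated with theta = (1.5, 2.0, 0.3) plus noise.
rng = np.random.default_rng(0)
xs = np.linspace(-3.0, 3.0, 100)
ys = 1.5 * np.sin(2.0 * xs) + 0.3 + rng.normal(scale=0.05, size=xs.size)

def residuals(theta):
    return f(xs, *theta) - ys

def jacobian(theta):
    # Column j holds the partial derivatives of the residuals w.r.t. theta_j.
    # broadcast_to handles constant derivatives, which lambdify returns as scalars.
    return np.stack(
        [np.broadcast_to(g(xs, *theta), xs.shape) for g in grads], axis=1
    )

# method="lm" selects SciPy's Levenberg-Marquardt implementation.
fit = least_squares(residuals, x0=[1.0, 1.8, 0.0], jac=jacobian, method="lm")
print(fit.x)  # close to (1.5, 2.0, 0.3) when started near the optimum
```

Levenberg-Marquardt converges to a local optimum, so a starting point far from the true coefficients can fail on this model; this mirrors the success-and-failure cases the abstract mentions.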

Cited by 75 publications (55 citation statements)
References 45 publications
“…We use a tree-based GP variant with optional memetic local optimization of SR model parameters, which has produced good results for a diverse set of regression benchmark problems (Kommenda et al., 2020).…”
Section: Genetic Programming for Shape-Constrained Symbolic Regression
confidence: 99%
“…Procedure Optimize locally improves the vector of numerical coefficients θ ∈ ℝ^dim of each model by nonlinear least-squares fitting with the Levenberg-Marquardt algorithm. It has been demonstrated that gradient-based local improvement improves symbolic regression performance (Topchy and Punch, 2001; Kommenda et al., 2020). Here we want to investigate whether it can also be used in combination with shape constraints.…”
Section: Optional Local Optimization
confidence: 99%
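To make the local-search step quoted above concrete, here is a minimal sketch, with invented helper and data names (not from either cited paper), of how coefficient optimization can be embedded in an individual's evaluation: θ is refined by Levenberg-Marquardt before fitness is computed, and the tuned values are written back into the model (a Lamarckian update).

```python
import numpy as np
import sympy as sp
from scipy.optimize import least_squares

def evaluate_with_local_search(expr, param_syms, x_sym, xs, ys, theta0):
    """Refine the coefficients theta of one GP individual by nonlinear
    least squares, write them back (Lamarckian), and return MSE fitness."""
    f = sp.lambdify((x_sym, *param_syms), expr, "numpy")
    fit = least_squares(lambda th: f(xs, *th) - ys, theta0, method="lm")
    tuned_expr = expr.subs(dict(zip(param_syms, fit.x)))
    mse = float(np.mean((f(xs, *fit.x) - ys) ** 2))
    return tuned_expr, mse

# Toy individual theta0 + theta1 * x**2, evaluated on noisy quadratic data.
x = sp.Symbol("x")
theta = sp.symbols("theta0 theta1")
rng = np.random.default_rng(1)
xs = np.linspace(-2.0, 2.0, 50)
ys = 0.5 + 3.0 * xs**2 + rng.normal(scale=0.1, size=xs.size)

tuned, mse = evaluate_with_local_search(
    theta[0] + theta[1] * x**2, theta, x, xs, ys, theta0=[0.0, 1.0]
)
print(tuned, mse)
```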
“…Recently, Kommenda et al. (2020) proposed hybridizing a Genetic Programming (GP) approach by using the Levenberg-Marquardt algorithm for parameter optimization of the regression models, with gradients calculated via automatic differentiation; this resonates with the spirit of a memetic computing approach. They used the same benchmark datasets as ours and employed different GP variants.…”
Section: Median MSE Scores of Top 4 Algorithms for 94 Datasets
confidence: 99%
“…However, it is also true that some researchers have been trying to address the need to include individual-level optimization in existing Genetic Programming approaches, e.g. Cagnoni et al. (2005); Azad & Ryan (2014); Ffrancon & Schoenauer (2015); Semenkina & Semenkin (2015); Kommenda et al. (2020). While this list is probably not comprehensive, it is recognized that introducing individual optimization steps into EA methods based on current solution representations has been a challenge for symbolic regression approaches.…”
Section: Introduction
confidence: 99%
“…GP is well suited to searching over the space of functions f ∈ F, but is widely regarded as being poor at optimizing θ. Typical GP operations such as crossover and tree mutation are usually considered unlikely to determine optimal values of θ, for which sensitivity criteria may require close to the maximum available floating-point precision for good performance. Work on parameter tuning (also known as model calibration) in GP has been comprehensively reviewed in a recent paper by Kommenda et al. [17]. Most of the previous GP parameter tuning work has been carried out on regression problems, and this too is the focus of the present paper.…”
Section: Introduction
confidence: 99%