2017
DOI: 10.1016/j.asoc.2016.09.035

Ensemble of heterogeneous flexible neural trees using multiobjective genetic programming

Abstract: Machine learning algorithms are inherently multiobjective in nature, where approximation error minimization and model complexity simplification are two conflicting objectives. We propose a multiobjective genetic programming (MOGP) approach for creating a heterogeneous flexible neural tree (HFNT), a tree-like flexible feedforward neural network model. Functional heterogeneity in the neural tree nodes was introduced to capture better insight into the data during learning, because each input in a dataset possesses different fea…
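The abstract frames learning as minimizing two conflicting objectives at once, approximation error and model complexity. A minimal Python sketch of the underlying idea, Pareto dominance over (error, complexity) pairs, is shown below; the function name and the sample scores are illustrative assumptions, not the paper's MOGP algorithm.

```python
# Illustrative sketch (not the paper's exact MOGP): Pareto dominance over
# the two conflicting objectives named in the abstract. Both objectives
# are minimized; the population scores are made-up example values.

def dominates(a, b):
    """True if objective vector `a` Pareto-dominates `b`:
    no worse in every objective, strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# Each candidate tree is scored as (approximation_error, tree_complexity).
population = [(0.12, 35), (0.15, 20), (0.12, 50), (0.30, 5)]

# Keep the non-dominated front: models for which no other model is
# better in both error and complexity at once.
pareto_front = [p for p in population
                if not any(dominates(q, p) for q in population if q != p)]
print(pareto_front)  # [(0.12, 35), (0.15, 20), (0.30, 5)]
```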

Cited by 26 publications (30 citation statements) · References 79 publications
“…The internal node set V is the set of internal (computational) nodes, and the leaf node set T is the set of inputs [17]. Hence, an HFNT^M tree G can be expressed as:…”
Section: Multiobjective Heterogeneous Flexible Neural Tree (HFNT^M)
confidence: 99%
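The quoted decomposition, internal (computational) nodes from a set V and leaves from an input set T, can be made concrete with a small sketch. The class names and the default activation are assumptions for illustration, not the authors' implementation; the per-node activation field is what makes the tree "heterogeneous".

```python
# A minimal structural sketch of a flexible neural tree G = V ∪ T:
# internal nodes carry their own weights and activation, leaves are inputs.
import math
from dataclasses import dataclass, field

@dataclass
class LeafNode:               # element of T: an input variable x_i
    index: int

@dataclass
class InternalNode:           # element of V: a computational +k node
    children: list = field(default_factory=list)
    weights: list = field(default_factory=list)
    activation: str = "tanh"  # heterogeneity: each node may differ
```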
“…The best HFNT^M model, developed using the parameter settings given in [17] and two-fold training, is shown in Fig. 1, where the leaf nodes indicate the input features and the root node gives the model's predicted UCS output.…”
Section: Models Prediction
confidence: 99%
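The statement above describes prediction flowing from leaf inputs up to a root output. Continuing the structural sketch above, a depth-first evaluation could look like the following; the weighted-sum-plus-activation node and the example weights are assumptions consistent with flexible neural trees in general, not the trained model from the cited study.

```python
# Depth-first evaluation of the sketched tree: leaves read the input
# features, the root returns the prediction (e.g., the UCS value in the
# cited study). Uses LeafNode/InternalNode from the sketch above.
import math

def evaluate(node, x):
    if isinstance(node, LeafNode):
        return x[node.index]                    # leaf = input feature
    z = sum(w * evaluate(c, x)                  # weighted child outputs
            for w, c in zip(node.weights, node.children))
    return math.tanh(z) if node.activation == "tanh" else 1 / (1 + math.exp(-z))

# Example: root computes tanh(0.5*x0 - 0.3*x1)
root = InternalNode(children=[LeafNode(0), LeafNode(1)], weights=[0.5, -0.3])
print(evaluate(root, [1.0, 2.0]))               # prediction at the root
```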
“…In fact, a paradigm called Neuroevolution accommodates adaptive learning of all or some components of an FNN in intuitive ways by applying EAs. For example, generalized acquisition of recurrent links (GNARL) [257], evolutionary programming net (EPNet) [258], neuroevolution of augmenting topologies (NEAT) [259], hypercube-based neuroevolution of augmenting topologies (HyperNEAT) [260], evolutionary acquisition of neural topologies (EANT2) [261], and the heterogeneous flexible neural tree (HFNT) [262] optimize both FNN structure and parameters (weights) using direct or indirect encoding methods. Moreover, several other paradigms and methods proposed in the past for the simultaneous optimization of FNN components are described as follows.…”
Section: Learning Algorithm Optimization
confidence: 99%
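The surveyed methods evolve both structure and weights via direct or indirect encodings. As a rough illustration of the direct-encoding idea only, the genome below stores weights one-to-one and a mutation operator perturbs them in place; the representation and rates are hypothetical and not taken from any single method cited in [257]–[262].

```python
# Hypothetical direct encoding: the genome lists the network's weights
# explicitly, so evolutionary operators act on that list directly.
import random

def mutate(genome, sigma=0.1, p=0.2):
    """Gaussian-perturb each directly encoded weight with probability p."""
    return [w + random.gauss(0.0, sigma) if random.random() < p else w
            for w in genome]

genome = [0.5, -0.3, 1.2]   # weights encoded one-to-one
child = mutate(genome)
```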