2019
DOI: 10.1080/03610918.2019.1697451
Diverse classifiers ensemble based on GMDH-type neural network algorithm for binary classification

Cited by 6 publications (4 citation statements)
References 21 publications
“…The GMDH has been successfully used for computer-based mathematical modeling of complex systems, as well as for data mining, optimization, and pattern recognition problems [72]. GMDH operates as a type of self-organizing network [73,74]. The mapping between the input and output variables in a GMDH neural network is a nonlinear function.…”
Section: Group Methods Of Data Handling-type Neural Network
confidence: 99%
“…Finally, in the convergence process of the algorithm, if the results in layer (n + 1) are better than those in layer (n), the algorithm converges. Equations (1) and (2) express the relationship between the approximate function (f̂) of the multi-input, single-output (ŷ) dataset and the least possible error between actual and predicted values [73,74].…”
Section: Group Methods Of Data Handling-type Neural Network
confidence: 99%
“…Finally, in the convergence process of the algorithm, if the results in layer (n+1) are better than those in layer (n), the algorithm converges. Eqs. (1) and (2) express the relationship between the approximate function (f̂) of the multi-input, single-output (ŷ) dataset and the least possible error between actual and predicted values [70,71].…”
Section: Group Methods Of Data Handling-type Neural Network
confidence: 99%
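The layer-wise scheme these statements describe can be sketched in code. The following is a minimal illustrative GMDH-style loop, not the algorithm of the cited paper: the function name `gmdh_fit`, the layer width, and the choice of a quadratic two-input partial description are assumptions based on the standard GMDH literature. Each candidate neuron is fitted by least squares on training data and scored on a separate validation set (the external criterion); training stops when an added layer no longer improves on the previous one.

```python
# Hedged sketch of a GMDH-style layer loop (illustrative, not the paper's
# exact algorithm). Each candidate neuron is a quadratic polynomial of two
# inputs fitted by least squares; the loop stops when a new layer fails to
# improve the best validation error of the previous layer.
import itertools
import numpy as np

def _design(xi, xj):
    # Quadratic partial description: [1, xi, xj, xi^2, xj^2, xi*xj]
    return np.column_stack([np.ones_like(xi), xi, xj, xi**2, xj**2, xi*xj])

def gmdh_fit(X_tr, y_tr, X_va, y_va, width=4, max_layers=5):
    """Return the best external-criterion (validation MSE) reached."""
    best_err = np.inf
    for _ in range(max_layers):
        candidates = []
        for i, j in itertools.combinations(range(X_tr.shape[1]), 2):
            A = _design(X_tr[:, i], X_tr[:, j])
            coef, *_ = np.linalg.lstsq(A, y_tr, rcond=None)
            pred_va = _design(X_va[:, i], X_va[:, j]) @ coef
            err = np.mean((pred_va - y_va) ** 2)  # external criterion
            candidates.append((err, i, j, coef))
        candidates.sort(key=lambda c: c[0])
        layer_err = candidates[0][0]
        if layer_err >= best_err:   # layer n+1 no better than layer n: stop
            break
        best_err = layer_err
        keep = candidates[:width]
        # Outputs of the surviving neurons become the next layer's inputs.
        X_tr = np.column_stack([_design(X_tr[:, i], X_tr[:, j]) @ c
                                for _, i, j, c in keep])
        X_va = np.column_stack([_design(X_va[:, i], X_va[:, j]) @ c
                                for _, i, j, c in keep])
    return best_err
```

Because the target enters each neuron only through the least-squares fit, the validation set acts as the self-organizing stopping criterion: the network grows layer by layer until added depth stops paying off.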
“…The models are built directly from the dataset and are self-regulating. Equation (7) gives the general form of the basic GMDH neural network map in terms of the input and output data [45]:…”
Section: Group Methods Of Data Handling (GMDH)
confidence: 99%
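The cited paper's Equation (7) is not reproduced in this excerpt. In the standard GMDH literature, the general input–output map referred to here is usually written as the Kolmogorov–Gabor polynomial; the form below is that standard expression, not necessarily the cited paper's exact equation:

$$
\hat{y} = a_0 + \sum_{i=1}^{m} a_i x_i + \sum_{i=1}^{m}\sum_{j=i}^{m} a_{ij}\, x_i x_j + \sum_{i=1}^{m}\sum_{j=i}^{m}\sum_{k=j}^{m} a_{ijk}\, x_i x_j x_k + \cdots
$$

In practice, GMDH approximates this high-order polynomial by composing low-order (typically quadratic, two-input) partial descriptions layer by layer rather than fitting all coefficients at once.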