2012
DOI: 10.1016/j.neucom.2012.01.024

An evolutionary constructive and pruning algorithm for artificial neural networks and its prediction applications

Cited by 61 publications (31 citation statements)
References 39 publications
“…An ANN can be trained using available cases with inputs and expected outputs. Owing to the learning ability and nonlinearity of ANNs [1], they have been used extensively in many data-mining applications such as classification and association rule mining; an algorithm for ANN training was proposed in [9]. However, most of these strategies require the chromosome length to be predefined.…”
Section: Introduction
confidence: 99%
“…In the literature, a small number of methods [1,26,29] have been applied to optimize both the weights and the structure of a neural network simultaneously. The significance of research in this area lies in finding a superior solution for the neural network model.…”
Section: Introduction
confidence: 99%
“…Many researchers have recently focused their efforts on new ways to optimize the architecture of artificial neural networks, including pruning methods [21,22], constructive methods [23] and evolutionary algorithms [24-26]. Constructive algorithms start training with a small network and incrementally add hidden nodes during training when the network cannot reduce the training error.…”
Section: Introduction
confidence: 99%
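The excerpt above describes constructive algorithms in general terms: begin with a small network and grow it while growth still reduces the training error. A minimal sketch of that grow-while-error-improves idea is given below. It is illustrative only, not the paper's evolutionary algorithm; the toy regression problem, the plain gradient-descent trainer, and the 5% improvement threshold are all assumptions chosen for brevity.

```python
# Minimal sketch of a constructive training loop (illustrative only;
# the cited paper's actual algorithm is evolutionary and differs in detail).
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression problem (assumed for illustration).
X = np.linspace(-1, 1, 64).reshape(-1, 1)
y = np.sin(3 * X)

def train_error(n_hidden, steps=2000, lr=0.05):
    """Train a one-hidden-layer tanh network with plain gradient
    descent and return its final mean-squared training error."""
    W1 = rng.normal(0, 0.5, (1, n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(0, 0.5, (n_hidden, 1))
    b2 = np.zeros(1)
    for _ in range(steps):
        H = np.tanh(X @ W1 + b1)      # hidden activations
        out = H @ W2 + b2             # network output
        err = out - y
        # Backpropagate the squared-error gradient.
        gW2 = H.T @ err / len(X)
        gb2 = err.mean(0)
        gH = err @ W2.T * (1 - H ** 2)
        gW1 = X.T @ gH / len(X)
        gb1 = gH.mean(0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return float(np.mean((out - y) ** 2))

# Constructive loop: start with one hidden node and add nodes while
# each addition still reduces the error noticeably, then stop.
n_hidden, mse = 1, train_error(1)
while n_hidden < 10:
    new_mse = train_error(n_hidden + 1)
    if new_mse > 0.95 * mse:  # no meaningful improvement -> stop growing
        break
    n_hidden, mse = n_hidden + 1, new_mse

print(n_hidden, round(mse, 4))
```

Pruning methods mentioned in the same excerpt work in the opposite direction: they start from an oversized network and remove nodes or weights that contribute little, using an analogous stopping test.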
“…In many cases one can achieve much higher performance while the system complexity is only slightly increased (Tallon-Ballesteros and Hervas-Martinez, 2011). The most popular hybrid systems include evolutionary-neural (Font et al., 2010; Chandra et al., 2011; Su et al., 2011; Tong and Schierz, 2011; Yang and Chen, 2012; Zhang et al., 2011), evolutionary-fuzzy (Cheng et al., 2010; Lin and Chen, 2011; Antonelli et al., 2009; Cheshmehgaz et al., 2012; Aydogan et al., 2012) and neuro-fuzzy systems (Shahlaei et al., 2012; Czogała and Łęski, 2000; Tadeusiewicz, 2010b; Tadeusiewicz and Morajda, 2012).…”
Section: Introduction
confidence: 99%