2003
DOI: 10.1109/tnn.2003.813832
A constructive algorithm for training cooperative neural network ensembles

Abstract: This paper presents a constructive algorithm for training cooperative neural-network ensembles (CNNEs). CNNE combines ensemble architecture design with cooperative training of the individual neural networks (NNs) in an ensemble. Unlike most previous studies on training ensembles, CNNE puts emphasis on both accuracy and diversity among the individual NNs in an ensemble. In order to maintain accuracy, the number of hidden nodes in each individual NN is also determined by a constructive approach…
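The constructive determination of hidden nodes mentioned in the abstract can be pictured as a generic grow-until-stall loop. The sketch below is only an illustration of that idea, not the paper's actual procedure: it uses a random-feature ridge model as a stand-in for a one-hidden-layer NN, and a simple "stop when validation error stops improving" criterion.

```python
import numpy as np

def train(width, X, y, rng):
    """Illustrative 'network': random tanh features + ridge readout.
    Capacity grows with `width` (stands in for hidden-node count)."""
    W = rng.normal(size=(X.shape[1], width))
    H = np.tanh(X @ W)                                   # hidden activations
    beta = np.linalg.solve(H.T @ H + 1e-3 * np.eye(width), H.T @ y)
    return lambda Z: np.tanh(Z @ W) @ beta

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]

# generic constructive loop: double the hidden width while the
# held-out error keeps improving, then keep the best model
width, best_err, model = 1, np.inf, None
while width <= 64:
    cand = train(width, X[:200], y[:200], rng)
    err = np.mean((cand(X[200:]) - y[200:]) ** 2)
    if err >= best_err * 0.99:        # improvement stalled: stop growing
        break
    best_err, model, width = err, cand, width * 2
```

The stall threshold (0.99) and the doubling schedule are arbitrary choices for the sketch; CNNE's actual node-addition criteria are defined in the paper.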

Cited by 262 publications (127 citation statements)
References 51 publications (95 reference statements)
“…A cooperative neural network, called the cross-coupled Hopfield network, for associative memory was developed by Ozawa (1998). A cooperative neural network ensemble algorithm was developed by Islam et al (2003). These cooperative modular neural networks are mainly applied to classification, pattern recognition, and associative memory.…”
Section: Cooperative Modular Neural Network
Citation type: mentioning; confidence: 99%
“…Different from neural network ensemble and modular neural network approaches, cooperative modular neural networks can automatically decompose a problem and adaptively combine individual neural network models, so that a globally optimal solution of the original problem can be obtained. Reported results show that cooperative modular neural networks are well suited to classification and pattern recognition (Auda and Kamel 1997a, b, 1998a, b, 1999; Zhang 2000; Lu and Ito 1999; Yang and Browne 2001; Oh and Suen 2002; Melin et al 2005; Fogelman-Soulie 1993; Hodge et al 1999; Kamel 1999; Alexandre et al 2001; Ozawa 1998; Islam et al 2003). In particular, in the recent decade, cooperative recurrent modular neural networks, a special class of cooperative modular neural networks for constrained optimization, have been developed and well studied (Rodríguez-Vázquez et al 1990; Glazos et al 1998; Zhang and Constantinides 1992; He and Sun 2001; Tao and Fang 2000; Xia and Wang 1995a, b, 2001a, b, 2005; Xia 1996a, b, 1997, 2003, 2004; Xia et al 2002a, b, 2004a, b, 2005, 2007; Wang et al 2000; Tan et al 2000; Anguita and Boni 2002; Zhang et al 2003; Feng 2004, 2006; Kamel 2007a, b, c, d, 2008; Tao et al 2001; Leung et al 2001).…”
Section: Introduction
Citation type: mentioning; confidence: 99%
“…There are many different types of ensemble learning, i.e., independent training, sequential training, and simultaneous training methods [20]. In this paper, however, we focus on ensemble averaging methods as an independent training method and on the Negative Correlation (NC) learning algorithm as a simultaneous training method.…”
Section: Cross Entropy Based Ensemble Backpropagation Neural Network
Citation type: mentioning; confidence: 99%
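The NC learning algorithm named in this excerpt couples the members' errors during simultaneous training by penalizing agreement with the ensemble mean. A minimal numpy sketch of the per-member NC loss (Liu and Yao's formulation; the λ value and all names are illustrative, not taken from the cited paper):

```python
import numpy as np

def nc_loss(outputs, y, lam=0.5):
    """Negative-correlation loss per ensemble member.

    outputs: (M, N) predictions of M networks on N samples
    y:       (N,) targets
    Returns an (M,) vector: each member's mean squared error plus the
    NC penalty that rewards disagreement with the ensemble mean.
    """
    fbar = outputs.mean(axis=0)        # ensemble output (simple average)
    err = (outputs - y) ** 2           # accuracy term per member
    # penalty p_i = (f_i - fbar) * sum_{j != i} (f_j - fbar)
    #             = -(f_i - fbar)^2, since the deviations sum to zero
    pen = -(outputs - fbar) ** 2
    return (err + lam * pen).mean(axis=1)

# toy check: two members whose errors cancel in the mean (diverse)
# versus two identical members (redundant), same individual error
y = np.zeros(4)
diverse = np.vstack([np.full(4, 0.2), np.full(4, -0.2)])
redundant = np.vstack([np.full(4, 0.2), np.full(4, 0.2)])
print(nc_loss(diverse, y).sum() < nc_loss(redundant, y).sum())  # True
```

With λ = 0 this reduces to independent MSE training; larger λ trades individual accuracy for ensemble diversity.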
“…However, some regression problems cannot be adequately predicted with a single ANN because of the complexity of the problem and the large volume of data [2]. The motivation for fusing different ANNs is therefore the potential for obtaining more accurate predictions than those obtainable with a single ANN [3].…”
Section: Introduction
Citation type: mentioning; confidence: 99%
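The fusion-by-averaging idea in this excerpt can be sketched with simple least-squares models standing in for independently trained ANNs; the data, member count, and bootstrap resampling below are all illustrative choices, not details from the cited works.

```python
import numpy as np

rng = np.random.default_rng(0)

# synthetic regression task (illustrative)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=200)

def fit_member(X, y):
    """Stand-in for one independently trained ANN: a least-squares fit."""
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

# train M members independently, each on a bootstrap resample
M = 5
weights = []
for _ in range(M):
    idx = rng.integers(0, len(X), len(X))
    weights.append(fit_member(X[idx], y[idx]))

# fusion by ensemble averaging: mean of the members' predictions
member_preds = np.stack([X @ w for w in weights])
ensemble_pred = member_preds.mean(axis=0)
ensemble_mse = float(np.mean((ensemble_pred - y) ** 2))
```

By convexity of squared error, the averaged prediction's MSE can never exceed the mean of the members' MSEs, which is the basic argument for fusion over a single model.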