2012
DOI: 10.1007/978-3-642-27443-5_88

Artificial Neural Network Training Using Differential Evolutionary Algorithm for Classification

Abstract: In this work, we propose a method of artificial neural network (ANN) learning using the differential evolution (DE) algorithm. DE with global and local neighborhood-based mutation (DEGL) is used to search for the synaptic weight coefficients of the neural network and to minimize the learning error on the error surface. DEGL is a variant of DE in which global and local neighborhood-based mutation operators are combined to create the donor vector. The proposed method is applied to the classification of r…
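For intuition, here is a minimal sketch of the general approach the abstract describes: a flat parameter vector is decoded into the weights and biases of a small feedforward network, and a DE loop searches that vector to minimize classification error. The sketch uses the classic DE/rand/1/bin scheme rather than the paper's DEGL mutation, and the network shape, parameter ranges, and DE settings (pop_size, F, CR, gens) are illustrative assumptions, not values from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative network shape: 4 inputs, 5 hidden units, 3 classes.
    N_IN, N_HID, N_OUT = 4, 5, 3
    DIM = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # weights + biases

    def forward(w, X):
        """Decode the flat vector w into a one-hidden-layer net and run it."""
        i = 0
        W1 = w[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
        b1 = w[i:i + N_HID]; i += N_HID
        W2 = w[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
        b2 = w[i:]
        h = np.tanh(X @ W1 + b1)
        return h @ W2 + b2  # raw scores; argmax gives the predicted class

    def error(w, X, y):
        """Learning error used as the DE fitness: misclassification rate."""
        return np.mean(np.argmax(forward(w, X), axis=1) != y)

    def de_train(X, y, pop_size=30, F=0.5, CR=0.9, gens=200):
        """Classic DE/rand/1/bin over the weight space (not the paper's DEGL)."""
        pop = rng.uniform(-1.0, 1.0, (pop_size, DIM))
        fit = np.array([error(p, X, y) for p in pop])
        for _ in range(gens):
            for i in range(pop_size):
                others = [j for j in range(pop_size) if j != i]
                r1, r2, r3 = rng.choice(others, 3, replace=False)
                donor = pop[r1] + F * (pop[r2] - pop[r3])   # mutation
                mask = rng.random(DIM) < CR
                mask[rng.integers(DIM)] = True              # binomial crossover
                trial = np.where(mask, donor, pop[i])
                f = error(trial, X, y)
                if f <= fit[i]:                             # greedy selection
                    pop[i], fit[i] = trial, f
        return pop[np.argmin(fit)], fit.min()

Swapping in DEGL would replace the single donor line with a weighted combination of a global mutation (toward the population's best vector) and a local mutation (toward the best vector in a ring neighborhood), as sketched after the citation excerpts below.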

Cited by 14 publications (4 citation statements); References 8 publications.

Citation statements:
“…The SDE gives better results than training the ANN with a GA or with standard DE on four UCI datasets. Si et al. [25] presented a DEGL approach that combines global and local mutation steps to create a candidate vector, using mutation-probability adaptation to balance the search. They tested the method on four benchmark functions and seven classification problems.…”
Section: Training Neural Network Using Differential Evolution
confidence: 99%
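For reference, the usual DEGL donor construction (following Das et al.'s neighborhood-based DE formulation, which the cited paper builds on) combines a local model, computed over a ring neighborhood of each population index, with a global model computed over the whole population. The notation below is a sketch of that standard formulation and may differ in detail from the paper's version:

    % Usual DEGL donor construction; alpha and beta are scale factors,
    % and w balances global exploration against local exploitation.
    \begin{align*}
      L_i &= X_i + \alpha\,(X_{n\text{-best}_i} - X_i) + \beta\,(X_p - X_q)
          && \text{local model (ring neighborhood)} \\
      G_i &= X_i + \alpha\,(X_{g\text{-best}} - X_i) + \beta\,(X_{r_1} - X_{r_2})
          && \text{global model (whole population)} \\
      V_i &= w\,G_i + (1 - w)\,L_i, \quad w \in [0, 1]
          && \text{combined donor vector}
    \end{align*}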
“…The difficulty is associated mainly with the non-linearity of the problem that the network is meant to solve, the lack of knowledge regarding the best set of weights and biases, and the dependency of the performance of the training algorithm on the architectural aspects of the network (specifically, the topology and activation functions of the neurons). Hence, given their utility as alternatives to backpropagation and its variants [21], heuristic search algorithms were used to optimize ANNs [26,18,3,28].…”
Section: Prior Related Work
confidence: 99%
“…At the same time, Mirjalili et al. [23] introduced a combination of PSO and GSA to optimize the FNN, adding an acceleration coefficient to GSA to improve the performance of the overall algorithm. Si et al. [24] developed a differential evolution (DE)-based method to search for optimal values of the ANN's synaptic weights. Shaw and Kinsner [25] presented simulated annealing (SA) with a chaotic strategy to train the FNN, mitigating the possibility of getting stuck in local optima.…”
Section: Introduction
confidence: 99%