1999
DOI: 10.1016/s0305-0483(99)00027-4

Comparing backpropagation with a genetic algorithm for neural network training

Cited by 191 publications (77 citation statements)
References 10 publications
“…The training phase may consist of several epochs. A popular approach used in the training phase is back-propagation learning; however, researchers have pointed out that some commonly used learning algorithms have disadvantages [10,11]. Several heuristic algorithms, including the genetic algorithm (GA) [12], particle swarm optimization (PSO) [13], and ant colony optimization (ACO) [14], have been proposed for training neural networks.…”
Section: Introduction
confidence: 99%
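To make the GA-versus-backpropagation contrast in the statement above concrete, here is a minimal sketch of training a small MLP's weights with a genetic algorithm: the weight vector is the chromosome, and negated mean-squared error serves as fitness. The network size, the XOR toy task, and all GA parameters are illustrative assumptions, not taken from the paper or the citing works.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 2, 4, 1
n_w = n_in * n_hid + n_hid * n_out  # biases omitted for brevity

def forward(w, X):
    # Decode a flat chromosome into the two weight matrices of a small MLP.
    W1 = w[:n_in * n_hid].reshape(n_in, n_hid)
    W2 = w[n_in * n_hid:].reshape(n_hid, n_out)
    return np.tanh(X @ W1) @ W2

def fitness(w, X, y):
    return -np.mean((forward(w, X) - y) ** 2)  # negated MSE: higher is fitter

# Toy XOR task, standing in for a real training set.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

pop = rng.normal(scale=0.5, size=(50, n_w))  # population of weight vectors
for gen in range(300):
    scores = np.array([fitness(w, X, y) for w in pop])
    parents = pop[np.argsort(scores)[-25:]]  # truncation selection
    children = []
    while len(children) < 25:
        a, b = parents[rng.integers(25, size=2)]
        child = np.where(rng.random(n_w) < 0.5, a, b)  # uniform crossover
        child = child + (rng.random(n_w) < 0.1) * rng.normal(scale=0.1, size=n_w)  # sparse mutation
        children.append(child)
    pop = np.vstack([parents, np.array(children)])

best = pop[np.argmax([fitness(w, X, y) for w in pop])]
print(np.round(forward(best, X), 2))  # should approach [[0], [1], [1], [0]]
```

Note that the GA never computes a gradient: it only evaluates the network, which is what lets it sidestep the differentiability requirement discussed in the next statement.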
“…However, one limitation of this technique, which is a gradient-descent technique, is that it requires a differentiable neuron transfer function. Also, as neural networks generate complex error surfaces with multiple local minima, BP tends to converge to local minima instead of the global minimum [4].…”
Section: Introduction
confidence: 99%
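The dependence on differentiability that this statement describes is visible in a bare-bones back-propagation update: the chain rule explicitly uses the derivative of the transfer function (here tanh, whose derivative is 1 - tanh²). This is a minimal sketch under assumed shapes, data, and learning rate, not the cited authors' setup.

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=0.5, size=(2, 4))
W2 = rng.normal(scale=0.5, size=(4, 1))
lr = 0.5  # illustrative learning rate

for step in range(5000):
    h = np.tanh(X @ W1)   # hidden activations (must be differentiable)
    out = h @ W2          # linear output layer
    err = out - y         # dE/d(out) for mean-squared error
    # Back-propagate: each factor below is one derivative in the chain rule.
    grad_W2 = h.T @ err / len(X)
    grad_h = err @ W2.T
    grad_W1 = X.T @ (grad_h * (1 - h ** 2)) / len(X)  # tanh'(z) = 1 - tanh(z)^2
    W2 -= lr * grad_W2    # gradient-descent step: can settle in a local minimum
    W1 -= lr * grad_W1
```

Because every update follows the local downhill direction, a poor starting point on a multimodal error surface can trap this procedure, which is the failure mode the GA-based trainers above aim to avoid.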
“…Sankar K. Pal and Dinabandhu Bhandari [1] used a binary-coded GA to select optimal weights in an MLP. Gupta and Sexton [6] and Huang and Chang [12] proposed genetic algorithms that significantly outperform backpropagation in training the MLP. In this paper, we have proposed a real-coded GA for training the weights of a multilayer perceptron.…”
Section: Introduction
confidence: 99%
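The distinction this statement draws between binary-coded and real-coded GAs comes down to the operators. In a real-coded GA the weights stay as floats, so crossover and mutation act on the real vector directly rather than on a bit string. The operator choices below (arithmetic crossover, Gaussian mutation) are common defaults for real-coded GAs, assumed for illustration rather than taken from the cited design.

```python
import numpy as np

rng = np.random.default_rng(2)

def arithmetic_crossover(a, b):
    """Blend two parent weight vectors with a random mixing coefficient."""
    alpha = rng.random()
    return alpha * a + (1 - alpha) * b

def gaussian_mutation(w, rate=0.1, scale=0.05):
    """Perturb a random fraction of the genes with small Gaussian noise."""
    mask = rng.random(w.shape) < rate
    return w + mask * rng.normal(scale=scale, size=w.shape)

parent1 = rng.normal(size=10)  # a real-valued MLP weight vector
parent2 = rng.normal(size=10)
child = gaussian_mutation(arithmetic_crossover(parent1, parent2))
```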