1998
DOI: 10.1016/s0167-9236(97)00040-7
Toward global optimization of neural networks: A comparison of the genetic algorithm and backpropagation

Cited by 221 publications (96 citation statements)
References 15 publications
“…On the other hand, a matrix-based representation of weights has been used, for which column-wise and row-wise crossover operators were also defined [152]. GA-based real-coded weight optimization outperforms BP and its variants on real-world applications [153][154][155][156]. Moreover, an evolution-inspired algorithm called differential evolution (DE) [105,157], which applies mutation and crossover operators to complex continuous optimization problems, was found to perform efficiently for real-valued weight vector optimization [158][159][160].…”
Section: Weight Optimization (mentioning)
confidence: 99%
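The statement above describes genetic-algorithm optimization of real-coded network weights, with crossover defined over the weight matrix itself. The sketch below shows one way such a scheme can look: a small population of weight matrices for a toy two-input network, evolved with a row-wise crossover and Gaussian mutation against a mean-squared-error fitness. The network size, population settings, and helper names (unpack, fitness, row_crossover) are illustrative assumptions, not taken from the cited works.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a 2-input XOR-like problem (illustrative only).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

N_HIDDEN = 4          # hidden units (assumed)
POP, GENS = 30, 200   # population size and generations (assumed)

def unpack(genome):
    """Split a flat real-coded genome into the two weight matrices."""
    w1 = genome[: 2 * N_HIDDEN].reshape(2, N_HIDDEN)
    w2 = genome[2 * N_HIDDEN:].reshape(N_HIDDEN, 1)
    return w1, w2

def fitness(genome):
    """Negative mean squared error of the network defined by the genome."""
    w1, w2 = unpack(genome)
    hidden = np.tanh(X @ w1)
    out = (hidden @ w2).ravel()
    return -np.mean((out - y) ** 2)

def row_crossover(a, b):
    """Row-wise crossover: each row of the first weight matrix is taken
    whole from one parent or the other (column-wise would be analogous)."""
    w1a, w2a = unpack(a)
    w1b, w2b = unpack(b)
    mask = rng.random(w1a.shape[0]) < 0.5
    w1 = np.where(mask[:, None], w1a, w1b)
    w2 = w2a if rng.random() < 0.5 else w2b
    return np.concatenate([w1.ravel(), w2.ravel()])

pop = rng.normal(0, 1, size=(POP, 2 * N_HIDDEN + N_HIDDEN))
for _ in range(GENS):
    scores = np.array([fitness(g) for g in pop])
    parents = pop[np.argsort(scores)[-POP // 2:]]       # truncation selection
    children = [row_crossover(parents[rng.integers(len(parents))],
                              parents[rng.integers(len(parents))])
                + rng.normal(0, 0.1, parents.shape[1])  # Gaussian mutation
                for _ in range(POP)]
    pop = np.array(children)

best = pop[np.argmax([fitness(g) for g in pop])]
print("best MSE:", -fitness(best))
```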
“…Unlike backpropagation, which relies on a gradient method, a GA can avoid local-minimum traps while performing a global search for the best set of connection weights. The literature has reported on the superiority of network weights selected by a GA (Whitley et al., 1990; Sexton et al., 1998).…”
Section: Genetic Algorithms in ANN Optimization (mentioning)
confidence: 99%
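To illustrate the contrast the citing authors draw between gradient-based backpropagation and a GA's global search, the toy example below runs plain gradient descent and a minimal selection-plus-mutation GA on the same one-dimensional multimodal "error surface". The surface, step sizes, and population settings are invented for the illustration; a real comparison would use the network's actual error over its connection weights.

```python
import numpy as np

rng = np.random.default_rng(1)

# A toy 1-D "error surface" with a deep global minimum near x = -1.6 and a
# higher local minimum near x = 1.5 (purely illustrative, not a network loss).
def error(x):
    return 0.1 * (x + 2) ** 2 + 1.5 * np.cos(2 * x) + 1.5

def grad(x, h=1e-5):
    return (error(x + h) - error(x - h)) / (2 * h)

# Plain gradient descent from an unlucky starting point: it slides into the
# nearest valley and stays there.
x = 3.0
for _ in range(500):
    x -= 0.05 * grad(x)
print("gradient descent ends at x=%.2f, error=%.3f" % (x, error(x)))

# A minimal GA over the same variable: selection plus Gaussian mutation keeps
# sampling the whole interval, so it is not tied to a single basin.
pop = rng.uniform(-6, 6, size=40)
for _ in range(100):
    fit = -error(pop)
    parents = pop[np.argsort(fit)[-20:]]
    pop = rng.choice(parents, size=40) + rng.normal(0, 0.3, size=40)
best = pop[np.argmin(error(pop))]
print("GA best x=%.2f, error=%.3f" % (best, error(best)))
```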
“…Mizuno et al. (1998) apply an ANN to the Tokyo stock exchange to predict buying and selling signals with an overall prediction rate of 63%. Sexton et al. (1998) concluded that the use of momentum and starting the learning from random points may resolve problems that can occur during training. Phua et al. (2000) applied neural networks with genetic algorithms to the Singapore stock exchange and predicted the market direction with an accuracy of 81%.…”
Section: Literature Review (mentioning)
confidence: 99%
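Sexton et al.'s point about momentum and random starting points can be sketched in a few lines: the update below adds a classical momentum term to gradient descent and repeats training from several random initializations, keeping the best run. The toy single-neuron model, learning rate, and momentum coefficient are assumptions made for the sketch, not the settings used in the cited study.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy regression data generated by a single tanh neuron (illustrative only).
X = rng.normal(size=(50, 3))
y = np.tanh(X @ np.array([1.0, -2.0, 0.5]))

def mse_and_grad(w):
    """MSE of a single tanh neuron and its gradient (a stand-in for one
    backprop step on a real network)."""
    pred = np.tanh(X @ w)
    err = pred - y
    grad = 2 * X.T @ (err * (1 - pred ** 2)) / len(y)
    return np.mean(err ** 2), grad

def train(w0, lr=0.05, beta=0.9, steps=300):
    """Gradient descent with a classical momentum term."""
    w, v = w0.copy(), np.zeros_like(w0)
    for _ in range(steps):
        _, g = mse_and_grad(w)
        v = beta * v - lr * g      # momentum accumulates past gradients
        w = w + v
    return mse_and_grad(w)[0], w

# Random restarts: train from several random starting points and keep the
# best result, rather than trusting a single initialization.
runs = [train(rng.normal(scale=2.0, size=3)) for _ in range(10)]
best_loss, best_w = min(runs, key=lambda r: r[0])
print("best loss over 10 restarts:", round(best_loss, 4))
```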