G-Prop: Global optimization of multilayer perceptrons using GAs (2000)
DOI: 10.1016/s0925-2312(00)00302-7

Cited by 134 publications (67 citation statements)
References 24 publications
“…After only 5 iterations, and in less than 1 s, the number of misclassified cases dropped to 2, which corresponds to an error of only 3.3%. Table 4 compares these results with those obtained by other authors [6]. We can see that H-LVQ outperforms other methods, including simulated annealing.…”
Section: Application To DNA Helicases
confidence: 59%
“…This paper continues the research on the evolutionary optimization of multilayer perceptrons (MLPs) (the G-Prop method) presented in [3,5], comparing several hybrid systems for optimizing MLPs. The G-Prop method leverages the capabilities of two classes of algorithms: the ability of the EA to find a solution close to the global optimum, and the ability of the back-propagation algorithm (BP) to tune that solution and reach the nearest local minimum by means of local search starting from the solution found by the EA.…”
Section: Introduction
confidence: 71%
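The hybrid scheme this statement describes (an EA exploring MLP configurations globally, with back-propagation acting as a local tuner inside the fitness evaluation) can be illustrated with a short sketch. The code below is not the authors' G-Prop implementation: the XOR data set, the network and population sizes, and the Gaussian mutation operator are illustrative assumptions.

# A minimal sketch, assuming a toy XOR task: a GA searches over MLP
# hidden-layer sizes and initial weights, while a few epochs of
# back-propagation act as the local tuner inside the fitness call.
import numpy as np

rng = np.random.default_rng(0)

# Toy classification problem (XOR), used only to make the sketch runnable.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def make_individual(hidden):
    # An individual encodes an MLP: its hidden-layer size and initial weights.
    return {
        "W1": rng.normal(scale=1.0, size=(2, hidden)),
        "b1": np.zeros(hidden),
        "W2": rng.normal(scale=1.0, size=hidden),
        "b2": 0.0,
    }

def backprop(ind, epochs=200, lr=0.5):
    # Local search: a few epochs of plain gradient descent (BP) on the MLP.
    W1, b1, W2, b2 = ind["W1"], ind["b1"], ind["W2"], ind["b2"]
    for _ in range(epochs):
        h = sigmoid(X @ W1 + b1)            # hidden activations
        out = sigmoid(h @ W2 + b2)          # network output
        d_out = (out - y) * out * (1 - out) # output-layer delta
        d_h = np.outer(d_out, W2) * h * (1 - h)
        W2 -= lr * h.T @ d_out
        b2 -= lr * d_out.sum()
        W1 -= lr * X.T @ d_h
        b1 -= lr * d_h.sum(axis=0)
    ind.update(W1=W1, b1=b1, W2=W2, b2=b2)
    return ind

def fitness(ind):
    # Fitness = classification accuracy after BP tuning (higher is better).
    tuned = backprop(ind)
    h = sigmoid(X @ tuned["W1"] + tuned["b1"])
    preds = sigmoid(h @ tuned["W2"] + tuned["b2"]) > 0.5
    return np.mean(preds == y)

def mutate(ind, sigma=0.3):
    # Gaussian perturbation of the weights (one illustrative variation operator).
    return {k: (v + rng.normal(scale=sigma, size=np.shape(v))
                if np.ndim(v) else v + rng.normal(scale=sigma))
            for k, v in ind.items()}

# Elitist GA loop: the best individual always survives unchanged.
population = [make_individual(hidden=rng.integers(2, 6)) for _ in range(8)]
for generation in range(10):
    scored = sorted(population, key=fitness, reverse=True)
    elite = scored[0]
    population = [elite] + [mutate(p) for p in scored[:-1]]

print("best accuracy:", fitness(elite))

Training inside the fitness function means tuned weights are kept by the individual between generations; this is one possible design choice for a sketch like this, not necessarily the one made in the paper.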
“…We can see how the classification ability improves and the simulation time grows as the number of generations is increased. This model was presented and used in previous papers to solve pattern classification problems [3], obtaining better results than other methods.…”
Section: Obtained Results
confidence: 99%
“…The complete description of the method and the results obtained using classification problems have been presented elsewhere [6], [7], [8], [9]. The designed method uses an elitist algorithm [29].…”
Section: The Methods
confidence: 99%
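As a small illustration of the elitist step mentioned in this statement, the sketch below preserves the current best individual unchanged in the next generation. The function names and the offspring operator are assumed for illustration, not taken from the paper.

import copy

def next_generation(population, fitness_fn, offspring_fn):
    # Elitism: the best individual survives verbatim, so the best fitness
    # found so far can never decrease between generations.
    best = max(population, key=fitness_fn)
    children = [offspring_fn(ind) for ind in population if ind is not best]
    return [copy.deepcopy(best)] + children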
“…We propose to focus the effort on ANN optimization using G-Prop [6], [7], [8], [9], an evolutionary method for the design and optimization of neural networks.…”
Section: Introduction
confidence: 99%