2003
DOI: 10.1007/3-540-45105-6_89
A Generalized Feedforward Neural Network Architecture and Its Training Using Two Stochastic Search Methods

Cited by 4 publications (3 citation statements)
References 2 publications
“…GFNs are a generalization of Multi-Layer Perceptrons (MLPs) in which connections can jump over one or more layers [63,64]. In theory, an MLP can solve any problem that a GFN can, but in practice GFNs are much more efficient, requiring fewer training epochs [65]. There are no specific rules for defining the control parameters; they are mostly based on the results of previous studies and experts' opinions [66,67].…”
Section: The Model Using ANN (mentioning)
Confidence: 99%
“…The training-to-test dataset ratio is 3:1. GFNNs are a generalization of the conventional multilayer perceptron such that connections can jump over one or more layers (Arulampalam and Bouzerdoum, 2003; Bouzerdoum and Mueller, 2003). Theoretical and mathematical details are available in Rojas (1996).…”
Section: Development Of Artificial Neural Network (ANN) (mentioning)
Confidence: 99%
“…Generalized feedforward neural networks (GFFNN) are a generalization of the MLP such that connections can jump over one or more layers (Arulampalam and Bouzerdoum, 2003; Bouzerdoum and Mueller, 2003).…”
Section: Brief Review Of The ANN Models Under Study (mentioning)
Confidence: 99%
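The recurring idea in these citation statements — a feedforward network whose connections may jump over one or more layers — can be sketched in a few lines. The class below is an illustrative assumption, not the architecture from the cited paper: each layer simply receives the concatenation of all earlier activations (including the raw input), which is one common way to realize layer-skipping connections. Layer sizes, tanh activation, and initialization are placeholders.

```python
import numpy as np

class GFNNSketch:
    """Minimal generalized-feedforward sketch: every layer sees the
    concatenated outputs of ALL preceding layers plus the raw input,
    so connections effectively jump over intermediate layers."""

    def __init__(self, sizes, rng=None):
        # sizes = [n_input, n_hidden_1, ..., n_output]
        rng = np.random.default_rng(0) if rng is None else rng
        self.weights = []
        fan_in = sizes[0]
        for n_out in sizes[1:]:
            # this layer's weight matrix spans every earlier activation
            self.weights.append(rng.normal(0.0, 0.1, (fan_in, n_out)))
            fan_in += n_out  # later layers will also see this layer

    def forward(self, x):
        acts = [x]  # running list of all activations, starting with input
        for W in self.weights:
            z = np.concatenate(acts, axis=-1) @ W
            acts.append(np.tanh(z))
        return acts[-1]  # output-layer activation

net = GFNNSketch([3, 5, 2])
y = net.forward(np.ones(3))
print(y.shape)  # (2,)
```

An ordinary MLP is the special case where each weight matrix is masked so a layer sees only its immediate predecessor; the extra skip paths are what the cited statements credit with faster training.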