1993
DOI: 10.1007/bf01189880
Neural network constructive algorithms: Trading generalization for learning efficiency?

Cited by 56 publications (12 citation statements)
References 13 publications
“…Different constructive learning algorithms allow trading off certain performance measures (e.g., learning time) for others (e.g., network size and generalization accuracy) [47].…”
Section: Estimation of Expected Case Complexity of the Learning
confidence: 99%
“…These algorithms differ in terms of their choices regarding: restrictions on input representation (e.g., binary or bipolar valued inputs); when to add a neuron; where to add a neuron; connectivity of the added neuron; weight initialization for the added neuron; how to train the added neuron (or a subnetwork affected by the addition); and so on. They can be shown to converge to networks which yield zero classification errors on any noncontradictory training set involving two output classes (see [47] for a unifying framework that explains the convergence properties of these constructive algorithms). A geometrical analysis of the decision boundaries of some of these algorithms is presented in [7].…”
Section: Constructive Learning Using Iterative Weight Update
confidence: 99%
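The convergence claim in the statement above can be illustrated with a minimal sketch. The following is only one of the many design choices the quoted taxonomy lists (when/where to add a unit, how to initialize it): here each added unit is a prototype centered on a currently misclassified example, so every iteration permanently fixes at least one training point and the loop reaches zero error on any non-contradictory two-class training set. This "grandmother cell" scheme is an illustrative assumption, not the specific algorithm of the cited paper.

```python
import numpy as np

def predict(protos, labels, X, default=0):
    """Classify each row of X by the label of its nearest prototype unit."""
    if len(protos) == 0:
        return np.full(len(X), default)
    P = np.asarray(protos, dtype=float)
    # Squared Euclidean distance from every sample to every prototype.
    d = ((X[:, None, :] - P[None, :, :]) ** 2).sum(axis=-1)
    return np.asarray(labels)[d.argmin(axis=1)]

def constructive_fit(X, y, max_units=100):
    """Constructive loop: add one prototype unit per misclassified example
    until the training error is zero (or the unit budget is exhausted)."""
    protos, labels = [], []
    for _ in range(max_units):
        wrong = np.flatnonzero(predict(protos, labels, X) != y)
        if wrong.size == 0:
            break  # zero classification errors on the training set
        i = wrong[0]
        protos.append(X[i])   # the new unit memorizes this example,
        labels.append(y[i])   # so it is classified correctly from now on
    return np.array(protos), np.array(labels)
```

On XOR-style data the loop terminates with at most one unit per training example, trading network size for guaranteed convergence — the kind of trade-off the cited framework analyzes.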
“…Valle (2005) presents various approaches to building smart adaptive devices. Several topics remain open, such as hardware-friendly learning algorithms, including perturbation learning (Jabri and Flower, 1991), constructive learning (Smieja, 1993), cascade error projection learning (Duong, 1995; Duong and Stubberud, 1995), and local learning (Chen et al, 2000). Some HNNs are based on the Multilayer Perceptron (MLP) (D'Acierno, 2000; Kumar et al, 1994), radial basis functions (Fakhraie et al, 1994; Verleysen et al, 1994; Yang and Paindavoine, 2005), and the Neocognitron (Patnaik and Rao, 2000), as well as neurocomputers (Glesner and Poechmueller, 1994; Strey and Avellana, 2000).…”
Section: Introduction
confidence: 99%
“…-Trade-offs among performance measures. Different constructive learning methods allow trade-offs among certain performance measures, such as between learning time and accuracy [122]. -Incorporation of prior knowledge.…”
Section: Cooperative Approach in Evolutionary Computation
confidence: 99%