1990
DOI: 10.1162/neco.1990.2.2.198

The Upstart Algorithm: A Method for Constructing and Training Feedforward Neural Networks

Abstract: A general method for building and training multilayer perceptrons composed of linear threshold units is proposed. A simple recursive rule is used to build the structure of the network by adding units as they are needed, while a modified perceptron algorithm is used to learn the connection strengths. Convergence to zero errors is guaranteed for any boolean classification on patterns of binary variables. Simulations suggest that this method is efficient in terms of the numbers of units constructed, and the netwo…
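The "modified perceptron algorithm" mentioned in the abstract is not spelled out on this page. As a rough illustration only, the sketch below trains a single linear threshold unit with a pocket-style variant of the perceptron rule, which is one common choice when the patterns may not be linearly separable; the function name train_threshold_unit and its parameters are hypothetical, not taken from the paper.

# Minimal sketch, assuming a pocket-style perceptron stands in for the
# paper's modified perceptron rule (an assumption, not the paper's method).
import numpy as np

def train_threshold_unit(X, y, epochs=200, seed=0):
    """X: (n_patterns, n_inputs) array of 0/1 inputs; y: 0/1 targets.
    Returns (weights, bias) of the best unit seen (the "pocket")."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    best_w, best_b, best_correct = w.copy(), b, -1
    for _ in range(epochs):
        for i in rng.permutation(n):
            pred = int(w @ X[i] + b > 0)
            if pred != y[i]:                      # classic perceptron update
                w = w + (y[i] - pred) * X[i]
                b = b + (y[i] - pred)
        correct = int(np.sum(((X @ w + b) > 0).astype(int) == y))
        if correct > best_correct:                # keep the best weights so far
            best_correct = correct
            best_w, best_b = w.copy(), b
    return best_w, best_b

On separable data this behaves like an ordinary perceptron; on non-separable data it simply returns the weight vector that misclassified the fewest patterns, which is all the constructive step below needs.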

Cited by 437 publications (185 citation statements). References: 1 publication.
“…There are some efforts for a systematic way to achieve this, such as the research on constructive and pruning algorithms (Burgess, 1994; Frean, 1990; LeCun et al., 1990; Reed, 1993). The former methods initially assume a minimal ANN and insert nodes and links as warranted, whilst the latter proceed in the opposite way, i.e.…”
Section: Evolutionary ANNs (mentioning)
confidence: 99%
“…To date, designing a (near-)optimal network architecture is done by a human expert and requires a tedious trial-and-error process. Attempts at automatic determination fall into two broad categories, constructive and pruning algorithms (Burgess, 1994; Frean, 1990; LeCun, Denker, & Solla, 1990; Reed, 1993), for which many deficiencies and limitations have been reported (Angeline, Saunders, & Pollack, 1994). Efforts have since focused on evolutionary algorithms (EAs) (Back & Schwefel, 1993), particularly genetic algorithms (GAs) (Goldberg, 1989) and evolutionary programming (EP) (Fayyad, Shapire, Smyth, & Uthurusamy, 1996), for both training and evolving ANNs.…”
Section: Introduction (mentioning)
confidence: 99%
“…Moreover, there are constructive (add nodes iteratively) and destructive (prune nodes iteratively) methods [196]. However, the constructive and the destructive methods for optimizing architecture are no different from the (Fig.…”
Section: Architecture Plus Weight Optimization (mentioning)
confidence: 99%
“…The Upstart algorithm [3] constructs a binary tree of threshold neurons. First an output layer of M neurons is trained; if all patterns are correctly classified, the procedure terminates. Otherwise it finds the neuron that makes the largest number of errors and, depending on whether it is wrongly on or wrongly off, daughter neurons are added to correct those errors.…”
Section: Introduction (mentioning)
confidence: 99%
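To make the construction described in the excerpt above concrete, here is a hedged sketch of the recursive daughter-unit idea for a single binary output. It reuses the hypothetical train_threshold_unit helper sketched after the abstract; the daughter targets, the large fixed daughter-to-parent weights, and the handling of don't-care patterns follow the usual textbook description of Upstart rather than the paper's exact prescription, and termination here is only heuristic (the paper's guarantee relies on its modified perceptron rule always reducing the error count).

# Hedged sketch of the recursive Upstart-style construction, single output.
# Assumes the train_threshold_unit helper defined in the earlier sketch.
import numpy as np

BIG = 1e3  # daughters override the parent through large-magnitude weights

def upstart(X, y):
    """Recursively grow a tree of threshold units; returns predict(Xq)->0/1."""
    w, b = train_threshold_unit(X, y)
    out = (X @ w + b > 0).astype(int)
    wrongly_on = (out == 1) & (y == 0)
    wrongly_off = (out == 0) & (y == 1)
    inhibit = excite = None
    if wrongly_on.any():
        # inhibitory daughter: target 1 on the parent's wrongly-on patterns,
        # 0 wherever the parent's own target is 1 (never suppress a correct
        # "on"); correctly-off patterns are dropped as don't-cares
        mask = wrongly_on | (y == 1)
        inhibit = upstart(X[mask], wrongly_on[mask].astype(int))
    if wrongly_off.any():
        # excitatory daughter: target 1 on wrongly-off patterns,
        # 0 wherever the parent's target is 0
        mask = wrongly_off | (y == 0)
        excite = upstart(X[mask], wrongly_off[mask].astype(int))
    def predict(Xq):
        act = Xq @ w + b
        if inhibit is not None:
            act = act - BIG * inhibit(Xq)
        if excite is not None:
            act = act + BIG * excite(Xq)
        return (act > 0).astype(int)
    return predict

if __name__ == "__main__":
    # XOR: a single threshold unit cannot be error-free, so daughters are grown
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
    y = np.array([0, 1, 1, 0])
    print(upstart(X, y)(X))  # should print [0 1 1 0] once the tree is built

The large fixed daughter weight simply ensures that an active daughter always overrides the parent's own activation, which is what lets each level of the tree cancel one class of residual errors.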
“…Each neuron is designed to exclude a cluster of patterns belonging to the same class. The weights are the inter-pattern distances [3].…”
Section: Introduction (mentioning)
confidence: 99%