Proceedings of 1994 IEEE International Conference on Neural Networks (ICNN'94)
DOI: 10.1109/icnn.1994.374298

A weight evolution algorithm for multi-layered network

Cited by 2 publications (4 citation statements)
References 2 publications
“…ing BP falls off rapidly because gradient search techniques get trapped at local minima. When a near-global minimum is hidden deeply among the local minima, BP can end up bouncing between local minima, especially for nonlinearly separable pattern classification problems or complex function approximation problems (Leung, Luk, & Ng, 1994). The second drawback is that the algorithm is sensitive to the initial values, so it often converges to an inferior solution after a long training time.…”
Section: Learning Algorithms of Artificial Neural Network
confidence: 99%
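A minimal sketch of the two drawbacks quoted above, not taken from the cited paper: plain gradient descent on a hypothetical 1-D multimodal loss, showing that different initial values settle into different minima.

```python
import numpy as np

def loss(w):
    # Hypothetical multimodal loss surface; the global minimum lies near w = 3.7.
    return np.sin(3 * w) + 0.1 * (w - 3.0) ** 2

def grad(w, eps=1e-5):
    # Numerical gradient; a real BP implementation would backpropagate instead.
    return (loss(w + eps) - loss(w - eps)) / (2 * eps)

def gradient_descent(w0, lr=0.01, steps=2000):
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)
    return w, loss(w)

for w0 in (-2.0, 0.5, 4.0):  # different initial weights
    w_final, l_final = gradient_descent(w0)
    print(f"start {w0:+.1f} -> w = {w_final:+.3f}, loss = {l_final:.3f}")
# Each start converges to the nearest minimum, so the quality of the final
# solution depends strongly on the initial value, as the excerpt notes.
```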
“…Therefore, the population is encouraged toward improved regions of the solution space. Population-based optimization algorithms are divided into two categories, namely evolutionary algorithms (EA) and SI-based algorithms [17,23]. In EA, the main idea is to treat the weight matrices of the ANN as individuals, to change the weights by operations such as crossover and mutation, and to use the error produced by the ANN as the fitness measure that guides selection.…”
Section: Evolutionary Algorithms (EA)
confidence: 99%
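A minimal sketch of the EA scheme described in this excerpt, not the paper's own algorithm: each individual is a flattened weight vector of a tiny 2-2-1 network, fitness is the network's error on XOR, and new individuals are produced by uniform crossover and Gaussian mutation. The network shape and hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)
N_WEIGHTS = 9  # 2x2 hidden weights + 2 hidden biases + 2 output weights + 1 output bias

def forward(w, x):
    W1, b1 = w[:4].reshape(2, 2), w[4:6]
    W2, b2 = w[6:8], w[8]
    h = np.tanh(x @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output

def error(w):
    return np.mean((forward(w, X) - y) ** 2)  # network error used as fitness

pop = rng.normal(0.0, 1.0, size=(30, N_WEIGHTS))
for gen in range(200):
    errs = np.array([error(ind) for ind in pop])
    parents = pop[np.argsort(errs)[:10]]             # selection: keep the fittest
    children = []
    while len(children) < len(pop):
        a, b = parents[rng.integers(10, size=2)]
        mask = rng.random(N_WEIGHTS) < 0.5           # uniform crossover
        child = np.where(mask, a, b)
        child += rng.normal(0.0, 0.1, N_WEIGHTS)     # Gaussian mutation
        children.append(child)
    pop = np.array(children)

best = min(pop, key=error)
print("final error:", error(best))
```

Because selection acts only on the error value, this kind of search needs no gradient information and is not tied to differentiable activation functions.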
“…The different learning rules form the basis of various training techniques, and their applicability depends on the NN architecture and the learning category being used. MLP has different learning algorithms, such as BP [14][15][16][17], for searching for the optimal weight values that minimize the error term between the output of the NN and the desired output value. The error term is calculated by comparing the net output to the desired output and is then fed back through the network, causing the synaptic weights to be changed in an effort to minimize the error.…”
Section: Training Artificial Neural Network
confidence: 99%
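A minimal sketch of the error-feedback idea in this excerpt, under illustrative assumptions (a single-hidden-layer MLP on XOR with sigmoid units and a fixed learning rate): the output error is propagated back through the network and the synaptic weights are adjusted by gradient descent.

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)   # hidden layer weights and biases
W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)   # output layer weights and bias
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(5000):
    # Forward pass: compute the network output.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Error term: compare the net output to the desired output.
    err = out - y

    # Backward pass: feed the error back through the network.
    d_out = err * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Change the synaptic weights to reduce the error.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print("outputs after training:", out.ravel().round(2))
```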