IEEE Proceedings on Southeastcon
DOI: 10.1109/secon.1990.117770

An efficient learning algorithm for the backpropagation artificial neural network

Abstract: Two conditions for reducing the number of learning iterations in the back-propagation artificial neural network are introduced in this paper. The first condition is to scale the target output so that it falls within a small range of ±0.1 around the point at which the slope of the nonlinear activation function of the output node is maximum; this point is 0.5 for the sigmoid function. The second condition is to learn the input patterns selectively, not sequentially, until the error is reduced below the desired limit. …
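To make the two conditions concrete, here is a minimal sketch (not the authors' code): a one-hidden-layer back-propagation network on a toy XOR task, with binary targets rescaled to 0.5 ± 0.1 (the maximum-slope region of the sigmoid) and, on each epoch, only the patterns whose error still exceeds a tolerance presented for learning. The layer sizes, learning rate, and tolerance are arbitrary assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def scale_targets(t, center=0.5, margin=0.1):
    """Condition 1: map 0/1 targets into [center - margin, center + margin]."""
    return center - margin + 2.0 * margin * t

rng = np.random.default_rng(0)

# Toy problem: XOR with binary targets (illustrative data, not from the paper).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = scale_targets(np.array([[0.0], [1.0], [1.0], [0.0]]))   # targets become 0.4 / 0.6

n_in, n_hid, n_out = 2, 4, 1          # assumed layer sizes
W1 = rng.normal(scale=0.5, size=(n_in, n_hid))
W2 = rng.normal(scale=0.5, size=(n_hid, n_out))
lr, tol = 0.5, 0.05                   # assumed learning rate and error tolerance

for epoch in range(10000):
    # Forward pass on all patterns to measure the per-pattern squared error.
    H = sigmoid(X @ W1)
    Y = sigmoid(H @ W2)
    per_pattern_err = np.sum((T - Y) ** 2, axis=1)

    # Condition 2: present only the patterns whose error is still above the limit.
    active = per_pattern_err > tol
    if not active.any():
        break
    for i in np.flatnonzero(active):
        h = sigmoid(X[i] @ W1)
        y = sigmoid(h @ W2)
        # Backpropagate the squared-error gradient through both layers.
        delta_out = (y - T[i]) * y * (1.0 - y)
        delta_hid = (delta_out @ W2.T) * h * (1.0 - h)
        W2 -= lr * np.outer(h, delta_out)
        W1 -= lr * np.outer(X[i], delta_hid)
```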

Cited by 2 publications (2 citation statements)
References 2 publications
“…Paul and Byrne [18] propose conditions to reduce the number of learning iterations in the back-propagation neural network. The first is to scale the output with respect to the input.…”
Section: Literature Review
confidence: 99%
“…A way to classify pipe defects from PEC signals is through the application of Artificial Neural Networks (ANNs) [5] [6] [7]. Back-propagation [8] is a classic method for training ANNs and is based on minimization of the mean square error (MSE). For nondestructive evaluation of industrial equipment and parts, characterizing the defect type is very important for defining a proper maintenance procedure.…”
Section: Introduction
confidence: 99%
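As background for the statement above: a minimal sketch of the mean-square-error objective that back-propagation minimizes. The outputs and targets below are invented for illustration; the citing paper's own network and PEC data are not reproduced here.

```python
import numpy as np

def mse(y, t):
    """Mean square error over all patterns and output units."""
    return np.mean((y - t) ** 2)

y = np.array([0.42, 0.58, 0.61, 0.39])   # example network outputs (assumed values)
t = np.array([0.40, 0.60, 0.60, 0.40])   # example targets (assumed values)
print(mse(y, t))  # back-propagation adjusts the weights to drive this toward zero
```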