2013
DOI: 10.9790/3021-03340813

Performance of ANN in Pattern Recognition for Process Improvement Using Levenberg-Marquardt and Quasi-Newton Algorithms

Abstract: In industrial manufacturing, quality has become one of the most important consumer decision factors in the selection among competing products and services. Product inspection is therefore an important step in the production process, since product reliability is paramount in mass-production facilities. Neural networks are used to model complex relationships between inputs and outputs or to find patterns in data, and they are being successfully applied across a wide range of application domains in bus…

Cited by 8 publications (5 citation statements)
References 4 publications
“…To accomplish this, it relies on the utilization of both the first and second derivatives of the function. In cases involving higher dimensions, Newton's method extends its application by incorporating the gradient and the Hessian matrix, which encapsulates the second derivatives of the function, with the objective of function minimization 100 .…”
Section: Theory and Methodologymentioning
confidence: 99%
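The Newton update described in this statement can be sketched in a few lines. This is a minimal illustration, assuming a hand-coded quadratic objective f(x) = x0² + 3·x1² whose gradient and Hessian are derived analytically; in network training they would come from the loss surface.

```python
# Newton's method in two dimensions: step x <- x - H^-1 * g,
# where g is the gradient and H the Hessian of the objective.

def gradient(x):
    # For f(x) = x0^2 + 3*x1^2:  grad f = [2*x0, 6*x1]
    return [2.0 * x[0], 6.0 * x[1]]

def hessian(x):
    # Constant Hessian of the quadratic: diag(2, 6)
    return [[2.0, 0.0], [0.0, 6.0]]

def newton_step(x):
    g = gradient(x)
    H = hessian(x)
    # Solve H * d = g directly for the 2x2 case
    det = H[0][0] * H[1][1] - H[0][1] * H[1][0]
    d0 = (H[1][1] * g[0] - H[0][1] * g[1]) / det
    d1 = (H[0][0] * g[1] - H[1][0] * g[0]) / det
    return [x[0] - d0, x[1] - d1]

x = newton_step([4.0, -2.0])
# For a quadratic objective a single Newton step lands on the minimum [0, 0]
```

Because the objective is quadratic, one step reaches the exact minimizer; for general functions the step is iterated until the gradient is small.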
“…Second-order methods are based on the Newton, Gauss-Newton or Levenberg-Marquardt (LM) update rules [19][20][21][22]. LM training yields better results, and in less time, than first-order methods [19,23,24]. However, LM cannot be used to train large-scale ANNs, even on modern computers, because of complexity issues [22].…”
Section: Second-order Updatesmentioning
confidence: 99%
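The LM update rule the statement refers to, Δw = −(JᵀJ + λI)⁻¹Jᵀr, can be sketched for the simplest case. This is an illustrative one-parameter least-squares fit y ≈ w·x; the data, damping value, and function names are assumptions, not the paper's setup.

```python
# Levenberg-Marquardt update for a scalar parameter w in y ~ w * x.
# The damping factor lam blends Gauss-Newton (lam -> 0) with
# small gradient-descent-like steps (lam large).

xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]   # generated by the true slope w = 2

def lm_step(w, lam):
    # Residuals r_i = w*x_i - y_i; Jacobian entries J_i = x_i
    r = [w * x - y for x, y in zip(xs, ys)]
    JtJ = sum(x * x for x in xs)
    Jtr = sum(x * ri for x, ri in zip(xs, r))
    # LM update: w <- w - (JtJ + lam)^-1 * Jtr
    return w - Jtr / (JtJ + lam)

w = 0.0
for _ in range(20):
    w = lm_step(w, lam=0.1)
# w converges toward the true slope 2.0
```

In full LM the damping λ is adapted each iteration (decreased when a step reduces the error, increased otherwise); it is held fixed here only to keep the sketch short.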
“…Second-order training via the LM algorithm [19] is known to get better results than first-order methods in fewer iterations [19][20][21][22][23][24]. The MATLAB documentation states, "trainlm is often the fastest back-propagation algorithm in the toolbox, and is highly recommended as a first-choice supervised algorithm, although it does require more memory than other algorithms."…”
Section: First-order Training Analysismentioning
confidence: 99%
“…The dataset available for the training and validation of the network was taken from Santhosh et al [11] and consists of 37 sampled signals from instrument readings, which represent the inputs of the implemented ANN. The ANN architecture adopted in the current study also corresponds to that suggested by Santhosh et al [11]: a fully connected multilayer network trained with the well-known Levenberg-Marquardt algorithm, which has been demonstrated to outperform the classic backpropagation algorithm as well as quasi-Newton algorithms in pattern recognition applications [25], [26]. The available dataset was split into three subsets: one dedicated to training, containing 70% of the overall data, and two used for test and validation, together comprising the remaining 30% of the data.…”
Section: A Case Studymentioning
confidence: 99%
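The three-way split described in this statement can be sketched as follows. This is an assumed illustration: the quote only states 70% for training and 30% combined for test and validation, so the even 15%/15% division of the remainder is a guess made here for concreteness.

```python
# Shuffle-and-slice split: 70% train, then the remaining 30%
# divided between test and validation (15%/15% assumed).

import random

def split_dataset(samples, seed=0):
    rng = random.Random(seed)       # fixed seed for reproducibility
    shuffled = samples[:]
    rng.shuffle(shuffled)
    n = len(shuffled)
    n_train = int(0.70 * n)
    n_test = int(0.15 * n)
    train = shuffled[:n_train]
    test = shuffled[n_train:n_train + n_test]
    val = shuffled[n_train + n_test:]
    return train, test, val

# e.g. 37 signal samples, as in the referenced dataset
train, test, val = split_dataset(list(range(37)))
# -> 25 train, 5 test, 7 validation samples (integer truncation)
```

Every sample lands in exactly one subset, so no reading is used for both fitting and evaluation.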