2019 IEEE 8th Joint International Information Technology and Artificial Intelligence Conference (ITAIC)
DOI: 10.1109/itaic.2019.8785740
Orthogonal Flower Pollination Algorithm based Mixed Kernel Extreme Learning Machine for Analog Fault Prognostics

Cited by 2 publications (3 citation statements)
References 7 publications
“…To evaluate the accuracy of the models developed with DLs, the criteria of Mean Squared Error (MSE), Root Mean Square Error (RMSE), Mean Absolute Percentage Error (MAPE), and Regression (R) are used. The following equations show how to calculate them [38, 39]:

$$RMSE = \sqrt{\frac{\sum_{l=1}^{N} \left( Y_l - \hat{Y}_l \right)^2}{N}}$$

$$MSE = \frac{\sum_{l=1}^{N} \left( Y_l - \hat{Y}_l \right)^2}{N}$$

$$MAPE = \frac{\sum_{l=1}^{N} \left| \frac{Y_l - \hat{Y}_l}{Y_l} \right|}{N}$$

$$R = output = a' \cdot target + b$$

For this purpose, Equation (34) has been used to normalize the training and test patterns [40].…”
Section: Methods
confidence: 99%
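For concreteness, the sketch below illustrates one way the four criteria quoted above could be computed with NumPy. It is an illustration added here, not code from the cited paper: the function name `regression_metrics`, the least-squares fit used for the output = a'·target + b line, and the example arrays are assumptions for demonstration.

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """Compute MSE, RMSE, MAPE, and the linear fit output = a'*target + b."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    n = y_true.size

    # Error criteria from the quoted equations
    mse = np.sum((y_true - y_pred) ** 2) / n
    rmse = np.sqrt(mse)
    mape = np.sum(np.abs((y_true - y_pred) / y_true)) / n

    # "Regression (R)" is written as a linear relation output = a'*target + b;
    # here that line is fit by least squares, and the Pearson correlation
    # coefficient is reported alongside it for reference.
    a_prime, b = np.polyfit(y_true, y_pred, 1)
    r = np.corrcoef(y_true, y_pred)[0, 1]

    return {"MSE": mse, "RMSE": rmse, "MAPE": mape,
            "a_prime": a_prime, "b": b, "R": r}

if __name__ == "__main__":
    # Synthetic targets and model outputs, for illustration only
    targets = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    outputs = np.array([1.1, 1.9, 3.2, 3.8, 5.1])
    print(regression_metrics(targets, outputs))
```

Note that MAPE as written divides by each target value, so it is undefined when any Y_l equals zero; normalizing the training and test patterns beforehand (as the quoted passage does via its Equation (34)) avoids that issue in practice.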