Accurately predicting the remaining useful life (RUL) of a turbofan engine is of great significance for improving the reliability and safety of the engine system. Because the sensor data used for RUL prediction are high-dimensional and have complex features, this paper proposes four data-driven prognostic models based on deep neural networks (DNNs) with an attention mechanism. To improve DNN feature extraction, the data are prepared using a sliding time window technique. After normalization, the raw data are fed directly into the proposed network, requiring no prior knowledge of prognostics or signal processing and thereby simplifying the method's application. To verify the RUL prediction ability of the proposed DNN techniques, they are validated on the C-MAPSS benchmark dataset of the turbofan engine system. The experimental results show that the developed long short-term memory (LSTM) model with an attention mechanism achieves accurate RUL prediction in both scenarios, with a high degree of robustness and generalization ability. Furthermore, the proposed model outperforms several state-of-the-art prognostic methods: the attention-based LSTM model achieved an RMSE of 12.87 and 11.23 on the FD002 and FD003 data subsets, respectively.
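A minimal sketch of the kind of pipeline this abstract describes, not the authors' exact architecture: a sliding time window cuts the normalized multivariate sensor series into fixed-length samples, and an LSTM with a simple soft-attention pooling layer regresses the RUL. The window length, layer sizes, and attention formulation are assumptions for illustration.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

def sliding_windows(series, window=30):
    """Cut a (time, features) array into overlapping fixed-length windows."""
    return np.stack([series[i:i + window] for i in range(len(series) - window + 1)])

def build_lstm_attention(window=30, n_features=14):
    inp = layers.Input(shape=(window, n_features))
    h = layers.LSTM(64, return_sequences=True)(inp)      # hidden state per timestep
    scores = layers.Dense(1, activation="tanh")(h)        # attention score per timestep
    weights = layers.Softmax(axis=1)(scores)               # normalize over the time axis
    context = layers.Lambda(lambda t: tf.reduce_sum(t[0] * t[1], axis=1))([h, weights])
    out = layers.Dense(1)(context)                          # predicted RUL
    model = Model(inp, out)
    model.compile(optimizer="adam", loss="mse",
                  metrics=[tf.keras.metrics.RootMeanSquaredError()])
    return model
```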
High blood pressure (BP) may lead to further health complications if not monitored and controlled, especially in critically ill patients. There are two main types of blood pressure monitoring: invasive measurement, in which a central line is inserted into the patient's body and which carries infection risks, and cuff-based measurement, which monitors BP by detecting blood volume changes at the skin surface using a pulse oximeter or a wearable device such as a smartwatch. This paper aims to estimate blood pressure with machine learning from photoplethysmogram (PPG) signals obtained from cuff-based monitoring. To avoid common machine learning pitfalls, such as choosing an inappropriate classifier or failing to select the best features, this paper uses the tree-based pipeline optimization tool (TPOT) to automate the machine learning pipeline and select the best regression models for estimating systolic BP (SBP) and diastolic BP (DBP) separately. As a pre-processing stage, a notch filter, a band-pass filter, and zero-phase filtering were applied to eliminate any potential noise inherent in the signal. Automated feature selection was then performed to select the best features for estimating BP, with the SBP and DBP features extracted using random forest (RF) and k-nearest neighbors (KNN), respectively. To train and test the model, the PhysioNet global dataset was used, which contains 32.061 million samples from 1,000 subjects. Finally, the proposed approach was evaluated and validated using the mean absolute error (MAE). The results obtained were 6.52 mmHg for SBP and 4.19 mmHg for DBP, which show the superiority of the proposed model over related works.
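A minimal sketch of using TPOT to search for a regression pipeline, assuming a feature matrix extracted from PPG signals and SBP labels are already available; the random stand-in data, generation/population settings, and output file name below are placeholders, not the authors' configuration. Scoring with negative MAE mirrors the evaluation metric reported in the abstract.

```python
import numpy as np
from tpot import TPOTRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))           # stand-in for extracted PPG features
y_sbp = rng.normal(120, 15, size=500)    # stand-in for systolic BP labels (mmHg)

X_tr, X_te, y_tr, y_te = train_test_split(X, y_sbp, test_size=0.2, random_state=42)

tpot = TPOTRegressor(generations=5, population_size=20,
                     scoring="neg_mean_absolute_error",
                     random_state=42, verbosity=2)
tpot.fit(X_tr, y_tr)                      # evolve feature-selection + regressor pipelines
print("Test MAE (mmHg):", -tpot.score(X_te, y_te))
tpot.export("sbp_pipeline.py")            # save the best pipeline found for SBP
```

The same search would be run a second time with DBP labels to obtain a separate DBP model, as the abstract describes.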
In this era, machines can understand human activities and their meanings, and this ability can be utilized in many fields and applications. One field of particular interest is the prediction of churning customers: churn prediction identifies which customers are about to leave the services of a specific bank, and the approach can be used by any large organization that is attentive to its customers. This study aims to develop a model that offers meaningful churn prediction for the banking industry. For this purpose, we develop a customer churn prediction approach with three intelligent models: Random Forest (RF), AdaBoost, and Support Vector Machine (SVM). The approach achieves its best results when the Synthetic Minority Oversampling Technique (SMOTE), together with a combination of undersampling and oversampling, is applied to address the imbalanced dataset. On the SMOTE-balanced data, the method produced excellent results, with an F1 score of 91.90 and an overall accuracy of 88.7% using RF. Furthermore, the experimental results show that RF also yielded good results on the full feature-selected datasets.
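A minimal sketch of the rebalancing-then-classification step this abstract relies on, under the assumption of an already numerically encoded churn table: SMOTE oversamples the minority (churn) class on the training split only, then a Random Forest is fit and scored with F1 and accuracy. The synthetic data and hyperparameters are illustrative stand-ins.

```python
import numpy as np
from imblearn.over_sampling import SMOTE
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score, accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 10))                 # stand-in customer features
y = (rng.random(2000) < 0.15).astype(int)       # imbalanced churn labels (~15% churners)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=42)
X_bal, y_bal = SMOTE(random_state=42).fit_resample(X_tr, y_tr)   # oversample minority class

clf = RandomForestClassifier(n_estimators=200, random_state=42).fit(X_bal, y_bal)
pred = clf.predict(X_te)
print("F1:", f1_score(y_te, pred), "accuracy:", accuracy_score(y_te, pred))
```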
The entire life cycle of a turbofan engine is a type of asymmetrical process in which each engine part has different characteristics. Extracting and modeling the engine symmetry characteristics is significant in improving remaining useful life (RUL) predictions for aircraft components, and it is critical for an effective and reliable maintenance strategy. Such predictions can improve maximum operating availability and reduce maintenance costs. Due to the high nonlinearity and complexity of mechanical systems, conventional methods are unable to satisfy the needs of medium- and long-term prediction problems and frequently overlook the effect of temporal information on prediction performance. To address this issue, this study presents a new attention-based deep convolutional neural network (DCNN) architecture to predict the RUL of turbofan engines. The prognosability metric was used for feature ranking and selection, whereas a time window method was employed for sample preparation to take advantage of multivariate temporal information for better feature extraction by the attention-based DCNN model. The proposed model was validated on a well-known benchmark dataset using the root mean square error (RMSE) and an asymmetric scoring function (score) as evaluation measures. The experimental results show the superiority of the proposed approach for predicting the RUL of a turbofan engine: the attention-based DCNN model achieved the best scores on the FD001 independent testing dataset, with an RMSE of 11.81 and a score of 223.
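An illustrative sketch only, with layer counts, kernel sizes, and the attention form assumed rather than taken from the published architecture: stacked 1-D convolutions extract temporal features from each time-window sample, and a soft-attention pooling step weights the timesteps before the RUL regression head.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_attention_dcnn(window=30, n_features=14):
    inp = layers.Input(shape=(window, n_features))
    x = layers.Conv1D(32, kernel_size=5, padding="same", activation="relu")(inp)
    x = layers.Conv1D(64, kernel_size=5, padding="same", activation="relu")(x)
    scores = layers.Dense(1, activation="tanh")(x)        # attention score per timestep
    weights = layers.Softmax(axis=1)(scores)               # normalize over the time axis
    context = layers.Lambda(lambda t: tf.reduce_sum(t[0] * t[1], axis=1))([x, weights])
    out = layers.Dense(1)(context)                          # predicted RUL
    model = Model(inp, out)
    model.compile(optimizer="adam", loss="mse",
                  metrics=[tf.keras.metrics.RootMeanSquaredError()])
    return model
```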
Anomaly detection in high-dimensional data is a critical research issue with serious implications for real-world problems. Many issues in this field remain unsolved, and several modern anomaly detection methods struggle to maintain adequate accuracy because of the highly descriptive nature of big data. This phenomenon, referred to as the "curse of dimensionality", affects traditional techniques in terms of both accuracy and performance. This research therefore proposes a hybrid model based on a Deep Autoencoder Neural Network (DANN) with five layers that reduces the difference between its input and output. The proposed model was applied to a real-world gas turbine (GT) dataset that contains 87,620 columns and 56 rows. During the experiments, two issues were investigated and resolved to improve the results. The first is the dataset's class imbalance, which was addressed using the SMOTE technique. The second is poor performance, which can be mitigated with an optimization algorithm: several optimizers were investigated and tested, including stochastic gradient descent (SGD), RMSprop, Adam, and Adamax, and Adamax produced the best results when used to train the DANN model. The experimental results show that the proposed model can detect anomalies while efficiently reducing the high dimensionality of the dataset, achieving an accuracy of 99.40%, an F1-score of 0.9649, an Area Under the Curve (AUC) of 0.9649, and a minimal loss function during training of the hybrid model.
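A minimal sketch of the autoencoder-based detection scheme the abstract outlines, assuming min-max-scaled tabular sensor data: a five-layer dense autoencoder is trained with the Adamax optimizer to minimize reconstruction error, and samples whose reconstruction error exceeds a threshold derived from normal data are flagged as anomalies. The layer widths, threshold percentile, and stand-in data are assumptions.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_autoencoder(n_features):
    inp = layers.Input(shape=(n_features,))
    e = layers.Dense(32, activation="relu")(inp)
    e = layers.Dense(8, activation="relu")(e)          # bottleneck layer
    d = layers.Dense(32, activation="relu")(e)
    out = layers.Dense(n_features, activation="sigmoid")(d)
    ae = Model(inp, out)
    ae.compile(optimizer=tf.keras.optimizers.Adamax(learning_rate=1e-3), loss="mse")
    return ae

rng = np.random.default_rng(0)
X_normal = rng.random((1000, 56))                      # stand-in for scaled GT features
ae = build_autoencoder(56)
ae.fit(X_normal, X_normal, epochs=20, batch_size=64, verbose=0)

errors = np.mean((X_normal - ae.predict(X_normal, verbose=0)) ** 2, axis=1)
threshold = np.percentile(errors, 99)                  # flag the top 1% of errors as anomalous
print("anomaly threshold:", threshold)
```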
Anomaly detection has been used for years to detect and analyze anomalous elements in data, and various techniques have been developed for it. Machine learning is among the most convenient of these and performs well, but it still has limitations for large-scale unlabeled datasets. Deep Reinforcement Learning (DRL) based techniques outperform existing supervised, unsupervised, and other alternative techniques for anomaly detection. This study presents a Systematic Literature Review (SLR) that analyzes DRL models used to detect anomalies in their respective applications. The SLR aims to analyze DRL frameworks for anomaly detection applications, the proposed DRL methods, and their performance compared with alternative methods. In this review, we identified 32 research articles from 2017 to 2022 that discuss DRL techniques for various anomaly detection applications. After analyzing the selected articles, this paper presents 13 different anomaly detection applications found in them, identifies 50 different datasets used in anomaly detection experiments, and describes 17 distinct DRL models used in the selected papers to detect anomalies. Finally, we analyzed and reviewed the performance of these DRL models. We observed that detecting anomalies with DRL frameworks is a promising area of research and that DRL shows better performance for anomaly detection where other models fall short. Based on this review, we therefore provide researchers with recommendations and guidelines.