Rolling bearings are critical components of rotating machines. For their preventive maintenance, it is not enough to know whether a fault exists or what type it is; effective maintenance also requires monitoring the fault severity. Currently, bearing fault diagnosis based on time–frequency image (TFI) recognition is attracting increasing attention. This paper contributes to this ongoing investigation by proposing a new approach for fault severity monitoring of rolling bearings based on texture feature extraction from sparse TFIs. The first and main step is to obtain accurate TFIs from the vibration signals of rolling bearings. Traditional time–frequency analysis methods suffer from drawbacks such as low resolution and cross-term interference, so the TFIs they produce cannot satisfactorily express the time–frequency characteristics of bearing vibration signals. To solve this problem, a sparse time–frequency analysis method based on the first-order primal-dual algorithm (STFA-PD) is developed in this paper. Unlike traditional time–frequency analysis methods, the time–frequency analysis model of STFA-PD is built on the theory of sparse representation and is solved using the first-order primal-dual algorithm. By employing a sparse constraint in the frequency domain, STFA-PD achieves a higher time–frequency resolution, and because its model is based on a linear time–frequency analysis method, it is free from cross-term interference. The gray-level co-occurrence matrix is then employed to extract texture features from the sparse TFIs as input features for classifiers. Vibration signals of rolling bearings with different degrees of fault severity are used to validate the proposed approach. The experimental results show that the developed STFA-PD outperforms traditional time–frequency analysis methods in accuracy and effectiveness for fault severity monitoring of rolling bearings.
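As a minimal illustration of the texture-feature step described above, the sketch below computes a gray-level co-occurrence matrix (GLCM) directly in NumPy and derives four common Haralick-style features (contrast, energy, homogeneity, correlation). The quantization level, pixel offset, and choice of features are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def glcm_features(img, levels=8, dx=1, dy=0):
    """Compute a normalized GLCM for a single pixel offset (dx, dy) and
    derive common Haralick-style texture features from it.

    The number of gray levels and the offset are illustrative defaults,
    not the configuration used in the paper.
    """
    # Quantize the image into `levels` gray levels.
    if img.max() > 0:
        q = np.floor(img / img.max() * (levels - 1)).astype(int)
    else:
        q = np.zeros(img.shape, dtype=int)

    # Count co-occurrences of gray levels at the given offset.
    glcm = np.zeros((levels, levels))
    h, w = q.shape
    for y in range(h - dy):
        for x in range(w - dx):
            glcm[q[y, x], q[y + dy, x + dx]] += 1
    glcm /= glcm.sum()  # normalize to a joint probability distribution

    # Derive texture features from the normalized GLCM.
    i, j = np.indices((levels, levels))
    contrast = np.sum(glcm * (i - j) ** 2)
    energy = np.sum(glcm ** 2)
    homogeneity = np.sum(glcm / (1.0 + (i - j) ** 2))
    mu_i, mu_j = np.sum(i * glcm), np.sum(j * glcm)
    si = np.sqrt(np.sum(glcm * (i - mu_i) ** 2))
    sj = np.sqrt(np.sum(glcm * (j - mu_j) ** 2))
    correlation = (np.sum(glcm * (i - mu_i) * (j - mu_j)) / (si * sj)
                   if si > 0 and sj > 0 else 1.0)
    return {"contrast": contrast, "energy": energy,
            "homogeneity": homogeneity, "correlation": correlation}
```

In practice such features would be computed for several offsets and angles and concatenated into a feature vector for the classifier; the `scikit-image` functions `graycomatrix`/`graycoprops` offer an off-the-shelf alternative to this hand-rolled version.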
Fault diagnosis plays a very important role in ensuring the safe and reliable operation of machines. Currently, deep learning-based fault diagnosis is attracting increasing attention. However, fault diagnosis under variable working conditions remains a significant challenge due to the domain discrepancy problem, which deep learning-based methods cannot avoid either. This paper contributes to the ongoing investigation by proposing a new approach for fault diagnosis under variable working conditions based on the short-time Fourier transform (STFT) and a transfer deep residual network (TDRN). The STFT is employed to convert vibration signals into time-frequency images that serve as the input to the TDRN. To address the domain discrepancy problem, the TDRN is developed in this paper. Unlike traditional deep convolutional neural network (DCNN) methods, the TDRN, by incorporating transfer learning, builds a bridge between two different working conditions, using the knowledge learned under one working condition to achieve high classification accuracy under another. Moreover, since residual learning is introduced, the TDRN overcomes the training difficulty and performance degradation found in traditional DCNN methods, further improving the classification accuracy. Experiments are conducted on the popular CWRU bearing dataset to validate the effectiveness and superiority of the proposed approach. The results show that the developed TDRN outperforms methods without transfer learning and/or residual learning in accuracy and feature learning ability for fault diagnosis under variable working conditions.
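To make the STFT step concrete, the sketch below converts a synthetic vibration-like signal into a time-frequency image with `scipy.signal.stft`. The sampling rate, window parameters, and the amplitude-modulated test signal are illustrative assumptions, not the actual CWRU acquisition settings.

```python
import numpy as np
from scipy.signal import stft

fs = 12_000  # sampling rate (Hz); hypothetical, typical of bearing test rigs
t = np.arange(0, 1.0, 1 / fs)

# Synthetic stand-in for a bearing vibration signal: a structural-resonance
# carrier amplitude-modulated at a hypothetical fault characteristic frequency.
carrier, fault_freq = 3_000, 105
x = (1 + 0.5 * np.cos(2 * np.pi * fault_freq * t)) * np.sin(2 * np.pi * carrier * t)
x += 0.1 * np.random.default_rng(0).standard_normal(x.size)  # measurement noise

# STFT: 256-sample Hann windows with 75% overlap (illustrative choices).
f, tt, Zxx = stft(x, fs=fs, nperseg=256, noverlap=192)
tfi = np.abs(Zxx)  # magnitude spectrogram used as the time-frequency image
```

The resulting `tfi` array (frequency bins x time frames) is the kind of image that, after resizing and normalization, would be fed to the network; its energy concentrates around the carrier frequency, with modulation sidebands spaced at the fault frequency.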