2022
DOI: 10.1007/s00521-022-06972-5

Improved graph-regularized deep belief network with sparse features learning for fault diagnosis

Cited by 7 publications (4 citation statements) · References 39 publications
“…It can be seen from the previous description that the dynamic weight µ is determined by the fixed factor c and the loss function value E. To increase the decisive role of E, it is appropriate to choose a value greater than 1 for c. To explore the best values of c1 and c2, several sets of experiments were carried out in the transfer case A-D, and the experimental results are shown in figure 13. Among them, the value ranges of c1 and c2 were in [0.5, 1.5] and [1, 3], respectively. Although higher accuracy can be obtained when c1 is set to 0.5, considering the comprehensive dynamic weight strategy, c1 = 1 and c2 = 2.5 are selected as the optimal parameters.…”
Section: Comparison Of Different Methods
confidence: 99%
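For readers who want to reproduce this kind of parameter sweep, the sketch below shows a grid search over c1 and c2 like the one described in the statement above. It is a minimal illustration, not the cited authors' code: the helper evaluate_transfer_accuracy and the grid step sizes are assumptions, and the synthetic score it returns is there only so the sketch runs end to end.

import itertools
import numpy as np

rng = np.random.default_rng(0)

def evaluate_transfer_accuracy(c1, c2):
    # Hypothetical stand-in for training the model with fixed factors c1, c2
    # and measuring diagnostic accuracy on transfer case A-D. The uniform
    # draw below is synthetic and carries no experimental meaning.
    return rng.uniform(0.90, 0.99)

c1_grid = np.arange(0.5, 1.51, 0.25)   # range [0.5, 1.5]; step size is an assumption
c2_grid = np.arange(1.0, 3.01, 0.5)    # range [1, 3]; step size is an assumption

scores = {(float(c1), float(c2)): evaluate_transfer_accuracy(c1, c2)
          for c1, c2 in itertools.product(c1_grid, c2_grid)}
best_c1, best_c2 = max(scores, key=scores.get)
print(f"best (c1, c2) = ({best_c1}, {best_c2}), accuracy = {scores[(best_c1, best_c2)]:.4f}")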
“…Since the state of rotating mechanical parts has an important impact on the safe operation of machinery, it is necessary to conduct fault diagnosis research on components [1]. In recent years, with further research on deep learning, data-driven intelligent diagnosis algorithms have developed rapidly [2,3]. Their main advantage is that they can perform intelligent feature extraction from massive data [4,5].…”
Section: Introduction
confidence: 99%
“…More researchers are focusing on deep learning approaches. Yang et al [14] proposed the Gaussian-Bernoulli deep belief network (GDBN). The method applies graph regularization and sparse feature learning to the GDBN, which allows it to generate discriminative, highly separable features.…”
Section: Fault Diagnosis
confidence: 99%
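As background for the statement above, graph regularization and sparsity constraints are usually attached to a network's training objective roughly as follows; this is a generic textbook form, not necessarily the exact formulation used in [14]:

\mathcal{L} = \mathcal{L}_{\text{recon}} + \lambda_g \sum_{i,j} W_{ij}\,\lVert \mathbf{h}_i - \mathbf{h}_j \rVert_2^2 + \lambda_s \sum_i \lVert \mathbf{h}_i \rVert_1
            = \mathcal{L}_{\text{recon}} + 2\lambda_g \operatorname{tr}\!\left(\mathbf{H}^{\top}\mathbf{L}\mathbf{H}\right) + \lambda_s \lVert \mathbf{H} \rVert_1

where the rows of \mathbf{H} are the hidden representations \mathbf{h}_i, W_{ij} are the weights of a neighbourhood graph built on the training samples, \mathbf{L} = \mathbf{D} - \mathbf{W} is the graph Laplacian with degree matrix \mathbf{D}, and \lambda_g, \lambda_s trade off the graph-smoothness and sparsity penalties. Nearby samples are pushed towards similar hidden features, which is what makes the learned features discriminative and separable.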
“…Assorted neural network architectures and enhanced techniques, such as the Deep Belief Network (DBN) [18], Convolutional Neural Network (CNN) [19-21], Recurrent Neural Network (RNN) [22,23], attention mechanism [24], Transformer [25,26] and their variants [27,28], have also been widely exploited for fault diagnosis. Jie et al [29] established a novel Gaussian-Bernoulli deep belief network (GDBN) model for intelligent fault diagnosis […] the same architecture. Moreover, the centering and sharpening operations included in the self-distillation-with-no-labels algorithm are integrated into the proposed framework to train the model networks, which enables us to effectively avoid mode collapse during the training procedure.…”
Section: Introduction
confidence: 99%
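The centering and sharpening operations mentioned in the statement above are the ones popularized by the self-distillation-with-no-labels (DINO) approach: the teacher's outputs are centred with a running mean and sharpened with a low softmax temperature, so that neither operation alone can collapse all outputs onto a single mode. A minimal NumPy sketch of how these two operations are commonly implemented (an illustration under that assumption, not the cited authors' code):

import numpy as np

def sharpen(logits, temperature=0.04):
    # Low-temperature softmax: dividing by a small temperature peaks the
    # distribution, counteracting the uniformizing effect of centering.
    z = logits / temperature
    z -= z.max(axis=-1, keepdims=True)          # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class TeacherCentering:
    # Running-mean centre subtracted from the teacher logits before softmax;
    # centering alone would drive outputs towards uniform, so it is always
    # paired with sharpening.
    def __init__(self, dim, momentum=0.9):
        self.center = np.zeros(dim)
        self.momentum = momentum

    def __call__(self, teacher_logits):
        centered = teacher_logits - self.center
        batch_mean = teacher_logits.mean(axis=0)
        self.center = self.momentum * self.center + (1 - self.momentum) * batch_mean
        return centered

# Toy usage on random logits: every output row is a valid distribution.
centering = TeacherCentering(dim=4)
logits = np.random.default_rng(0).normal(size=(8, 4))
probs = sharpen(centering(logits))
print(probs.sum(axis=-1))   # each row sums (numerically) to 1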