2018
DOI: 10.1016/j.compind.2018.04.002
A deep Boltzmann machine and multi-grained scanning forest ensemble collaborative method and its application to industrial fault diagnosis

Cited by 80 publications (30 citation statements)
References 22 publications
“…Other researchers have endeavored to improve the classification accuracy by combining deep learning models with other models, such as the multi-grained cascade forest [164], Fisher discriminative dictionary learning [165] and the deep quantum neural network [166]. One commonality is that they all used the deep learning model to learn feature representations and the other model to increase discriminative power.…”
Section: Structured Data
confidence: 99%
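The pattern described in this statement, a deep model that learns the representation and a second model that does the discriminating, can be sketched roughly as follows. This is a minimal illustration only, assuming scikit-learn: stacked BernoulliRBM layers stand in for the deep (Boltzmann-family) feature learner and a random forest stands in for the cascade-forest or dictionary-learning stage; the dataset and every hyperparameter are arbitrary placeholders, not values from the cited papers.

# Sketch: deep feature learner + separate discriminative model.
# BernoulliRBM layers approximate the representation-learning stage and
# RandomForestClassifier the discriminative stage; all sizes are assumptions.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neural_network import BernoulliRBM
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler

X, y = make_classification(n_samples=1000, n_features=40, n_informative=20,
                           n_classes=4, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = Pipeline([
    ("scale", MinMaxScaler()),   # RBMs expect inputs roughly in [0, 1]
    ("rbm1", BernoulliRBM(n_components=64, n_iter=20, random_state=0)),
    ("rbm2", BernoulliRBM(n_components=32, n_iter=20, random_state=0)),
    ("forest", RandomForestClassifier(n_estimators=200, random_state=0)),
])
model.fit(X_train, y_train)      # RBMs fit unsupervised, forest fit supervised
print("test accuracy:", model.score(X_test, y_test))

The division of labor mirrors the commonality noted above: the unsupervised layers produce the feature representation, and only the final model is trained to separate the classes.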
“…The most common model-based few-shot learning technique is embedding learning [29]: with the training and testing sets denoted D_train and D_test, an embedding function f projects the training samples X_train ∈ D_train into a low-dimensional space Z, an embedding function g projects the testing samples X_test ∈ D_test into Z, and a similarity function S is then used to measure the embedding similarity between classes. Zhang [30] used a Siamese network for bearing fault diagnosis; Vinyals [11] proposed the matching network, a semi-supervised method that assigns unlabeled samples to augment D_train via soft assignment during learning; Sung [31] used a relation network that embeds samples into the same space and applies a convolutional neural network to automatically learn similarities between different image categories; and Snell [32] proposed the prototypical network, which, instead of comparing f(x_test) with each f(x_i) where x_i ∈ D_train, compares f(x_test) only with the class prototypes in D_train. For class n, the prototype is calculated by the formula…”
Section: Few-shot Learning
confidence: 99%
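The prototype idea referenced at the end of this statement can be illustrated with a short sketch. It assumes only the standard prototypical-network formulation of Snell et al., in which the prototype of class n is the mean of the embedded support samples of that class and a query is assigned to the class of the nearest prototype; the identity embedding and the toy arrays below are placeholders, not anything taken from the cited papers.

# Minimal sketch of prototype-based few-shot classification:
# the prototype of class n is the mean of f(x_i) over that class's support
# samples, and a query x_test is assigned to the nearest prototype in Z.
import numpy as np

def embed(x):
    # placeholder for the learned embedding function f: X -> Z
    return np.asarray(x, dtype=float)

def class_prototypes(X_support, y_support):
    protos = {}
    for label in np.unique(y_support):
        protos[label] = embed(X_support[y_support == label]).mean(axis=0)
    return protos

def classify(x_query, protos):
    z = embed(x_query)
    # the similarity S is realized here as (negative) Euclidean distance in Z
    return min(protos, key=lambda label: np.linalg.norm(z - protos[label]))

X_support = np.array([[0.0, 0.1], [0.1, 0.0],   # class 0 support samples
                      [1.0, 0.9], [0.9, 1.1]])  # class 1 support samples
y_support = np.array([0, 0, 1, 1])
protos = class_prototypes(X_support, y_support)
print(classify(np.array([0.95, 1.0]), protos))  # -> 1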
“…Since x and z are real-valued random variables on the probability space, the expected values of x and z can be defined as the integrals of x and z, respectively. Therefore, (14) can be re-written as…”
Section: Generative Adversarial Network
confidence: 99%
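For context, and assuming that Eq. (14) in the citing paper is the standard GAN value function of Goodfellow et al., the rewriting step being described, replacing the expectations over x and z with integrals against their densities, would read:

\[
V(D,G) = \mathbb{E}_{x \sim p_{\mathrm{data}}}\big[\log D(x)\big] + \mathbb{E}_{z \sim p_{z}}\big[\log\big(1 - D(G(z))\big)\big]
       = \int_{x} p_{\mathrm{data}}(x)\,\log D(x)\,dx + \int_{z} p_{z}(z)\,\log\big(1 - D(G(z))\big)\,dz .
\]

This integral form is the step that leads, for a fixed generator, to the optimal discriminator D*(x) = p_data(x) / (p_data(x) + p_g(x)).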
“…Due to the fast training of ESNs, Long et al. [13] proposed a deep echo state network optimized by particle swarm optimization for fault diagnosis of a wind turbine gearbox. Hu et al. [14] presented an approach combining a deep Boltzmann machine and a multi-grained scanning forest to effectively deal with industrial fault diagnosis. Wang et al. [15] proposed a new deep neural network model based on a deep Boltzmann machine for condition prognosis.…”
Section: Introduction
confidence: 99%