2021
DOI: 10.1109/access.2021.3059459

Deep Clustering Bearing Fault Diagnosis Method Based on Local Manifold Learning of an Autoencoded Embedding

Abstract: In many practical fault diagnosis applications, acquiring fault data labels requires substantial manpower and resources, which is sometimes infeasible. To address this, an unsupervised bearing fault diagnosis method based on deep clustering is proposed. In this method, an autoencoder is first applied to the signal spectrum to learn an initial representation. The latent manifold of that representation is then searched, and a Gaussian mixture model is finally used for clustering. Experiments condu…
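
The abstract outlines a three-stage unsupervised pipeline: an autoencoder over the signal spectrum, a manifold search over the learned representation, and Gaussian mixture clustering. A minimal sketch of such a pipeline follows; the dense Keras architecture, the choice of UMAP as the manifold learner, and all parameter values are illustrative assumptions rather than the paper's reported setup.

# Hypothetical sketch: spectrum autoencoder -> manifold embedding -> GMM clustering.
# Architecture, manifold learner, and parameters are assumptions, not the paper's setup.
import umap
from sklearn.mixture import GaussianMixture
from tensorflow import keras

def cluster_spectra(spectra, n_clusters=4, latent_dim=10):
    """spectra: array of shape (n_samples, n_bins), magnitude spectra of vibration signals."""
    n_bins = spectra.shape[1]

    # 1) A dense autoencoder learns an initial low-dimensional representation.
    inputs = keras.Input(shape=(n_bins,))
    code = keras.layers.Dense(128, activation="relu")(inputs)
    code = keras.layers.Dense(latent_dim, activation="relu")(code)
    out = keras.layers.Dense(128, activation="relu")(code)
    out = keras.layers.Dense(n_bins, activation="linear")(out)
    autoencoder = keras.Model(inputs, out)
    encoder = keras.Model(inputs, code)
    autoencoder.compile(optimizer="adam", loss="mse")
    autoencoder.fit(spectra, spectra, epochs=50, batch_size=64, verbose=0)

    # 2) Manifold learning on the latent codes to expose local structure.
    latent = encoder.predict(spectra, verbose=0)
    embedding = umap.UMAP(n_components=2, n_neighbors=15).fit_transform(latent)

    # 3) A Gaussian mixture model assigns cluster (fault-class) labels.
    return GaussianMixture(n_components=n_clusters).fit_predict(embedding)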

Cited by 13 publications (8 citation statements).
References 26 publications (34 reference statements).
“…We opted for a manifold learning technique to re-embed the data and aimed to learn the entire embedded manifold to optimize the clustering. While the autoencoder we used was a good choice for learning a meaningful data representation, it did not consider the local structure [56]. By combining the autoencoder with a manifold learning technique that considered the local structure, we could enhance the quality of the representation in terms of clusterability.…”
Section: Methods
Mentioning confidence: 99%
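
The statement above describes re-embedding autoencoder codes with a manifold learner that preserves local structure to improve clusterability. A small illustrative sketch of that comparison is given here, assuming UMAP as the local-structure-preserving learner and the silhouette score as a rough clusterability proxy; the names encoder and spectra refer to the hypothetical pipeline sketched under the abstract.

# Illustrative sketch (not the cited study's code): compare clusterability of raw
# autoencoder codes with codes re-embedded by a local-structure-preserving learner.
import umap
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def clusterability(features, n_clusters=4):
    """Higher silhouette score = features separate more cleanly into clusters."""
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(features)
    return silhouette_score(features, labels)

# latent = encoder.predict(spectra)                               # autoencoder codes
# re_embedded = umap.UMAP(n_neighbors=15, min_dist=0.1).fit_transform(latent)
# print(clusterability(latent), clusterability(re_embedded))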
“…UMAP combines Riemannian geometry, algebraic topology, and machine learning techniques to find a low-dimensional data representation that retains its structure. Unlike dimensionality reduction algorithms such as t-distributed stochastic neighbor embedding (t-SNE) [58] and principal component analysis (PCA) [59], UMAP can preserve the data’s local and global structure, and the algorithm can be adjusted through various hyperparameters, giving users greater control over the process [56, 57]. Additionally, UMAP is less sensitive to changes in hyperparameters.…”
Section: Methods
Mentioning confidence: 99%
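
Because the statement emphasizes that UMAP exposes hyperparameters for controlling the embedding, a minimal usage sketch is added here; the values shown are common defaults, not the cited study's settings.

# Minimal UMAP usage sketch; parameter values are illustrative.
import umap

reducer = umap.UMAP(
    n_components=2,   # dimensionality of the output embedding
    n_neighbors=15,   # larger values emphasize global structure, smaller values local structure
    min_dist=0.1,     # smaller values pack similar points more tightly
    metric="euclidean",
)
# embedding = reducer.fit_transform(features)  # features: (n_samples, n_features)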
“…The fused feature vectors are used to construct machine health indicators. [107] proposed a simple deep clustering method, inspired by [43], which applied manifold learning to off-the-shelf embeddings to find an alternative model for clustering networks and simply combined the manifold learning…”
Section: E. Autoencoder
Mentioning confidence: 99%
“…In the study, clustering accuracy (ACC) [36] and normalized mutual information (NMI) [47] are used to evaluate the effectiveness of the proposed method without supervision. ACC and NMI are widely used evaluation indicators in machine learning that effectively measure the degree of agreement between the clustering results and the actual class labels.…”
Section: Evaluation Metrics
Mentioning confidence: 99%
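
Both metrics can be computed with standard tools: NMI is available directly in scikit-learn, while ACC first requires matching predicted cluster indices to true labels, commonly via the Hungarian algorithm. The sketch below follows the usual definitions and is not the paper's exact implementation.

# Sketch of the two unsupervised evaluation metrics mentioned above: ACC and NMI.
import numpy as np
from scipy.optimize import linear_sum_assignment
from sklearn.metrics import normalized_mutual_info_score

def clustering_accuracy(y_true, y_pred):
    """ACC: best one-to-one mapping of cluster indices to class labels (Hungarian algorithm)."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    n = int(max(y_true.max(), y_pred.max())) + 1
    count = np.zeros((n, n), dtype=np.int64)
    for t, p in zip(y_true, y_pred):
        count[p, t] += 1
    row_ind, col_ind = linear_sum_assignment(count.max() - count)  # maximize matched counts
    return count[row_ind, col_ind].sum() / y_true.size

# acc = clustering_accuracy(y_true, y_pred)
# nmi = normalized_mutual_info_score(y_true, y_pred)  # NMI needs no label matching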
“…Although much progress has been made in deep clustering research [43, 44, 45, 46], there are few applications in the field of bearing fault diagnosis. An et al. [47] proposed a deep clustering method based on autoencoded embedding and local manifold learning. Experiments on the CWRU bearing dataset showed that this method could find the optimal clustering manifold and achieved better clustering results than existing advanced baseline methods.…”
Section: Introduction
Mentioning confidence: 99%