Ensemble Clustering via Co-Association Matrix Self-Enhancement (2024)
DOI: 10.1109/tnnls.2023.3249207

Cited by 10 publications (2 citation statements)
References 27 publications
“…Therefore, we adopt adaptive weights to achieve it. Next, we introduce the CA matrix enhancement assumption A = C + E from the ECCMS model (Jia et al. 2023), which regards the observed CA matrix in real data as the sum of a pure CA matrix and a noise matrix. We embed it into the high-order model with the following formula:…”
Section: Our Proposed Methodology
confidence: 99%
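The A = C + E assumption quoted above operates on the co-association (CA) matrix, whose entries record how often a pair of samples is placed in the same cluster across the ensemble of base clusterings. Below is a minimal NumPy sketch of CA-matrix construction only; the toy partitions are illustrative, and the self-enhancement step that recovers the pure matrix C from the observed A is not reproduced here:

```python
import numpy as np

def co_association(labelings):
    """Build a co-association (CA) matrix from base clusterings.

    labelings: list of 1-D integer label arrays, one per base clustering.
    Entry A[i, j] is the fraction of base clusterings that assign
    samples i and j to the same cluster.
    """
    labelings = [np.asarray(l) for l in labelings]
    n = labelings[0].shape[0]
    A = np.zeros((n, n))
    for labels in labelings:
        # Pairwise same-cluster indicator for this base clustering.
        A += (labels[:, None] == labels[None, :]).astype(float)
    return A / len(labelings)

# Three toy base clusterings of four samples (illustrative only).
partitions = [
    np.array([0, 0, 1, 1]),
    np.array([0, 0, 0, 1]),
    np.array([0, 1, 1, 1]),
]
A = co_association(partitions)
```

By construction A is symmetric with a unit diagonal, and entries between 0 and 1; the ECCMS view treats this observed A as the pure consensus matrix C corrupted by a noise matrix E.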
“…The framework first applies the Density-Based Spatial Clustering of Applications with Noise (DBSCAN) algorithm [17], [18] and the Kernel Density Clustering (KDC) algorithm [19], [20] to split the data into clusters. The resulting clusters are then fed into the co-association matrix to check data similarity [21]–[24]. After that, the AML-CTP extracts a data sample with the same probability distribution as the original population.…”
Section: Introduction
confidence: 99%
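For reference on the density-based stage mentioned in that statement, here is a minimal pure-NumPy sketch of the classic DBSCAN procedure (not the cited framework's implementation; the toy data and the eps/min_pts settings are illustrative). Its output labels are exactly the kind of base partition that a co-association matrix is then built from:

```python
import numpy as np

def dbscan(X, eps, min_pts):
    """Minimal DBSCAN: return a cluster id per point, or -1 for noise."""
    n = len(X)
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    neighbors = [np.flatnonzero(dist[i] <= eps) for i in range(n)]
    labels = np.full(n, -1)
    visited = np.zeros(n, dtype=bool)
    cluster = 0
    for i in range(n):
        if visited[i] or len(neighbors[i]) < min_pts:
            continue  # only unvisited core points start a new cluster
        visited[i] = True
        labels[i] = cluster
        queue = list(neighbors[i])
        while queue:  # expand through density-reachable points
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster  # border or core point joins the cluster
            if not visited[j]:
                visited[j] = True
                if len(neighbors[j]) >= min_pts:
                    queue.extend(neighbors[j])  # j is core: keep growing
        cluster += 1
    return labels

# Two well-separated toy blobs; parameter values are illustrative.
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
              [5.0, 5.0], [5.1, 5.0], [5.0, 5.1]])
labels = dbscan(X, eps=0.5, min_pts=2)
```

With these settings every point is a core point, so the two blobs come out as two clusters with no noise points.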