Fast and Effective Active Clustering Ensemble Based on Density Peak
2021 | DOI: 10.1109/tnnls.2020.3015795

Cited by 19 publications (10 citation statements) | References 61 publications
“…The purpose of EC is to combine multiple base clusterings into a better and more robust consensus clustering [21], [22], [23], [24], [25], [26], [27], [28], [29]. In the final stage of our FastMICE approach, multiple base clusterings are fused into a unified clustering result, which can be viewed as an EC process.…”
Section: Ensemble Clustering
confidence: 99%
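Concretely, the most common way to realize this fusion step is through a co-association (evidence-accumulation) matrix. The sketch below is a minimal illustration of that idea, not the specific consensus function of the cited works, and the helper names are hypothetical.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def co_association(base_labels):
    """Co-association matrix: entry (i, j) is the fraction of base
    clusterings that put samples i and j in the same cluster."""
    base_labels = np.asarray(base_labels)   # shape (m, n): m clusterings of n samples
    m, _ = base_labels.shape
    ca = sum((labels[:, None] == labels[None, :]).astype(float)
             for labels in base_labels)
    return ca / m

def consensus(base_labels, k):
    """Fuse base clusterings: average-linkage clustering on 1 - co-association."""
    dist = squareform(1.0 - co_association(base_labels), checks=False)
    return fcluster(linkage(dist, method="average"), t=k, criterion="maxclust")
```

Any clustering of the co-association distances works at the last step; average linkage is just a common default.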
“…Despite the progress of these EC works [21], [22], [23], [24], [25], [26], [27], [28], [29], most of them are devised for single-view datasets and lack consideration of multiview scenarios. Recently, Tao et al. [31] proposed a multiview ensemble clustering (MVEC) method, which learns a consensus clustering from the multiple co-association matrices built in multiple views with low-rank and sparse constraints.…”
Section: Ensemble Clustering
confidence: 99%
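For the multiview setting this quote describes, the plainest baseline is one co-association matrix per view, averaged. The sketch below deliberately omits the low-rank and sparse constraints that distinguish the MVEC method of Tao et al. [31], so it illustrates only the input structure, not their optimization.

```python
import numpy as np

def multiview_co_association(per_view_base_labels):
    """Average per-view co-association matrices over all views.
    per_view_base_labels: list of (m_v, n) label arrays, one per view."""
    mats = []
    for base_labels in per_view_base_labels:
        base_labels = np.asarray(base_labels)
        m, _ = base_labels.shape
        ca = sum((labels[:, None] == labels[None, :]).astype(float)
                 for labels in base_labels)
        mats.append(ca / m)
    return np.mean(mats, axis=0)   # MVEC would instead learn this fusion under constraints
```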
“…Wang et al. proposed an active learning method based on density clustering (ALEC) that takes the structure of the data into account and selects the most representative instances [25]. Shi et al. then proposed an active density peak (ADP) clustering algorithm that considers both representativeness and informativeness, querying informative instances to reduce the uncertainty of the clustering results [26]. These algorithms rely on clustering to ensure a minimum distance between any two queried samples.…”
Section: Active
confidence: 99%
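Both methods build on the density-peak score γ_i = ρ_i · δ_i (local density times distance to the nearest denser point), with high-γ points serving as representative query candidates. A minimal sketch of that score follows; the cutoff dc and the greedy selection rule are simplifying assumptions, not the cited ALEC/ADP procedures.

```python
import numpy as np
from scipy.spatial.distance import cdist

def density_peak_scores(X, dc):
    """gamma_i = rho_i * delta_i: rho_i counts neighbors within cutoff dc,
    delta_i is the distance to the nearest point of strictly higher density
    (for the densest point, the maximum distance to any other point)."""
    d = cdist(X, X)
    rho = (d < dc).sum(axis=1) - 1            # subtract 1 to exclude self
    delta = np.empty(len(X))
    for i in range(len(X)):
        higher = np.flatnonzero(rho > rho[i])
        delta[i] = d[i, higher].min() if higher.size else d[i].max()
    return rho * delta

# Greedy label-budget spending: query the most representative points first.
# queries = np.argsort(-density_peak_scores(X, dc))[:budget]
```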
“…In Equation (25), the learning rate ρ^(m) is sampled from N(0, I) and the symbol * denotes the element-wise multiplication operator. μ_z and σ_z are learned by the GCN formulated in Equation (14).…”
Section: Learning Algorithm
confidence: 99%
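Read this way, the quoted Equation (25) is the standard reparameterization trick, z = μ_z + σ_z * ρ with ρ ~ N(0, I). The following is a minimal sketch under that assumption; the function name is hypothetical, and the GCN producing μ_z and σ_z is out of scope here.

```python
import numpy as np

def reparameterize(mu_z, sigma_z, rng=None):
    """z = mu_z + sigma_z * rho with rho ~ N(0, I), * element-wise,
    matching the reparameterization reading of the quoted Equation (25);
    mu_z and sigma_z would be the GCN outputs of Equation (14)."""
    rng = np.random.default_rng() if rng is None else rng
    rho = rng.standard_normal(np.shape(mu_z))
    return mu_z + sigma_z * rho
```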
“…The base clustering members can be produced in many ways, such as by using different clustering algorithms, employing one clustering algorithm with different parameters, and utilizing subsets of data or features [13]. The typical methods used to design a consensus function include a relabeling strategy [14], feature-based methods [15,16], pairwise-similarity-based algorithms [17,18], and graph-based approaches [19,20]. In general, three ensemble-information matrices [11] can be acquired from the base clustering results: the label-assignment matrix, pairwise similarity matrix, and binary cluster association matrix.…”
Section: Introduction
confidence: 99%
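The three ensemble-information matrices named in the quote can all be derived mechanically from the base clustering labels; the construction below is a generic sketch, not the exact formulation of [11].

```python
import numpy as np

def ensemble_information_matrices(base_labels):
    """From m base clusterings of n samples, build:
    - label-assignment matrix, shape (n, m): one label column per clustering;
    - pairwise similarity matrix, shape (n, n): co-association frequencies;
    - binary cluster-association matrix, shape (n, total #clusters): one-hot
      cluster-membership blocks, one block per base clustering."""
    base_labels = np.asarray(base_labels)     # shape (m, n)
    m, n = base_labels.shape

    label_assignment = base_labels.T

    similarity = sum((labels[:, None] == labels[None, :]).astype(float)
                     for labels in base_labels) / m

    blocks = [(labels[:, None] == np.unique(labels)[None, :]).astype(int)
              for labels in base_labels]
    binary_association = np.hstack(blocks)

    return label_assignment, similarity, binary_association
```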