2004
DOI: 10.1007/978-3-540-30116-5_9

Combining Multiple Clustering Systems

Abstract: Three methods for combining multiple clustering systems are presented and evaluated, focusing on the problem of finding the correspondence between clusters of different systems. In this work, the clusters of individual systems are represented in a common space and their correspondence estimated by either "clustering clusters" or with Singular Value Decomposition. The approaches are evaluated for the task of topic discovery on three major corpora and eight different clustering algorithms and it is sho…
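
The abstract's core idea, representing every cluster of every base system in a common space and then matching clusters across systems, can be illustrated with a short sketch. What follows is a minimal reading under stated assumptions, not the paper's exact procedure: the common space is taken to be each cluster's membership vector over the shared data points, k-means stands in for the "clustering clusters" variant, a plain SVD stands in for the decomposition variant, and all function names are invented for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_membership_matrix(labels, n_items):
    """One row per cluster of this base system, one column per data point."""
    clusters = np.unique(labels)
    M = np.zeros((len(clusters), n_items))
    for row, c in enumerate(clusters):
        M[row, labels == c] = 1.0
    return M

def align_by_clustering_clusters(base_labelings, n_meta_clusters):
    """'Clustering clusters': stack every cluster of every base system as a
    point in the shared item space and group them with k-means; clusters that
    land in the same meta-cluster are treated as corresponding."""
    n_items = len(base_labelings[0])
    stacked = np.vstack([cluster_membership_matrix(l, n_items) for l in base_labelings])
    # Row-normalise so large clusters do not dominate the distances.
    stacked = stacked / np.maximum(stacked.sum(axis=1, keepdims=True), 1.0)
    meta = KMeans(n_clusters=n_meta_clusters, n_init=10, random_state=0).fit(stacked)
    return meta.labels_          # meta-cluster id per base cluster, in stacking order

def align_by_svd(base_labelings, rank):
    """SVD variant: a low-rank decomposition of the stacked cluster-by-item
    matrix gives every base cluster coordinates in one common space."""
    n_items = len(base_labelings[0])
    stacked = np.vstack([cluster_membership_matrix(l, n_items) for l in base_labelings])
    U, s, _ = np.linalg.svd(stacked, full_matrices=False)
    return U[:, :rank] * s[:rank]   # low-dimensional embedding of each cluster

# Toy usage: three base clusterings of the same six items.
base = [np.array([0, 0, 1, 1, 2, 2]),
        np.array([1, 1, 0, 0, 2, 2]),
        np.array([0, 0, 0, 1, 1, 1])]
print(align_by_clustering_clusters(base, n_meta_clusters=3))
print(align_by_svd(base, rank=2).shape)
```

Clusters from different base systems that fall into the same meta-cluster, or that sit close together in the SVD embedding, are then treated as corresponding, which is the correspondence problem the abstract refers to.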

Cited by 51 publications (31 citation statements)
References 16 publications

“…Each base clustering provides a cluster label as a new feature describing each data point [Figure 2(b)], which is utilised to formulate the final solution (Boulis and Ostendorf, 2004; Nguyen and Caruana, 2007; Topchy et al., 2005). For instance, the technique of Boulis and Ostendorf (2004) makes use of linear programming to find a correspondence between the labels of base clusterings and those of the optimal final-clustering. In addition, the aggregation of multiple clustering results has been considered as a maximum likelihood estimation problem, and EM algorithms (Nguyen and Caruana, 2007; Topchy et al., 2004, 2005) have been proposed for finding the consensus clustering.…”
Section: Consensus Functions (mentioning)
confidence: 99%
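
The correspondence step described in this citation can be pictured with a small sketch. Boulis and Ostendorf formulate it as a linear program; the code below substitutes a plain linear assignment between two labelings, solved with scipy.optimize.linear_sum_assignment, which captures the one-to-one matching idea but is not the authors' exact formulation, and the helper name is hypothetical.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def relabel_to_reference(labels, reference):
    """Map one clustering's labels onto a reference labeling by maximising the
    overlap between matched clusters (linear assignment, a restricted stand-in
    for the LP-style correspondence discussed above)."""
    k = int(max(labels.max(), reference.max())) + 1
    # Contingency table: overlap[i, j] = number of points with label i and reference label j.
    overlap = np.zeros((k, k), dtype=int)
    for l, r in zip(labels, reference):
        overlap[l, r] += 1
    rows, cols = linear_sum_assignment(-overlap)   # negate to maximise total overlap
    mapping = dict(zip(rows, cols))
    return np.array([mapping[l] for l in labels])

# Toy usage: the second clustering uses permuted label names for the same partition.
ref   = np.array([0, 0, 1, 1, 2, 2])
other = np.array([2, 2, 0, 0, 1, 1])
print(relabel_to_reference(other, ref))   # -> [0 0 1 1 2 2]
```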
“…Clustering aggregation has been previously considered under a variety of names (consensus clustering, clustering ensemble, clustering combination) in a variety of different areas: machine learning [19,12], pattern recognition [14], bio-informatics [13], and data mining [21,5]. The problem of correlation clustering is interesting in its own right, and it has recently attracted a lot of attention in the theoretical computer-science community [2,6,8,10].…”
Section: Figure 1. An Example of Clustering Aggregation (mentioning)
confidence: 99%
“…They propose information-theoretic distance measures and genetic algorithms for finding the best aggregation solution. Boulis and Ostendorf [5] use Linear Programming to discover a correspondence between the labels of the individual clusterings and those of an "optimal" meta-clustering. Topchy et al. [21] define clustering aggregation as a maximum likelihood estimation problem, and they propose an EM algorithm for finding the consensus clustering.…”
Section: Related Work (mentioning)
confidence: 99%
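
The maximum-likelihood view attributed to Topchy et al. can likewise be sketched: treat each data point's vector of base-cluster labels as categorical features and fit a mixture of per-feature categorical distributions with EM, reading the consensus label off as the most responsible component. This is a simplified illustration under those assumptions, not the published algorithm, and em_consensus is an invented name.

```python
import numpy as np

def em_consensus(label_matrix, n_consensus, n_iter=50, seed=0, eps=1e-6):
    """Consensus clustering by EM on a mixture of categorical distributions
    over base-clustering labels (a simplified sketch of the maximum-likelihood
    formulation cited above)."""
    rng = np.random.default_rng(seed)
    n, h = label_matrix.shape                    # data points x base clusterings
    k = int(label_matrix.max()) + 1              # assumes labels are 0..k-1
    pi = np.full(n_consensus, 1.0 / n_consensus)              # mixing weights
    theta = rng.dirichlet(np.ones(k), size=(n_consensus, h))  # per-component categoricals

    for _ in range(n_iter):
        # E-step: responsibility of each consensus cluster for each point.
        log_r = np.tile(np.log(pi), (n, 1))
        for j in range(h):
            log_r += np.log(theta[:, j][:, label_matrix[:, j]] + eps).T
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)

        # M-step: re-estimate mixing weights and categorical parameters.
        pi = r.mean(axis=0)
        for m in range(n_consensus):
            for j in range(h):
                counts = np.bincount(label_matrix[:, j], weights=r[:, m], minlength=k)
                theta[m, j] = (counts + eps) / (counts.sum() + k * eps)

    return r.argmax(axis=1)    # consensus label per data point

# Toy usage: three noisy base clusterings of eight items; the second uses
# swapped label names, which the mixture handles without explicit relabelling.
base = np.array([[0, 0, 0, 0, 1, 1, 1, 1],
                 [1, 1, 1, 0, 0, 0, 0, 0],
                 [0, 0, 0, 0, 1, 1, 0, 1]]).T
print(em_consensus(base, n_consensus=2))
```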