2011
DOI: 10.1016/j.ins.2011.04.013
Minimum spanning tree based split-and-merge: A hierarchical clustering method

Cited by 97 publications (46 citation statements)
References 44 publications (50 reference statements)
“…Alternative clustering algorithms, such as the divisive method of [28], may allow more diversely shaped clusters of documents to be found more easily, while different methods of merging or splitting clusters during the construction of the hierarchy can significantly affect the final hierarchy [29]. As our focus was on the associative network, not the effect of different clustering algorithms, examining this effect is left as future work.…”
Section: Discussion
confidence: 99%
“…First, the given dataset X is divided into k partitions using the K-means algorithm, where k is set to √n [7]. Then, the local neighborhood graph G LN is obtained by considering the points in the neighboring partitions.…”
Section: MST-based Clustering on K-means Graph
confidence: 99%
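The partitioning step quoted above can be sketched in plain Python: run K-means with k = √n over the dataset. The `kmeans` helper below is a minimal Lloyd-style illustration written for this note, not the cited implementation, and the subsequent local neighborhood graph G LN is omitted.

```python
import math
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal Lloyd's k-means on 2-D points (illustrative sketch only)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assignment step: each point joins its nearest center's partition.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: (p[0] - centers[c][0]) ** 2
                                      + (p[1] - centers[c][1]) ** 2)
            clusters[nearest].append(p)
        # Update step: move each center to its partition's mean
        # (keep the old center if a partition went empty).
        centers = [(sum(p[0] for p in cl) / len(cl),
                    sum(p[1] for p in cl) / len(cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return clusters

rng = random.Random(42)
points = [(rng.random(), rng.random()) for _ in range(100)]
k = int(math.sqrt(len(points)))   # k = sqrt(n), as in the quoted scheme
partitions = kmeans(points, k)
```

With n = 100 points this yields k = 10 partitions; the neighboring partitions would then be inspected to assemble the local neighborhood graph.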
“…In order to demonstrate that the local neighborhood graphs KMLNG and BMLNG will not miss any information that is significant for clustering, we test the performance of the clustering results on the approximate MST obtained from KMLNG and BMLNG using Zahn's clustering algorithm [3]. The results are validated using external quality indices such as Rand, FM, Jaccard and Adjusted Rand [7]. Table 3 shows the quality indices of clustering on various datasets.…”
Section: Algorithm 2 Bi-means Algorithm
confidence: 99%
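Of the external indices named above, the Rand index is the simplest: the fraction of point pairs on which two clusterings agree (either both place the pair in the same cluster, or both separate it). A pure-Python pairwise sketch, written for this note rather than taken from the cited validation code:

```python
from itertools import combinations

def rand_index(labels_a, labels_b):
    """Fraction of point pairs on which two labelings agree
    (same-cluster in both, or different-cluster in both)."""
    agree = total = 0
    for i, j in combinations(range(len(labels_a)), 2):
        same_a = labels_a[i] == labels_a[j]
        same_b = labels_b[i] == labels_b[j]
        agree += (same_a == same_b)
        total += 1
    return agree / total

# Identical partitions agree on every pair, even with permuted labels.
print(rand_index([0, 0, 1, 1], [1, 1, 0, 0]))  # → 1.0
```

The Adjusted Rand index additionally corrects this score for chance agreement, which is why it is usually preferred for comparing clusterings with different numbers of clusters.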
“…This is the case of agglomerative hierarchical clustering methods [22,25,34]. These algorithms work in a bottom-up fashion, starting out by considering each instance as a cluster, and iteratively combining the two most similar clusters in terms of a similarity function.…”
Section: Introduction
confidence: 99%
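The bottom-up procedure described in that passage can be sketched directly: start with one singleton cluster per instance and repeatedly merge the closest pair. The example below uses single-linkage distance on 1-D points purely for illustration; it is one possible similarity function, not the specific choice of the cited methods.

```python
def agglomerative(points, target_k):
    """Bottom-up merging: begin with singletons, repeatedly merge the
    closest pair of clusters until target_k clusters remain."""
    clusters = [[p] for p in points]

    def dist(a, b):
        # Single-linkage: distance between the closest members (1-D points).
        return min(abs(x - y) for x in a for y in b)

    while len(clusters) > target_k:
        # Find the pair of clusters with minimum linkage distance.
        i, j = min(((i, j)
                    for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda ij: dist(clusters[ij[0]], clusters[ij[1]]))
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return clusters

result = agglomerative([1.0, 1.1, 5.0, 5.2, 9.0], 2)
```

Here the two tight pairs merge first, and the remaining merges are driven entirely by the linkage function; swapping single linkage for complete or average linkage changes which clusters combine and hence the final hierarchy.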