1995
DOI: 10.1016/0167-8191(95)00017-i
Parallel algorithms for hierarchical clustering

Cited by 334 publications (171 citation statements)
References 19 publications (4 reference statements)
“…One is the hierarchical method, which re-partitions the set until a stopping condition is met, or an agglomeration process which begins by considering each point as a cluster and then merging close clusters until a stopping condition is met [11,15,18,23,24]. The second is the k-means heuristic, where the mean of a cluster is the average of the cluster's points.…”
Section: Clusteringmentioning
confidence: 99%
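The agglomerative process described in the quote can be illustrated with a minimal sketch: each point starts as its own cluster, and the closest pair of clusters is merged until a stopping condition holds. The single-linkage distance, 1-D points, and a target cluster count `k` are assumptions for illustration only; they are not taken from the cited papers.

```python
# Minimal sketch of agglomerative (bottom-up) hierarchical clustering.
# Assumptions: 1-D points, single-linkage distance, stop at k clusters.

def agglomerative(points, k):
    clusters = [[p] for p in points]          # each point begins as a cluster
    while len(clusters) > k:                  # stopping condition: k clusters remain
        best = None                           # (distance, i, j) of closest pair
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # single linkage: distance between the two closest members
                d = min(abs(a - b) for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] = clusters[i] + clusters[j]   # merge the closest pair
        del clusters[j]
    return clusters
```

For example, `agglomerative([1.0, 1.1, 5.0, 5.2], 2)` merges the two tight groups, yielding the clusters `[1.0, 1.1]` and `[5.0, 5.2]`. The naive all-pairs search here costs O(n^3) overall; the papers surveyed on this page study how to organize and parallelize exactly this closest-pair work more efficiently.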
“…A clustering method called BIRCH is presented in [28]. It uses the hierarchical clustering algorithm from [18]. BIRCH works well for compact, well-separated clusters; it does not work well when clusters have different sizes or are connected, even by outliers.…”
Section: Clustering With Outliersmentioning
confidence: 99%
“…These algorithms deal today with a vast amount of input, as introduced for instance by high throughput experiments in biology [10] and by data mining of the world wide web [12]. Throughout the years, extensive research has been done dealing with efficient implementations for such algorithms [19,15,4,9,7,17,6,1]. In this paper we focus on a common class of hierarchical clustering algorithms, which we call Globally Closest Pair (or GCP) clustering algorithms.…”
Section: Introductionmentioning
confidence: 99%
“…The parallelization of MSTs dates back as far as 1980 [10]. Olson [100] achieved a time complexity of O(V log V) with V / log V processors, with V being the number of vertices of the graph, and Johnson and Metaxas [71] one of O(log^{3/2} V) with V + E processors.…”
Section: Parallelizationmentioning
confidence: 99%
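The parallel MST algorithms referenced in the quote typically build on Borůvka-style phases, in which every component independently selects its cheapest outgoing edge and all selected edges are merged at once. A minimal sequential sketch of that phase structure, assuming an edge-list graph representation chosen purely for illustration:

```python
# Sequential sketch of Boruvka's algorithm. Each phase finds the cheapest
# outgoing edge of every component and contracts along those edges; because
# the per-component selections are independent, each phase parallelizes
# naturally, which is what parallel MST algorithms exploit.

def boruvka_mst(n, edges):
    """n vertices 0..n-1; edges = [(weight, u, v), ...]. Returns the MST edge set."""
    parent = list(range(n))                   # union-find forest over components

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]     # path halving
            x = parent[x]
        return x

    mst = set()
    while len(mst) < n - 1:
        cheapest = {}                         # component root -> cheapest outgoing edge
        for w, u, v in edges:
            ru, rv = find(u), find(v)
            if ru == rv:
                continue                      # edge is internal to a component
            for r in (ru, rv):
                if r not in cheapest or w < cheapest[r][0]:
                    cheapest[r] = (w, u, v)
        if not cheapest:
            break                             # graph is disconnected; stop
        for w, u, v in cheapest.values():     # contract along all selected edges
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv
                mst.add((w, u, v))
    return mst
```

On a path-plus-chord graph such as `boruvka_mst(4, [(1, 0, 1), (2, 1, 2), (3, 2, 3), (4, 0, 3)])`, the heaviest edge is dropped and the three lighter edges form the tree. The number of phases is O(log V), since each phase at least halves the component count, which underlies the polylogarithmic parallel bounds cited above.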