2010
DOI: 10.1145/1824777.1824779

Clustering for metric and nonmetric distance measures

Abstract: We study a generalization of the k-median problem with respect to an arbitrary dissimilarity measure D. Given a finite set P of size n, our goal is to find a set C of size k such that the sum of errors D(P,C) = ∑_{p∈P} min_{c∈C} D(p,c) is minimized. …
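As a concrete reading of the objective, here is a minimal sketch (Python; all names are illustrative, not from the paper) of the sum-of-errors cost for an arbitrary dissimilarity measure, which deliberately assumes none of the metric axioms:

from typing import Callable, Sequence, TypeVar

T = TypeVar("T")

def clustering_cost(
    points: Sequence[T],
    centers: Sequence[T],
    dissimilarity: Callable[[T, T], float],
) -> float:
    """Sum-of-errors objective D(P, C) = sum over p in P of min over c in C of D(p, c).

    The dissimilarity need not be symmetric or satisfy the triangle
    inequality, matching the paper's general setting.
    """
    return sum(min(dissimilarity(p, c) for c in centers) for p in points)

# Example with the squared Euclidean distance, a non-metric dissimilarity:
pts = [0.0, 1.0, 2.0, 10.0, 11.0]
ctrs = [1.0, 10.5]
print(clustering_cost(pts, ctrs, lambda p, c: (p - c) ** 2))  # 2.5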

Citation types: 3 supporting, 242 mentioning, 0 contrasting
Years published: 2012–2020
Cited by 89 publications (245 citation statements)
References 27 publications
“…However, non-metric distances lead to inconsistency and conflict when the measure violates one or more metric axioms, as in Refs. 3,4. So it is necessary to change this metric distance.…”
Section: Derivation of IEM (mentioning)
confidence: 99%
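To see the kind of axiom violation this statement refers to, a small numeric sketch (Python; the distributions are made-up illustrative values) with the Kullback-Leibler divergence, a widely used non-metric dissimilarity: it is asymmetric and can violate the triangle inequality.

import math

def kl(p, q):
    """Kullback-Leibler divergence between two discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.9, 0.1]
q = [0.5, 0.5]
r = [0.1, 0.9]

print(kl(p, q), kl(q, p))             # ~0.368 vs ~0.511: not symmetric
print(kl(p, r), kl(p, q) + kl(q, r))  # ~1.758 > ~0.879: triangle inequality fails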
“…In these applications, a notion of similarity is induced by computing kernel functions on arbitrary training-sample pairs in input space. However, many similarity measures are developed on the basis of metric distances; in high-dimensional settings, samples in the RKHS often violate one or more metric axioms 3,4, which may impair the performance of machine learning algorithms. Therefore, how to choose a "good" similarity measure is one of the key concerns of these algorithms.…”
Section: Introduction (mentioning)
confidence: 99%
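For context on the RKHS point above, the distance induced by a positive-definite kernel can be evaluated without an explicit feature map via ||φ(x) − φ(y)||² = k(x,x) − 2k(x,y) + k(y,y); a minimal sketch (Python; function names are illustrative):

import math

def rbf_kernel(x, y, gamma=1.0):
    """Gaussian (RBF) kernel between two real vectors."""
    sq = sum((xi - yi) ** 2 for xi, yi in zip(x, y))
    return math.exp(-gamma * sq)

def rkhs_distance(x, y, kernel):
    """Distance between the images of x and y in the RKHS, via the kernel trick."""
    sq = kernel(x, x) - 2.0 * kernel(x, y) + kernel(y, y)
    return math.sqrt(max(sq, 0.0))  # clamp tiny negatives from rounding

print(rkhs_distance([0.0, 0.0], [1.0, 1.0], rbf_kernel))  # ~1.315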
“…Previously, approximation bounds for Bregman clustering algorithms have been given by [CM08] and [ABS08]. Chaudhuri and McGregor [CM08] consider the KL divergence, which is a particularly interesting case, as the KL divergence between two members of the same exponential family is a Bregman divergence between their natural parameters [BMDG05].…”
Section: Introduction (mentioning)
confidence: 99%
“…Chaudhuri and McGregor [CM08] consider the KL divergence, which is a particularly interesting case, as the KL divergence between two members of the same exponential family is a Bregman divergence between their natural parameters [BMDG05]. Ackermann et al. [ABS08] consider a statistically defined class of distortion measures which includes the KL divergence and other Bregman divergences. In both cases, the algorithms achieve a (1 + ε)-approximation for arbitrary ε > 0.…”
Section: Introduction (mentioning)
confidence: 99%
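The [BMDG05] correspondence quoted above can be checked numerically for a concrete exponential family. A sketch (Python; helper names are illustrative) using the Bernoulli family, whose log-partition function is ψ(θ) = log(1 + e^θ): the KL divergence KL(p_θ1 || p_θ2) equals the Bregman divergence B_ψ(θ2, θ1) between the natural parameters, with the arguments swapped.

import math

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

def log_partition(theta):
    """Log-partition function psi(theta) = log(1 + e^theta) of the Bernoulli family."""
    return math.log1p(math.exp(theta))

def bregman(psi, grad_psi, a, b):
    """Bregman divergence B_psi(a, b) = psi(a) - psi(b) - psi'(b) * (a - b)."""
    return psi(a) - psi(b) - grad_psi(b) * (a - b)

def kl_bernoulli(mu1, mu2):
    """KL divergence between Bernoulli(mu1) and Bernoulli(mu2)."""
    return mu1 * math.log(mu1 / mu2) + (1 - mu1) * math.log((1 - mu1) / (1 - mu2))

mu1, mu2 = 0.3, 0.7
theta1 = math.log(mu1 / (1 - mu1))  # natural parameter of Bernoulli(mu1)
theta2 = math.log(mu2 / (1 - mu2))

print(kl_bernoulli(mu1, mu2))                           # ~0.3388
print(bregman(log_partition, sigmoid, theta2, theta1))  # same value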