2008
DOI: 10.1016/j.patcog.2007.04.010
Robust path-based spectral clustering

Cited by 499 publications (234 citation statements).
References 13 publications.
“…Artificial datasets 1, 2, and 3 are selected. Dataset 1 is from [32], which consists of clusters with both convex and nonconvex shapes in a hierarchical structure; dataset 2 [33] consists of 3 nonconvex shape clusters; dataset 3 is a synthetic dataset consisting of 2 isotropic Gaussian blobs and 2 interleaving half-circles. By choosing an appropriate k ∈ [3, 7], the intuitive visualization of hierarchical structure and clustering results is shown in Figure 1.…”
Section: Experiments, Results and Discussion (mentioning)
confidence: 99%
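The synthetic data described in this excerpt can be reproduced with standard generators. The following is a minimal sketch, assuming scikit-learn's make_blobs and make_moons; the sample counts, noise level, and blob placement are illustrative choices, not the parameters used in the cited work.

```python
# Sketch of a dataset like "dataset 3" above: 2 isotropic Gaussian blobs
# plus 2 interleaving half-circles. All parameter values are illustrative
# assumptions, not those of the cited paper.
import numpy as np
from sklearn.datasets import make_blobs, make_moons

def make_dataset3(n_blob=200, n_moon=200, seed=0):
    # Two isotropic Gaussian blobs at fixed centers.
    X_blobs, y_blobs = make_blobs(
        n_samples=n_blob, centers=[[-6.0, 0.0], [-2.0, 4.0]],
        cluster_std=0.5, random_state=seed)
    # Two interleaving half-circles.
    X_moons, y_moons = make_moons(n_samples=n_moon, noise=0.05, random_state=seed)
    # Offset the half-circles so the four clusters do not overlap.
    X_moons = X_moons + np.array([5.0, 0.0])
    X = np.vstack([X_blobs, X_moons])
    y = np.concatenate([y_blobs, y_moons + 2])  # labels 0-3
    return X, y

X, y = make_dataset3()
print(X.shape, np.unique(y))  # (400, 2) [0 1 2 3]
```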
“…We created an interlocked spirals dataset, shown in Fig. 2, which is considered to be a challenging benchmark for spectral clustering [22]. To alleviate this challenging clustering Fig.…”
Section: Results of Comparison to Kernel-PCA (mentioning)
confidence: 99%
“…Segmentation of asymmetric data with binary k-means in SC (a) and SCE (c) constructed feature spaces. An interlocked two spirals dataset is considered as a challenging benchmark for SC [22]. As can be seen in (c), SCE results in a better intercluster separation in the feature space.…”
Section: Results of Comparison to Kernel-PCA (mentioning)
confidence: 99%
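Both excerpts above treat an interlocked two-spirals dataset as a hard case for spectral clustering (SC). The sketch below generates such a dataset and clusters it with an off-the-shelf spectral clustering routine; the spiral formula, noise level, and the choice of a k-nearest-neighbor affinity are assumptions for illustration, not the cited authors' setup.

```python
# Illustrative two-spirals benchmark and a standard spectral-clustering run.
import numpy as np
from sklearn.cluster import SpectralClustering

def two_spirals(n_per_spiral=300, noise=0.03, seed=0):
    rng = np.random.default_rng(seed)
    # Angles drawn so points spread along each spiral arm.
    t = np.sqrt(rng.uniform(0.25, 1.0, n_per_spiral)) * 3 * np.pi
    spiral1 = np.stack([t * np.cos(t), t * np.sin(t)], axis=1)
    spiral2 = -spiral1  # second arm, rotated by 180 degrees
    X = np.vstack([spiral1, spiral2]) + rng.normal(0.0, noise, (2 * n_per_spiral, 2))
    y = np.repeat([0, 1], n_per_spiral)
    return X, y

X, y = two_spirals()
# A k-nearest-neighbor affinity graph is the usual way to make the spirals
# separable; a single wide Gaussian kernel tends to connect the two arms,
# which is part of why this dataset is treated as a challenging benchmark.
labels = SpectralClustering(n_clusters=2, affinity="nearest_neighbors",
                            n_neighbors=10, random_state=0).fit_predict(X)
```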
“…A comprehensive review can be found in the papers [3], [4]. Unlike classical partitioning clustering algorithms, spectral clustering produces better clustering results on data sets with highly nonlinear and elongated clusters [3], [5], such as circle or stick distributions. There are some well-known spectral clustering algorithms, such as the Shi and Malik algorithm [1], the Ng, Jordan and Weiss (NJW) algorithm [2], and others [5]–[8].…”
Section: Introduction (mentioning)
confidence: 99%
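The Ng, Jordan and Weiss (NJW) algorithm named in this excerpt follows a well-known recipe: Gaussian affinity with zero diagonal, symmetric normalization of the affinity matrix, the top-k eigenvectors as an embedding, row normalization, and k-means on the embedded points. A compact sketch is given below; sigma and k are user-chosen parameters here, not values taken from the cited papers.

```python
# Compact sketch of the NJW spectral clustering procedure.
import numpy as np
from scipy.spatial.distance import cdist
from sklearn.cluster import KMeans

def njw_spectral_clustering(X, k, sigma=1.0, seed=0):
    # Gaussian affinity matrix with zero diagonal, as in the NJW formulation.
    W = np.exp(-cdist(X, X, "sqeuclidean") / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    # Symmetrically normalized affinity: L = D^{-1/2} W D^{-1/2}.
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d + 1e-12))
    L = D_inv_sqrt @ W @ D_inv_sqrt
    # Top-k eigenvectors of L (eigh returns eigenvalues in ascending order).
    _, eigvecs = np.linalg.eigh(L)
    U = eigvecs[:, -k:]
    # Normalize each row to unit length, then run k-means on the embedding.
    U = U / (np.linalg.norm(U, axis=1, keepdims=True) + 1e-12)
    return KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(U)
```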