2019 International Conference on Machine Learning, Big Data, Cloud and Parallel Computing (COMITCon) 2019
DOI: 10.1109/comitcon.2019.8862199
Clustering Validation of CLARA and K-Means Using Silhouette & DUNN Measures on Iris Dataset

Cited by 35 publications (23 citation statements)
References 4 publications
“…Figures 2–7 show the wine dataset clustering for the algorithms k-means [20], nested mini-batch k-means [16], birch [26], mini-batch k-means [17], agglomerative clustering [15], and DPCGS (our method), respectively. Figures 8–13 show the wheat seeds dataset clustering for the algorithms k-means, nested mini-batch k-means, birch [26], mini-batch k-means, agglomerative clustering, and DPCGS (our method), respectively. Figures 14–20 show the iris dataset clustering for the algorithms k-means, nested mini-batch k-means, birch, mini-batch k-means, agglomerative clustering, affinity propagation [3], and DPCGS (our method), respectively.…”
Section: Results and Detailed Analysis
mentioning
confidence: 99%
“…However, if the two algorithms' scores are not comparable under the internal evaluation tests, those tests alone are not enough to judge which algorithm is better. As an internal indicator for estimating the clustering outcome, we use the Silhouette coefficient [10], which compares, for each data point, the average distance to the other points in the same cluster with the average distance to the points in the nearest other cluster. To verify the algorithm's validity, the external evaluation, considered the gold standard for the testing process, uses external ground-truth data.…”
Section: Introduction
mentioning
confidence: 99%
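The internal evaluation this excerpt describes can be sketched in a few lines; this is a minimal illustration assuming scikit-learn and the Iris dataset (the dataset of the cited paper), not the cited authors' exact code.

```python
# Minimal sketch: Silhouette coefficient as an internal clustering index.
# Assumes scikit-learn; k=3 and the Iris dataset are illustrative choices.
from sklearn.datasets import load_iris
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

X = load_iris().data
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# silhouette_score averages (b - a) / max(a, b) over all points, where a is
# the mean intra-cluster distance and b the mean distance to the nearest
# other cluster; values close to 1 indicate well-separated clusters.
score = silhouette_score(X, labels)
print(round(score, 3))
```

An external evaluation (the "gold standard" the excerpt mentions) would instead compare `labels` against the known Iris species, e.g. with an adjusted Rand index.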
“…10, it can be concluded that k = 3 is a good choice; however, the elbow method does not always provide accurate results, so another clustering metric has to be checked to confirm the obtained result. To give intuition about k, the silhouette analysis has been implemented [21]. This method is used to determine the degree of separation between clusters [21].…”
Section: The Unsupervised Learning and Clustering Results
mentioning
confidence: 99%
“…To give intuition about k, the silhouette analysis has been implemented [21]. This method is used to determine the degree of separation between clusters [21]. It is generally implemented by calculating, for each data point, the average distance d_i to all points in the same cluster and the average distance c_i to the points in the closest other cluster.…”
Section: The Unsupervised Learning and Clustering Results
mentioning
confidence: 99%
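The silhouette analysis for choosing k, as described above, can be sketched by scoring each candidate k; this is an illustrative setup assuming scikit-learn and the Iris dataset. Note that the k with the highest silhouette score need not match the elbow-method suggestion, which is exactly why the excerpt recommends cross-checking metrics.

```python
# Sketch of silhouette analysis over candidate cluster counts k.
# For each k, fit K-means and record the mean silhouette score.
from sklearn.datasets import load_iris
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

X = load_iris().data
scores = {}
for k in range(2, 7):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    scores[k] = silhouette_score(X, labels)

# The k maximizing the average silhouette is the suggested cluster count.
best_k = max(scores, key=scores.get)
print(best_k, round(scores[best_k], 3))
```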
“…The distribution can use angles of 0°, 45°, 90°, and 135° [14]. The extracted features are analyzed using the K-means clustering method to identify the pattern of each printed document [15]. The K-means clustering method models feature data in an unsupervised manner.…”
Section: Tinjauan Pustaka (Literature Review)
unclassified
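The unsupervised K-means step in this last excerpt can be sketched as below. The feature vectors here are hypothetical stand-ins (random vectors for two synthetic "printer" classes), since the excerpt does not show the actual texture-feature extraction at 0°, 45°, 90°, and 135°.

```python
# Hypothetical sketch: K-means grouping of per-document texture features.
# The 4-dimensional vectors stand in for features computed at the four
# angles mentioned in the excerpt; the two Gaussian blobs simulate two
# distinct printed-document sources.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
features = np.vstack([
    rng.normal(0.2, 0.05, (15, 4)),  # documents from source A (synthetic)
    rng.normal(0.8, 0.05, (15, 4)),  # documents from source B (synthetic)
])

# Unsupervised grouping: no document labels are used, matching the
# excerpt's description of K-means as an unsupervised modeling step.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print(labels)
```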