2017
DOI: 10.1007/s13369-017-2761-2
An Attributes Similarity-Based K-Medoids Clustering Technique in Data Mining

Cited by 18 publications (7 citation statements)
References 24 publications
“…Practical data is contaminated by faults (errors) and missing values. Preprocessing imputes missing values, treats noise, normalizes, transforms, integrates, mitigates inconsistencies, and reduces and discretizes the data [37]. Geospatial clustering datasets often include missing values.…”
Section: Proposed Methods (mentioning)
confidence: 99%
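A minimal sketch of two of the preprocessing steps named above (mean imputation of missing values and min-max normalization), assuming a purely numeric NumPy array; the noise treatment, integration, and discretization steps are left out:

```python
import numpy as np

def preprocess(X: np.ndarray) -> np.ndarray:
    """Toy preprocessing pass: impute missing values with column means,
    then min-max normalize each column to [0, 1]."""
    X = X.astype(float).copy()
    # Imputation: replace NaNs with the per-column mean of the observed values.
    col_means = np.nanmean(X, axis=0)
    rows, cols = np.where(np.isnan(X))
    X[rows, cols] = col_means[cols]
    # Normalization: rescale each column to the unit interval.
    mins, maxs = X.min(axis=0), X.max(axis=0)
    return (X - mins) / np.where(maxs > mins, maxs - mins, 1.0)
```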
“…The new medoid is determined by summing the intra-cluster distance matrix row-wise and choosing the element with the smallest sum of pairwise distances. Algorithm 1 illustrates the local clustering process [21] [27]. Both local clustering algorithms are performed in parallel without coordination between nodes in this implementation.…”
Section: Step 3: First Stage Clustering a Local Clustering Using Entr… (mentioning)
confidence: 99%
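This medoid update can be sketched directly, assuming a Euclidean intra-cluster distance matrix and NumPy (the entropy-based partitioning and the parallel per-node execution mentioned above are not shown):

```python
import numpy as np

def update_medoid(points: np.ndarray) -> int:
    """Return the index of the cluster member with the smallest total
    distance to all other members (the new medoid)."""
    # Pairwise Euclidean intra-cluster distance matrix.
    diff = points[:, None, :] - points[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    # Row-wise sums give each point's total distance to the rest of the cluster.
    row_sums = dist.sum(axis=1)
    return int(np.argmin(row_sums))

# Example: the point closest to the other members becomes the medoid.
cluster = np.array([[0.0, 0.0], [1.0, 0.1], [0.9, 0.0], [5.0, 5.0]])
print(update_medoid(cluster))  # -> 1
```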
“…Data reduction methods may be used to create a compressed version of the dataset that is significantly smaller in volume but retains the initial data's consistency [27][28][29][30][31][32][33]. Numerous data reduction techniques are described in the literature; these include sampling, data compression, and data discretization.…”
Section: B. Context-Aware Reduction (mentioning)
confidence: 99%
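For concreteness, here is a small sketch (a hypothetical NumPy example, not taken from the cited works) of two of the reduction techniques listed, row sampling and equal-width discretization:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=(10_000, 3))  # toy numeric dataset

# Sampling: keep a 10% random subset of the rows.
sample_idx = rng.choice(len(data), size=len(data) // 10, replace=False)
sample = data[sample_idx]

# Discretization: map each column onto 8 equal-width bins (0..7).
mins, maxs = data.min(axis=0), data.max(axis=0)
bins = np.floor((data - mins) / (maxs - mins + 1e-12) * 8).astype(int).clip(0, 7)
```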
“…Second, some scholars introduce swarm intelligence [30, 31] and combine it with K-medoids to improve the global search capability and efficiency of the resulting algorithms on sample data. Arthur and Vassilvitskii [32] algorithmically fused a swarm algorithm with K-medoids.…”
Section: Literature Review (mentioning)
confidence: 99%