2019
DOI: 10.1111/exsy.12353

Cuckoo and krill herd‐based k‐means++ hybrid algorithms for clustering

Abstract: Clustering algorithms can be optimized using nature‐inspired techniques. Many algorithms inspired by nature, namely, the firefly algorithm, the ant colony optimization algorithm, and so forth, have improved clustering results. k‐means is a popular clustering technique but has the limitation of local optima, which has been overcome using its various hybrids. k‐means++ is a hybrid k‐means clustering algorithm that provides a procedure for initializing the cluster centres. In the proposed work, hybrids of nature‐inspired te…
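The abstract describes k‐means++ as supplying the procedure for initializing cluster centres. As a point of reference, the sketch below is a minimal NumPy implementation of that seeding rule; the function name and the toy data are illustrative and not taken from the paper.

```python
import numpy as np

def kmeans_pp_init(X, k, rng=None):
    """k-means++ seeding: pick the first centre uniformly at random, then pick
    each further centre with probability proportional to its squared distance
    from the nearest centre chosen so far."""
    rng = np.random.default_rng(rng)
    n = X.shape[0]
    centres = [X[rng.integers(n)]]
    for _ in range(1, k):
        # Squared distance of every point to its nearest already-chosen centre.
        diffs = X[:, None, :] - np.array(centres)[None, :, :]
        d2 = np.min((diffs ** 2).sum(-1), axis=1)
        centres.append(X[rng.choice(n, p=d2 / d2.sum())])
    return np.array(centres)

# Illustrative usage: 300 random 2-D points, 3 initial centres.
X = np.random.default_rng(0).normal(size=(300, 2))
print(kmeans_pp_init(X, k=3, rng=0))
```

The seeded centres are then handed to the standard k‐means iterations, which is what makes the initialization less sensitive to unlucky starting points.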

Cited by 11 publications (4 citation statements) · References 18 publications

Citation statements:
“…It has the advantages of simple operation and high scalability and compressibility in the processing of large data sets. However, the algorithm also has certain limitations: (a) sensitive to cluster centres, (b) susceptible to outliers and (c) no criterion for the choice of k value to be equal (Aggarwal & Singh, 2019). This paper embodies the contribution of feature attributes to clustering by assigning different weights to each attribute in each cluster.…”
Section: Methods (mentioning, confidence: 99%)
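The excerpt above, which cites this paper for the limitations of k‐means, describes assigning a different weight to each attribute in each cluster. The sketch below is a hedged illustration of what such a weighted assignment step could look like; the function name and the weight matrix `W` are hypothetical and illustrate the citing paper's idea, not the method of Aggarwal and Singh.

```python
import numpy as np

def weighted_assign(X, centres, W):
    """Assign each point to the cluster minimising a per-cluster,
    per-attribute weighted squared distance.
    W[j, f] is the (hypothetical) weight of feature f in cluster j."""
    # d2[i, j] = sum_f W[j, f] * (X[i, f] - centres[j, f])**2
    d2 = np.einsum('kf,nkf->nk', W, (X[:, None, :] - centres[None, :, :]) ** 2)
    return d2.argmin(axis=1)
```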
“…It has the advantages of simple operation and high scalability and compressibility in the processing of large data sets. However, the algorithm also has certain limitations: (a) sensitive to cluster centres, (b) susceptible to outliers and (c) no criterion for the choice of k value to be equal (Aggarwal & Singh, 2019). This paper embodies the contribution of feature attributes to clustering by assigning different weights to each attribute in each cluster.…”
Section: Methodsmentioning
confidence: 99%
“…Although the objective function incorporated into these algorithms is formulated to guarantee convergence to a local optimum, there is no guarantee that the convergence result will be accurate. This is because the algorithms can be trapped in a local optimum (Aggarwal & Singh, 2019; Qin et al., 2016).…”
Section: Pixel Classification Techniques (mentioning, confidence: 99%)
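The excerpt notes that objective-function-based clustering guarantees only convergence to a local optimum. A common mitigation, shown in this minimal scikit-learn sketch (standard library usage, not a procedure drawn from the cited papers), is to combine k‐means++ seeding with several restarts and keep the run with the lowest inertia; the dataset and parameters are illustrative.

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

# Toy data; parameters are illustrative only.
X, _ = make_blobs(n_samples=500, centers=4, random_state=42)

# Each run may still converge to a local optimum, so the best of n_init
# k-means++-seeded runs (lowest inertia) is kept.
km = KMeans(n_clusters=4, init="k-means++", n_init=10, random_state=42).fit(X)
print(km.inertia_, km.n_iter_)
```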
“…Case 2 was a real-world dataset concerning credit risk assessment. When a borrower or trader is unwilling or unable to fulfil the contract conditions, the banks, investors, or counterparties may suffer monetary losses (Barboza et al, 2017); therefore, credit risk assessment is crucial to identifying the types of borrower or trader and predicting risks (Soui et al, 2019). In case 2, four credit risk assessment datasets were chosen to test the ES-NMPBFO performance.…”
Section: Cases (mentioning, confidence: 99%)
“…After clustering, the objects in the same group have high similarity, whereas those in different groups have low similarity (Saxena et al, 2017). Whilst a variety of clustering algorithms have been proposed in previous research, none of these clustering algorithms exhibits remarkable advantages in all contexts (Aggarwal & Singh, 2019; Dogan & Birant, 2021; Figueiredo et al, 2019). These algorithms have various design emphases, which depend on several factors, such as the computational complexity, convergence speed, the capability of pre‐setting important parameters (e.g., cluster number, radius), and insensitivity to outliers (Figueiredo et al, 2019; Nanda & Panda, 2014).…”
Section: Introduction (mentioning, confidence: 99%)
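The introduction excerpt characterizes a good clustering as high similarity within groups and low similarity between them. One standard way to quantify this is the silhouette score, sketched below with scikit-learn; this is an illustrative metric choice, not one prescribed by the cited works.

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score

# Toy data; the silhouette score rises when points are close to their own
# cluster (high intra-cluster similarity) and far from other clusters.
X, _ = make_blobs(n_samples=400, centers=3, random_state=1)
labels = KMeans(n_clusters=3, n_init=10, random_state=1).fit_predict(X)
print(silhouette_score(X, labels))
```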