Augmented weighted K-means grey wolf optimizer: An enhanced metaheuristic algorithm for data clustering problems
Manoharan Premkumar,
Garima Sinha,
Manjula Devi Ramasamy
et al.
Abstract: This study presents the K-means clustering-based grey wolf optimizer, a new algorithm intended to improve the optimization capabilities of the conventional grey wolf optimizer in order to address the problem of data clustering, the process that groups similar items within a dataset into non-overlapping groups. Grey wolf hunting behaviour served as the model for the grey wolf optimizer; however, it frequently lacks the exploration and exploitation capabilities that are essential for efficient data clustering. This …
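The abstract above only summarizes the hybrid; the general idea of evolving candidate centroid sets with a GWO-style update, seeded K-means-fashion from data points, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' algorithm: the paper's weight factor and other refinements are omitted, and all function and parameter names here are hypothetical.

```python
import numpy as np

def sse(centroids, data):
    """Clustering fitness: sum of squared distances to the nearest centroid."""
    d = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
    return float((d.min(axis=1) ** 2).sum())

def gwo_kmeans(data, k, wolves=15, iters=50, seed=0):
    """Evolve candidate centroid sets with the standard GWO update."""
    rng = np.random.default_rng(seed)
    n, dim = data.shape
    # Each wolf encodes k candidate centroids as one flat vector,
    # seeded from randomly chosen data points (a k-means-style start).
    pop = data[rng.integers(0, n, (wolves, k))].reshape(wolves, k * dim)
    lo, hi = np.tile(data.min(0), k), np.tile(data.max(0), k)
    best, best_fit = pop[0].copy(), np.inf
    for t in range(iters):
        fit = np.array([sse(w.reshape(k, dim), data) for w in pop])
        if fit.min() < best_fit:                      # simple elitism
            best, best_fit = pop[fit.argmin()].copy(), fit.min()
        alpha, beta, delta = pop[np.argsort(fit)[:3]]
        a = 2 - 2 * t / iters                         # decays from 2 to 0
        for i in range(wolves):
            x = np.zeros(k * dim)
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(k * dim), rng.random(k * dim)
                A, C = 2 * a * r1 - a, 2 * r2
                x += leader - A * np.abs(C * leader - pop[i])
            pop[i] = np.clip(x / 3, lo, hi)           # average of the 3 pulls
    fit = np.array([sse(w.reshape(k, dim), data) for w in pop])
    if fit.min() < best_fit:
        best = pop[fit.argmin()].copy()
    return best.reshape(k, dim)
```

Seeding wolves from actual data points, as in K-means, gives the swarm sensible starting centroids; the GWO update then refines them toward the three best solutions found so far.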
“…Demirci et al. [12] proposed an electrical search algorithm (ESA) based on the movement of electricity in highly resistive materials such as wood, glass, and gases and applied it to the clustering problem. Premkumar et al. [26] focused on enhancing the grey wolf optimizer with a new weight factor and concepts from the k-means algorithm to increase diversity and avoid premature convergence. In the following, the required algorithms are outlined.…”
Clustering plays a crucial role in data mining and machine learning, with the primary objective being the identification of cohesive and distinct data groups, enabling the extraction of valuable information. However, clustering algorithms often get trapped in local optima, hindering their ability to achieve optimal results. To address this issue, researchers have turned to meta-heuristic algorithms. This article proposes an enhanced approach to clustering that combines the particle swarm optimization (PSO) algorithm and the mountain gazelle algorithm (MGO). The combined algorithm outperforms the particle swarm algorithm used alone. By exploiting the strengths of both algorithms, the method escapes local optima, leading to more accurate and robust clustering results. The proposed algorithm locates the optimal centroids by minimizing a fitness measure built from three criteria: intra-cluster distance, inter-cluster distance, and cluster density. The data are then clustered using the centroids corresponding to the minimal fitness value. The approach has been evaluated on real-world datasets such as Iris, Wine, and Vowel and compared with the PSO and MGO algorithms. The experimental results indicate that the proposed method outperforms PSO and MGO in both clustering quality and convergence speed.
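The three-criterion fitness described above (intra-cluster distance, inter-cluster distance, cluster density) is not given in closed form in the abstract; one plausible way to combine such terms into a single minimization objective is sketched below. The weighting and exact formula are illustrative assumptions, not the paper's definition.

```python
import numpy as np

def clustering_fitness(centroids, data, w_intra=1.0, w_inter=1.0, w_dens=1.0):
    """Hypothetical composite fitness (lower is better): mean intra-cluster
    distance, plus the inverse of the smallest inter-centroid distance,
    plus an inverse-density penalty. Assumes at least two centroids."""
    d = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
    labels = d.argmin(axis=1)
    # compactness: mean distance of each point to its assigned centroid
    intra = d[np.arange(len(data)), labels].mean()
    # separation: smallest pairwise centroid distance (want it large)
    cd = np.linalg.norm(centroids[:, None] - centroids[None, :], axis=2)
    inter = cd[np.triu_indices(len(centroids), 1)].min()
    # density: points per unit of intra-cluster spread, averaged per cluster
    dens = []
    for k in range(len(centroids)):
        pts = d[labels == k, k]
        dens.append(len(pts) / (pts.mean() + 1e-12) if len(pts) else 0.0)
    density = float(np.mean(dens))
    return (w_intra * intra
            + w_inter / (inter + 1e-12)
            + w_dens / (density + 1e-12))
```

Under this construction, compact, well-separated, densely populated clusters all drive the score down, so the swarm's minimum-fitness particle corresponds to the best centroid set.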
“…In 2014, the Grey Wolf Optimizer (GWO) was proposed [25], a population-based metaheuristic algorithm that mimics the social hierarchy and group hunting behavior of grey wolves. Owing to its inherent simplicity, few control parameters, and strong optimization performance, the GWO has found extensive applications in engineering problems [?, 26], anomaly detection [27], band selection [28], path planning [29, 30], FS [31–33], and other fields [34–36]. Wang et al. [37] developed a role-oriented binary GWO.…”
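The position update this excerpt refers to, from the original 2014 formulation, moves each wolf toward the three best solutions found so far (alpha, beta, delta):

```latex
\begin{aligned}
\vec{D}_\alpha &= \lvert \vec{C}_1 \cdot \vec{X}_\alpha - \vec{X} \rvert,
  \qquad \vec{X}_1 = \vec{X}_\alpha - \vec{A}_1 \cdot \vec{D}_\alpha,\\
\vec{D}_\beta  &= \lvert \vec{C}_2 \cdot \vec{X}_\beta  - \vec{X} \rvert,
  \qquad \vec{X}_2 = \vec{X}_\beta  - \vec{A}_2 \cdot \vec{D}_\beta,\\
\vec{D}_\delta &= \lvert \vec{C}_3 \cdot \vec{X}_\delta - \vec{X} \rvert,
  \qquad \vec{X}_3 = \vec{X}_\delta - \vec{A}_3 \cdot \vec{D}_\delta,\\
\vec{X}(t+1)   &= \frac{\vec{X}_1 + \vec{X}_2 + \vec{X}_3}{3},
\end{aligned}
\qquad \vec{A} = 2a\,\vec{r}_1 - a, \quad \vec{C} = 2\,\vec{r}_2,
```

where $a$ is decreased linearly from 2 to 0 over the iterations and $\vec{r}_1, \vec{r}_2$ are random vectors in $[0, 1]$; $|\vec{A}| > 1$ drives exploration, $|\vec{A}| < 1$ exploitation.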
Feature selection (FS) is an important dimensionality-reduction technique that can effectively remove redundant features. Metaheuristic algorithms have been widely employed for FS with satisfactory performance; among them, the grey wolf optimizer (GWO) has received widespread attention. However, the GWO and its variants suffer from limited adaptability, poor diversity, and low accuracy on high-dimensional data. The hybrid rice optimization (HRO) algorithm is an emerging metaheuristic derived from the heterosis and breeding mechanism of hybrid rice in nature, and it possesses a robust capacity to identify and converge towards optimal solutions. Therefore, this paper proposes a novel FS approach based on a multi-strategy collaborative GWO combined with the HRO algorithm (HRO-GWO). The HRO-GWO algorithm is enhanced by four strategies: a dynamical regulation strategy and three search strategies. First, to improve the adaptability of GWO, the dynamical regulation strategy is devised to optimize GWO's parameters. Then, a multi-strategy co-evolution model inspired by HRO is designed, which uses neighborhood search, dual-crossover, and selfing techniques to bolster population diversity. Finally, the study develops a hybrid filter-wrapper framework to efficiently select pertinent and informative feature subsets, enhancing classification performance while saving time. The performance of HRO-GWO has been rigorously assessed on benchmark functions, and the effectiveness of the proposed framework has been evaluated on high-dimensional biomedical datasets. The experimental findings demonstrate that the HRO-GWO-based approach outperforms state-of-the-art methods.
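The abstract above treats GWO-based FS at a high level; a toy binary-GWO sketch may make the mechanics concrete. This is not the paper's HRO-GWO: it uses a simple correlation-based filter score in place of the classifier-driven wrapper fitness, a fixed 0.5 threshold as the transfer function, and hypothetical names throughout.

```python
import numpy as np

def fs_fitness(mask, X, y, w=0.9):
    """Illustrative filter-style FS objective (lower is better):
    1 - mean |feature-target correlation| over the selected features,
    plus a small penalty on subset size."""
    if mask.sum() == 0:
        return 1.0
    sel = X[:, mask.astype(bool)]
    corr = np.abs([np.corrcoef(f, y)[0, 1] for f in sel.T])
    return w * (1 - float(np.mean(corr))) + (1 - w) * float(mask.mean())

def binary_gwo_fs(X, y, wolves=12, iters=40, seed=0):
    """Binary GWO: continuous positions in [0, 1], thresholded at 0.5
    to obtain a 0/1 feature mask."""
    rng = np.random.default_rng(seed)
    dim = X.shape[1]
    pos = rng.random((wolves, dim))
    for t in range(iters):
        masks = (pos > 0.5).astype(int)
        fit = np.array([fs_fitness(m, X, y) for m in masks])
        leaders = pos[np.argsort(fit)[:3]]            # alpha, beta, delta
        a = 2 - 2 * t / iters
        for i in range(wolves):
            step = np.zeros(dim)
            for ldr in leaders:
                r1, r2 = rng.random(dim), rng.random(dim)
                A, C = 2 * a * r1 - a, 2 * r2
                step += ldr - A * np.abs(C * ldr - pos[i])
            # clip so the 0.5 threshold stays meaningful
            pos[i] = np.clip(step / 3, 0, 1)
    masks = (pos > 0.5).astype(int)
    fit = np.array([fs_fitness(m, X, y) for m in masks])
    return masks[fit.argmin()]
```

A wrapper variant would replace `fs_fitness` with cross-validated classification error, which is costlier; the filter-wrapper framework mentioned above exists precisely to limit how often that expensive evaluation runs.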
This paper reviews the integration of Q-learning with meta-heuristic algorithms (QLMA) over the last 20 years, highlighting its success in solving complex optimization problems. We focus on key aspects of QLMA, including parameter adaptation, operator selection, and balancing global exploration with local exploitation. QLMA has become a leading solution in industries such as energy, power systems, and engineering, addressing a range of mathematical challenges. Looking forward, we suggest further exploration of meta-heuristic integration, transfer learning strategies, and techniques to reduce the state space. This article is categorized under:
Technologies > Computational Intelligence
Technologies > Artificial Intelligence
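To make the operator-selection aspect of QLMA concrete, here is a toy, self-contained sketch in which a Q-table learns which perturbation operator to apply next, with the state reduced to whether the previous move improved the solution. The two-operator setup and all names are illustrative assumptions, not taken from the review.

```python
import random

def q_learning_operator_select(operators, evaluate, episodes=300,
                               lr=0.3, gamma=0.9, eps=0.2, seed=0):
    """Toy QLMA loop: state 1 = last move improved, 0 = it did not;
    actions index the available search operators."""
    rng = random.Random(seed)
    n_ops = len(operators)
    Q = {(s, a): 0.0 for s in (0, 1) for a in range(n_ops)}
    x, state = 0.0, 0
    for _ in range(episodes):
        # epsilon-greedy operator selection
        if rng.random() < eps:
            a = rng.randrange(n_ops)
        else:
            a = max(range(n_ops), key=lambda k: Q[(state, k)])
        candidate = operators[a](x, rng)
        improved = evaluate(candidate) < evaluate(x)
        reward = 1.0 if improved else -0.1
        nxt = 1 if improved else 0
        # standard Q-learning temporal-difference update
        Q[(state, a)] += lr * (reward
                               + gamma * max(Q[(nxt, k)] for k in range(n_ops))
                               - Q[(state, a)])
        if improved:              # greedy acceptance, as in a hill-climber
            x = candidate
        state = nxt
    return x, Q
```

For example, minimizing `(x - 3)**2` with a small local step and a large random jump as the two operators lets the Q-table learn, per state, which move class tends to pay off; real QLMA variants use richer states (iteration stage, stagnation counters) and select among crossover, mutation, or parameter settings instead.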