Particle swarm optimization (PSO) is one of the most well-regarded swarm-based algorithms in the literature. Although the original PSO has shown good optimization performance, it still suffers severely from premature convergence. As a result, many researchers have modified it, producing a large number of PSO variants with slightly or significantly better performance. The standard PSO has mainly been modified through four strategies: modification of the PSO controlling parameters; hybridization of PSO with other well-known metaheuristic algorithms such as the genetic algorithm (GA) and differential evolution (DE); cooperative approaches; and multi-swarm techniques. This paper attempts to provide a comprehensive review of PSO, including the basic concepts of PSO, binary PSO, neighborhood topologies in PSO, recent and historical PSO variants, remarkable engineering applications of PSO, and its drawbacks. Moreover, this paper reviews recent studies that utilize PSO to solve feature selection problems. Finally, eight potential research directions that can help researchers further enhance the performance of PSO are provided.
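To make the update rule behind these variants concrete, the following is a minimal sketch of the canonical PSO with an inertia weight, where w, c1, and c2 are the controlling parameters the survey refers to. The specific parameter values, bounds, and the sphere objective in the usage line are illustrative assumptions, not taken from the survey.

```python
import numpy as np

def pso(objective, dim, n_particles=30, iters=100,
        w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Minimal canonical PSO with inertia weight (minimization)."""
    lo, hi = bounds
    x = np.random.uniform(lo, hi, (n_particles, dim))   # positions
    v = np.zeros((n_particles, dim))                    # velocities
    pbest = x.copy()                                    # personal bests
    pbest_f = np.apply_along_axis(objective, 1, x)
    gbest = pbest[pbest_f.argmin()].copy()              # global best

    for _ in range(iters):
        r1, r2 = np.random.rand(2, n_particles, dim)
        # canonical update: inertia + cognitive + social components
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.apply_along_axis(objective, 1, x)
        improved = f < pbest_f                          # update memories
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# usage: minimize the sphere function in 10 dimensions
best, best_f = pso(lambda z: np.sum(z**2), dim=10)
```

Premature convergence arises when all particles collapse onto gbest early; the four modification strategies above all aim to delay that collapse, e.g. by scheduling w or splitting the population into multiple swarms.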
The Harris Hawks Optimization (HHO) algorithm is a recent metaheuristic inspired by the cooperative behavior and chasing style of Harris's hawks in nature, known as the surprise pounce. HHO has demonstrated promising results compared with other optimization methods. However, it suffers from entrapment in local optima and poor population diversity. To overcome these limitations and adapt it to feature selection problems, a novel metaheuristic optimizer, Chaotic Harris Hawks Optimization (CHHO), is proposed. Two main improvements to the standard HHO algorithm are suggested. The first is to apply chaotic maps in the initialization phase of HHO to enhance population diversity in the search space. The second is to apply the Simulated Annealing (SA) algorithm to the current best solution to improve HHO's exploitation. To validate the performance of the proposed algorithm, CHHO was applied to 14 medical benchmark datasets from the UCI machine learning repository. The proposed CHHO was compared with the original HHO and several well-known and recent metaheuristic algorithms, including the Grasshopper Optimization Algorithm (GOA), Particle Swarm Optimization (PSO), the Genetic Algorithm (GA), the Butterfly Optimization Algorithm (BOA), and the Ant Lion Optimizer (ALO). The evaluation metrics include the number of selected features, classification accuracy, fitness values, Wilcoxon's statistical test (p-value), and convergence curves. Based on the achieved results, CHHO confirms its superiority over the standard HHO algorithm and the other optimization algorithms on the majority of the medical datasets.
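As an illustration of the two improvements, the sketch below pairs a chaotic population initialization with an SA refinement of the best solution. The logistic map, the cooling schedule, and all parameter values are assumptions chosen for illustration; the abstract does not specify which chaotic map or SA settings CHHO actually uses.

```python
import numpy as np

def chaotic_init(n_agents, dim, bounds=(0.0, 1.0), mu=4.0):
    """Initialize a population via a chaotic map.

    The logistic map x_{k+1} = mu * x_k * (1 - x_k) with mu = 4 is one
    common choice; the specific map used by CHHO is an assumption here.
    """
    lo, hi = bounds
    x = np.random.uniform(0.01, 0.99, (n_agents, dim))  # seed in (0, 1)
    for _ in range(20):                                  # iterate the map
        x = mu * x * (1.0 - x)
    return lo + (hi - lo) * x                            # scale to bounds

def sa_refine(best, objective, iters=50, T0=1.0, alpha=0.95, step=0.1):
    """Simulated-annealing refinement of the current best solution."""
    cur, cur_f, T = best.copy(), objective(best), T0
    for _ in range(iters):
        cand = cur + np.random.normal(0.0, step, cur.shape)
        cand_f = objective(cand)
        # accept improvements, or worse moves with Boltzmann probability
        if cand_f < cur_f or np.random.rand() < np.exp((cur_f - cand_f) / T):
            cur, cur_f = cand, cand_f
        T *= alpha                                       # cool down
    return cur, cur_f
```

The chaotic seed spreads agents more evenly than plain uniform sampling, addressing the diversity drawback, while the SA step lets the best hawk escape shallow local optima between HHO iterations.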
Feature selection is an essential pre-processing step for a wide range of machine learning approaches. Datasets typically contain irrelevant features that may degrade classifier performance; a feature selector can reduce the number of these features and maximise classifier accuracy. This paper proposes the Dynamic Butterfly Optimization Algorithm (DBOA), an improved variant of the Butterfly Optimization Algorithm (BOA), for feature selection problems. BOA is one of the most recently proposed optimisation algorithms and has demonstrated its ability to solve different types of problems with results competitive with those of other optimisation algorithms. However, the original BOA has problems when optimising high-dimensional problems, including stagnation in local optima and a lack of solution diversity during the optimisation process. To alleviate these weaknesses, two significant improvements are introduced into the original BOA: a Local Search Algorithm Based on Mutation (LSAM) operator to avoid the local optima problem, and the use of LSAM to improve the diversity of BOA solutions. To demonstrate the efficiency and superiority of the proposed DBOA algorithm, 20 benchmark datasets from the UCI repository are employed. The classification accuracy, fitness values, number of selected features, statistical results, and convergence curves are reported for DBOA and its competing algorithms. These results demonstrate that DBOA significantly outperforms the comparative algorithms on the majority of the performance metrics used.
INDEX TERMS: Butterfly optimisation algorithm, feature selection, local search algorithm based on mutation.
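The abstract does not detail LSAM's mutation scheme, so the following is a hedged sketch of a mutation-based local search over a binary feature mask, using bit-flip mutation with greedy acceptance as a plausible stand-in; the mutation rate, trial count, and helper name are illustrative assumptions.

```python
import numpy as np

def lsam(solution, fitness, mut_rate=0.1, trials=20, rng=None):
    """Mutation-based local search over a 0/1 feature mask.

    Hedged sketch of an LSAM-style operator: bit-flip mutation with
    greedy acceptance stands in for DBOA's unspecified scheme.
    `solution` is an integer array where 1 means "feature selected";
    `fitness` is minimised (e.g. error rate plus a size penalty).
    """
    if rng is None:
        rng = np.random.default_rng()
    best, best_f = solution.copy(), fitness(solution)
    for _ in range(trials):
        cand = best.copy()
        flips = rng.random(cand.size) < mut_rate         # bit-flip mask
        cand[flips] ^= 1                                 # toggle features
        if not cand.any():                               # keep >= 1 feature
            cand[rng.integers(cand.size)] = 1
        f = fitness(cand)
        if f < best_f:                                   # greedy acceptance
            best, best_f = cand, f
    return best, best_f
```

Applied periodically to stagnating butterflies, such an operator perturbs solutions away from local optima while the greedy test preserves any accuracy already gained.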