Abstract. Many real-world optimization problems involve a large number of decision variables. To enhance the ability of differential evolution (DE) on such problems, a novel local search operation is proposed that combines orthogonal crossover with an opposition-based learning strategy. During the evolution of DE, one individual is randomly chosen to undergo this operation; the operation therefore requires little extra computing time, yet improves the search ability of DE. The performance of the proposed method is compared with that of two other competitive algorithms on benchmark problems, and the results demonstrate the new method's effectiveness and efficiency.

Keywords: Large scale optimization · Differential evolution · Orthogonal crossover · Quasi-opposition learning
Introduction

Differential evolution (DE), proposed by Storn and Price, is a simple yet efficient algorithm for global optimization problems in continuous domains [1]. It has been widely used in various applications [2]. However, DE still suffers from the "curse of dimensionality": its performance deteriorates rapidly as the scale of the search space increases [3], so DE usually fails to find the optimal solutions to large scale optimization problems. Much work has been done to enhance the performance of DE on large scale optimization problems.

One way to improve the performance of DE is to use new crossover operators. Noman et al. proposed a crossover-based adaptive local search operation to enhance the performance of the standard DE algorithm [2]. Wang et al. used an orthogonal crossover to enhance the search ability of DE [4]. These works have improved the performance of DE on low-dimensional problems. However, when the problem dimension grows to 1000 or more, they cannot avoid being trapped in local minima.

Another promising approach to large scale problems is opposition-based learning (OBL) [5,6], which has been successfully applied to enhance the performance of DE. The key concept of OBL is to evaluate the current solutions and their opposites simultaneously, and the central opposition theorem proves that the probability that the opposite of a solution is closer to the global optimum is higher than the probability of a second random guess [7]. In recent years, one paradigm that received much attention is
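The opposition and quasi-opposition ideas discussed above can be sketched as follows. This is a minimal illustration assuming a box-constrained search space [lo, hi] per dimension; the function names are my own, since the paper does not give an implementation:

```python
import numpy as np

def opposite(x, lo, hi):
    # Classic opposition-based learning: mirror x within the box [lo, hi].
    return lo + hi - x

def quasi_opposite(x, lo, hi, rng):
    # Quasi-opposition: sample uniformly between the box centre and the
    # opposite point. Per the central opposition theorem cited in [7],
    # such a point is more likely than a second random guess to lie
    # closer to the global optimum.
    centre = (lo + hi) / 2.0
    opp = lo + hi - x
    low = np.minimum(centre, opp)
    high = np.maximum(centre, opp)
    return rng.uniform(low, high)

# Toy 4-dimensional example.
rng = np.random.default_rng(0)
lo = np.full(4, -5.0)
hi = np.full(4, 5.0)
x = rng.uniform(lo, hi)
print(opposite(x, lo, hi))          # deterministic mirror image of x
print(quasi_opposite(x, lo, hi, rng))
```

In a DE loop, the quasi-opposite of a randomly chosen individual would be evaluated alongside the individual itself, and the better of the two kept.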
Differential evolution (DE) is a robust global optimization algorithm that has been applied to many real-world problems since it was proposed. However, its binomial crossover does not allow a sufficiently effective search of the local space, so DE's local search performance is relatively poor; when DE is applied to complex optimization problems, this inefficiency in local search seriously limits its overall performance. To overcome this disadvantage, this paper introduces a new local search scheme based on the Hadamard matrix (HLS). HLS improves the probability of finding the optimal solution by producing multiple offspring in the local space built by the target individual and its descendants. HLS has been implemented in four classical DE algorithms and in jDE, a variant of DE, and experiments are carried out on a set of widely used benchmark functions. On the 20 benchmark problems, the four DE schemes using HLS outperform the corresponding plain DE schemes on 80%, 75%, 65%, and 65% of the problems, respectively, and jDE with HLS outperforms jDE on 50% of the test problems. The experimental results and statistical analysis show that HLS can effectively improve the overall performance of DE and jDE.
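As a rough sketch of the idea (an assumption about the mechanism, not the authors' implementation), the rows of a Sylvester-constructed Hadamard matrix can serve as systematic ±1 crossover masks between a target individual and one of its descendants, yielding several offspring in their joint local space at once:

```python
import numpy as np

def hadamard(n):
    # Sylvester construction; n must be a power of two.
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

def hls_offspring(target, donor):
    # Hypothetical HLS-style sketch: each Hadamard row is a +/-1 mask;
    # +1 takes the gene from the target, -1 from the donor, so the n
    # rows produce n offspring that sample the local space spanned by
    # the pair in a structured, non-random way.
    d = len(target)
    n = 1
    while n < d:
        n *= 2
    mask = hadamard(n)[:, :d] > 0
    return np.where(mask, target, donor)

target = np.zeros(4)
donor = np.ones(4)
print(hls_offspring(target, donor))  # 4 offspring mixing the two parents
```

The best of these offspring would then compete with the target under DE's usual greedy selection; the Hadamard structure guarantees the masks are mutually orthogonal rather than randomly drawn as in binomial crossover.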