ACOR is one of the most popular ant colony optimization algorithms for tackling continuous optimization problems. In this paper, we propose IACOR-LS, a variant of ACOR that uses local search and features a growing solution archive. We experiment with Powell's conjugate directions set, Powell's BOBYQA, and Lin-Yu Tseng's Mtsls1 methods as local search procedures. Automatic parameter tuning results show that IACOR-LS with Mtsls1 (IACOR-Mtsls1) is not only a significant improvement over ACOR, but is also competitive with the state-of-the-art algorithms described in a recent special issue of the Soft Computing journal. Further experimentation with IACOR-Mtsls1 on an extended benchmark function suite, which includes functions from both the Soft Computing special issue and the IEEE 2005 Congress on Evolutionary Computation, demonstrates its good performance on continuous optimization problems.
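The hybrid idea behind IACOR-LS can be illustrated with a minimal sketch of an archive-based continuous ACO in the style of ACOR, with a local-search step applied to the incumbent best. This is an illustrative toy, not the authors' algorithm: the archive here has a fixed size rather than a growing one, the objective (`sphere`) is a stand-in benchmark, and the "local search" is a simple perturbation scheme rather than Powell's methods or Mtsls1.

```python
import math
import random

def sphere(x):
    # Stand-in benchmark objective; global optimum f(0) = 0.
    return sum(xi * xi for xi in x)

def aco_r_with_ls(f, dim=2, archive_size=10, ants=4, q=0.1, xi=0.85,
                  iterations=200, seed=0):
    """Toy archive-based continuous ACO plus a crude local-search step.

    Illustrative sketch only: fixed-size archive, Gaussian sampling around
    archive solutions (ACOR-style), and perturbation-based local search
    instead of Powell's conjugate directions, BOBYQA, or Mtsls1.
    """
    rng = random.Random(seed)
    k = archive_size
    archive = sorted(([rng.uniform(-5, 5) for _ in range(dim)]
                      for _ in range(k)), key=f)
    # Rank-based Gaussian-kernel weights favour better archive members.
    w = [math.exp(-(i ** 2) / (2 * (q * k) ** 2)) for i in range(k)]
    for _ in range(iterations):
        for _ in range(ants):
            # Choose a guiding solution by weighted roulette selection.
            l = rng.choices(range(k), weights=w)[0]
            new = []
            for d in range(dim):
                # Std. dev. scales with mean distance to other members.
                sigma = xi * sum(abs(archive[e][d] - archive[l][d])
                                 for e in range(k)) / (k - 1)
                new.append(rng.gauss(archive[l][d], sigma))
            archive.append(new)
        # Crude "local search": small perturbations around the current best.
        best = min(archive, key=f)
        for _ in range(5):
            cand = [c + rng.gauss(0, 0.01) for c in best]
            if f(cand) < f(best):
                best = cand
        archive.append(best)
        archive = sorted(archive, key=f)[:k]  # keep the k best solutions
    return min(archive, key=f)

best = aco_r_with_ls(sphere)
print(sphere(best))
```

In the full IACOR-LS design, the archive additionally grows over time and the local search is one of the three named procedures, with its choice and the remaining parameters set by automatic tuning.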
The development cycle of high-performance optimization algorithms requires the algorithm designer to make several design decisions. These decisions range from implementation details to the setting of parameter values for testing intermediate designs. Proper parameter setting can be crucial for the effective assessment of algorithmic components because a bad parameter setting can make a good algorithmic component perform poorly. This situation may lead the designer to discard promising components that just happened to be tested with bad parameter settings. Automatic parameter tuning techniques are being used by practitioners to obtain peak performance from already designed algorithms. However, automatic parameter tuning also plays a crucial role during the development cycle of optimization algorithms. In this paper, we present a case study of a tuning-in-the-loop approach for redesigning a particle swarm-based optimization algorithm for tackling large-scale continuous optimization problems. Rather than just presenting the final algorithm, we describe the whole redesign process. Finally, we study the scalability behavior of the final algorithm in the context of this special issue.
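The inner step of such a tuning-in-the-loop workflow can be sketched as follows. This is a hypothetical stand-in, not the paper's setup: `evaluate_design` fakes a benchmark score with a synthetic function (a real loop would run the candidate particle swarm optimizer on a benchmark suite), and plain random search stands in for dedicated tuners such as irace.

```python
import random

def evaluate_design(inertia, cognitive, social, seed=1):
    """Hypothetical stand-in for benchmarking one PSO configuration.

    A real tuning loop would run the optimizer on benchmark functions;
    here a noisy synthetic score rewards an assumed 'good' region.
    """
    rng = random.Random(seed)
    target = (0.7, 1.5, 1.5)  # assumed sweet spot, for illustration only
    err = sum((a - b) ** 2
              for a, b in zip((inertia, cognitive, social), target))
    return err + rng.gauss(0, 0.05)  # lower is better, with noise

def tune(budget=50, seed=2):
    """Random-search tuner: the simplest stand-in for automatic tuning."""
    rng = random.Random(seed)
    best_cfg, best_score = None, float("inf")
    for _ in range(budget):
        cfg = (rng.uniform(0.0, 1.0),   # inertia weight
               rng.uniform(0.0, 3.0),   # cognitive coefficient
               rng.uniform(0.0, 3.0))   # social coefficient
        score = evaluate_design(*cfg)
        if score < best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

cfg, score = tune()
print(cfg, score)
```

In the redesign process described above, a loop of this kind is rerun after each design change, so that every candidate component is judged under parameter settings tuned for it rather than under a fixed default configuration.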