The backpropagation (BP) algorithm is a gradient-based algorithm used for training feedforward neural networks (FNNs). Although BP is still widely used to train FNNs today, it has several disadvantages: (i) it fails for non-differentiable functions, (ii) it can become trapped in local minima, and (iii) it converges slowly. To address some of these problems, metaheuristic algorithms have been used to train FNNs. Although metaheuristics have good exploration capabilities, they are weaker than gradient-based algorithms at exploitation. The main contribution of this article is the application of novel memetic approaches based on the Gravitational Search Algorithm (GSA) and the Chaotic Gravitational Search Algorithm (CGSA), called the Memetic Gravitational Search Algorithm (MGSA) and the Memetic Chaotic Gravitational Search Algorithm (MCGSA) respectively, to train FNNs on three classical benchmark problems: the XOR problem, the approximation of a continuous function, and classification tasks. The results show that both approaches are suitable alternatives for training FNNs, even improving on the performance of other state-of-the-art metaheuristic algorithms such as Particle Swarm Optimization (PSO), the Genetic Algorithm (GA), the Adaptive Differential Evolution algorithm with Repaired crossover rate (Rcr-JADE), and the Covariance matrix learning and Bimodal distribution parameter setting Differential Evolution (COBIDE) algorithm.
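To illustrate the idea of metaheuristic FNN training described above, here is a minimal sketch of a plain GSA optimizing the weights of a small network on the XOR benchmark. This is not the authors' MGSA/MCGSA implementation: the 2-2-1 architecture, the search bounds, and all hyperparameters (`n_agents`, `g0`, `alpha`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR training set (one of the paper's three benchmarks)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

def mse(w):
    """MSE of a 2-2-1 network (tanh hidden, sigmoid output) encoded by w (9 values)."""
    W1, b1 = w[:4].reshape(2, 2), w[4:6]
    W2, b2 = w[6:8], w[8]
    h = np.tanh(X @ W1 + b1)
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    return np.mean((out - y) ** 2)

def gsa(n_agents=30, dim=9, iters=200, g0=2.0, alpha=20.0):
    """Minimal Gravitational Search Algorithm over the network's weight vector."""
    pos = rng.uniform(-1, 1, (n_agents, dim))
    vel = np.zeros((n_agents, dim))
    best_w, best_f = None, np.inf
    for t in range(iters):
        fit = np.array([mse(p) for p in pos])
        i = fit.argmin()
        if fit[i] < best_f:
            best_f, best_w = fit[i], pos[i].copy()
        # masses: lower (better) fitness -> larger normalized mass
        m = (fit.max() - fit) / (fit.max() - fit.min() + 1e-12)
        M = m / (m.sum() + 1e-12)
        G = g0 * np.exp(-alpha * t / iters)  # decaying gravitational constant
        # acceleration of each agent from the gravitational pull of all others
        acc = np.zeros_like(pos)
        for a in range(n_agents):
            diff = pos - pos[a]
            dist = np.linalg.norm(diff, axis=1)[:, None] + 1e-12
            acc[a] = np.sum(rng.random((n_agents, 1)) * G * M[:, None] * diff / dist, axis=0)
        vel = rng.random((n_agents, dim)) * vel + acc
        pos = np.clip(pos + vel, -5, 5)  # keep the search bounded (an assumption)
    return best_w, best_f

w, f = gsa()
print(f"best XOR training MSE found: {f:.4f}")
```

A memetic variant such as MGSA would additionally apply a local (exploitation-oriented) refinement to the best agents found by this global search; that step is omitted here for brevity.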
Several common general-purpose optimization algorithms are compared for finding A- and D-optimal designs for different types of statistical models of varying complexity, including high-dimensional models with five or more factors. The algorithms of interest include exact methods, such as the interior point method, the Nelder-Mead method, the active set method, and sequential quadratic programming, as well as metaheuristic algorithms, such as particle swarm optimization, simulated annealing, and genetic algorithms. Several simulations are performed, which provide general recommendations on the utility and performance of each method, including hybridized versions of metaheuristic algorithms, for finding optimal experimental designs. A key result is that general-purpose optimization algorithms, both exact methods and metaheuristic algorithms, perform well for finding optimal approximate experimental designs.
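As a concrete illustration of the design problem discussed above, the sketch below computes a D-optimal approximate design for a one-factor quadratic model on a candidate grid. It uses the classical multiplicative weight-update algorithm rather than any of the specific solvers compared in the paper; the grid resolution and iteration count are arbitrary choices. For this textbook model the D-optimal design is known to place mass 1/3 on each of x = -1, 0, 1, which makes the result easy to check.

```python
import numpy as np

# One-factor quadratic model f(x) = (1, x, x^2) on a grid of candidates in [-1, 1]
xs = np.linspace(-1, 1, 21)
F = np.column_stack([np.ones_like(xs), xs, xs**2])  # rows are f(x_i)
p = F.shape[1]  # number of model parameters

w = np.full(len(xs), 1 / len(xs))  # start from the uniform design
for _ in range(500):
    M = F.T @ (w[:, None] * F)                            # information matrix
    d = np.einsum('ij,jk,ik->i', F, np.linalg.inv(M), F)  # variance function d(x_i)
    w = w * d / p  # multiplicative update; sum(w*d) = trace(I) = p keeps w normalized

# the weights concentrate on the known optimal support {-1, 0, 1}
support = xs[w > 0.05]
print(support, np.round(w[w > 0.05], 3))
```

The variance function `d` also gives a stopping rule via the general equivalence theorem: at the D-optimal design, d(x) <= p on all candidates, with equality on the support points.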