Mixture of experts (ME) is one of the most popular and interesting combining methods, with great potential to improve performance in machine learning. ME is built on the divide-and-conquer principle, in which the problem space is divided among several neural network experts supervised by a gating network. Earlier works on ME developed different strategies for dividing the problem space among the experts. Building on this line of work, we introduce a new method based on the principles of Particle Swarm Optimization (PSO) as a learning step in ME. In this paper, different aspects of the proposed method are compared with the standard version of ME. The results show that the new method is robust to variation in ensemble complexity, in terms of both the number of individual experts and the number of hidden units per expert.
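To make the two ingredients concrete, the sketch below combines a minimal mixture of experts (linear experts weighted by a softmax gate) with a standard global-best PSO loop that searches over the flattened expert and gate parameters. This is only an illustration of the general idea, not the paper's method: the abstract does not specify how PSO is integrated into ME training, and all names, coefficients, and the toy task here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - np.max(z))
    return e / e.sum()

def moe_predict(params, x, n_experts, n_in):
    """Mixture of scalar-output linear experts combined by a softmax gate.

    This simplified architecture is an assumption; the paper's experts are
    neural networks with hidden units.
    """
    W_e = params[:n_experts * n_in].reshape(n_experts, n_in)  # expert weights
    W_g = params[n_experts * n_in:].reshape(n_experts, n_in)  # gating weights
    g = softmax(W_g @ x)          # gate assigns each expert a responsibility
    return g @ (W_e @ x)          # gate-weighted combination of expert outputs

def mse(params, X, y, n_experts, n_in):
    preds = np.array([moe_predict(params, x, n_experts, n_in) for x in X])
    return np.mean((preds - y) ** 2)

# Toy regression task whose target switches between two linear maps,
# a natural divide-and-conquer setting for two experts.
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] > 0, X @ np.array([2.0, -1.0]), X @ np.array([-1.0, 3.0]))

n_experts, n_in = 2, 2
dim = 2 * n_experts * n_in         # expert + gate parameters, flattened
n_particles, iters = 30, 200
w, c1, c2 = 0.7, 1.5, 1.5          # inertia and acceleration coefficients (assumed)

pos = rng.normal(0, 0.5, (n_particles, dim))
vel = np.zeros((n_particles, dim))
pbest = pos.copy()
pbest_f = np.array([mse(p, X, y, n_experts, n_in) for p in pos])
gbest = pbest[np.argmin(pbest_f)].copy()

for _ in range(iters):
    # Standard PSO velocity update: inertia + pull toward personal and global bests.
    r1 = rng.random((n_particles, dim))
    r2 = rng.random((n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    f = np.array([mse(p, X, y, n_experts, n_in) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)].copy()

print("best training MSE:", pbest_f.min())
```

Because PSO treats the loss as a black box, the same loop works unchanged if the linear experts are replaced by networks with hidden units; only `moe_predict` and `dim` change, which is consistent with the abstract's interest in varying the number of experts and hidden units.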