Evaluation of an optimal design method for a multilayer perceptron by using the design of experiments

Abstract

We evaluated the performance of an optimal design method for a multilayer perceptron (MLP) by using the design of experiments (DOE). In our previous work, we proposed an optimal design method for MLPs in order to determine the optimal values of such parameters as the numbers of neurons in the hidden layers and the learning rates. In this article, we evaluate the performance of the proposed design method through a comparison with a genetic algorithm (GA)-based design method. We target an optimal design of MLPs with six layers. We also evaluate the proposed design method in terms of the amount of computation required for optimization. Through the above-mentioned evaluation and analysis, we aim at improving the proposed design ...

Introduction

[...] algorithm. However, the learning algorithm is only one factor. Another factor is the design of the MLP. Before training, the number of layers, the numbers of neurons in the hidden layers, and training conditions such as the learning rates must be determined. Conventional design methods include trial-and-error, brute-force search, network construction, and pruning. It is difficult to apply these methods to MLPs with many layers because the space of their design parameters becomes huge. Another problem is that the approximation accuracy of MLPs with the same design parameters varies because random numbers are used for the initial values of the connection weights. A design method that includes statistical analysis is therefore needed for MLPs.

In previous work [3,4], we proposed a design method using the design of experiments (DOE) [5], which features efficient experiments with an orthogonal array and quantitative analysis with analysis of variance (ANOVA). We demonstrated that an optimal design of five-layer MLPs could be obtained using our design method. However, a problem remains in evaluating the proposed design method: a quantitative comparison between it and other design methods. It is clear that the proposed design method is better than trial-and-error or brute-force approaches. We therefore focused on a genetic algorithm (GA)-based design method, which is a nonlinear optimization technique and is expected to be as efficient as the proposed method.

Here, we evaluate the performance of the proposed design method through a comparison with the GA-based design method. We target an optimal design of MLPs with six layers. When only a few design parameters are involved, the difference between the proposed design method and other methods is small. Our previous work implied that the accuracy of MLPs with more layers would become higher for the same training data; therefore, we focus on MLPs with six layers. Moreover, we evaluate the proposed design method in terms of the amount of computation required for optimization. We use various types of training data because the performance of a trained MLP depends on the training data. We refer to the UCI machine learning ...
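The core of the DOE-based approach, efficient experiments with a two-level orthogonal array followed by an ANOVA-style analysis of main effects, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the L8 array is standard, but the factor names, their two levels, and the accuracy() function (a stand-in for actually training an MLP and measuring its test accuracy) are assumptions made for the example.

```python
# Minimal DOE sketch: assign design factors to columns of an L8 orthogonal
# array, run one "experiment" per row, and compare mean responses per level.
# accuracy() is a hypothetical stand-in for training and testing an MLP.

# L8 orthogonal array: 8 runs, up to 7 two-level factors (levels 0/1).
L8 = [
    [0, 0, 0, 0, 0, 0, 0],
    [0, 0, 0, 1, 1, 1, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 1, 1, 1, 1, 0, 0],
    [1, 0, 1, 0, 1, 0, 1],
    [1, 0, 1, 1, 0, 1, 0],
    [1, 1, 0, 0, 1, 1, 0],
    [1, 1, 0, 1, 0, 0, 1],
]

# Illustrative design factors with two levels each; assigned to independent
# columns 0, 1, and 3 of the array. The remaining columns are unused here.
FACTORS = {
    "hidden_neurons": (16, 64),     # level 0 -> 16, level 1 -> 64
    "learning_rate":  (0.01, 0.1),
    "epochs":         (50, 200),
}
COLUMNS = {"hidden_neurons": 0, "learning_rate": 1, "epochs": 3}

def accuracy(hidden_neurons, learning_rate, epochs):
    """Hypothetical stand-in for 'train an MLP and measure test accuracy'."""
    return (0.7 + 0.002 * hidden_neurons
                - 0.5 * learning_rate + 0.0002 * epochs)

def main_effects(array, factors, columns, response):
    """Mean response at each level of each factor (ANOVA main effects)."""
    results = []
    for row in array:
        setting = {n: factors[n][row[columns[n]]] for n in factors}
        results.append((row, response(**setting)))
    effects = {}
    for n in factors:
        means = []
        for level in (0, 1):
            ys = [y for row, y in results if row[columns[n]] == level]
            means.append(sum(ys) / len(ys))
        effects[n] = means  # [mean at level 0, mean at level 1]
    return effects

effects = main_effects(L8, FACTORS, COLUMNS, accuracy)
for name, (m0, m1) in effects.items():
    print(f"{name}: level0={m0:.4f} level1={m1:.4f} effect={m1 - m0:+.4f}")
```

Because the array's columns are pairwise balanced, every main effect is estimated from all eight runs at once; with seven two-level factors, the same eight runs would replace the 2^7 = 128 runs of a full factorial, which is the efficiency the orthogonal array provides.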
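The GA-based design method used for comparison can likewise be sketched. Again, this is only an illustration under assumptions: the genes, their bounds, the selection, crossover, and mutation operators, and the fitness() stand-in (in place of MLP training) are choices made for the example, not the settings of the compared method.

```python
# Minimal GA sketch: real-valued genes for the same illustrative design
# parameters, truncation selection with elitism, uniform crossover, and
# Gaussian mutation. fitness() is a hypothetical stand-in for trained-MLP
# accuracy, not the paper's objective.
import random

random.seed(0)  # deterministic run for the sketch

BOUNDS = {"hidden_neurons": (16, 64),
          "learning_rate":  (0.01, 0.1),
          "epochs":         (50, 200)}

def fitness(ind):
    """Hypothetical stand-in for trained-MLP test accuracy."""
    return (0.7 + 0.002 * ind["hidden_neurons"]
                - 0.5 * ind["learning_rate"] + 0.0002 * ind["epochs"])

def random_individual():
    return {k: random.uniform(lo, hi) for k, (lo, hi) in BOUNDS.items()}

def crossover(a, b):
    # Uniform crossover: each gene taken from either parent.
    return {k: random.choice((a[k], b[k])) for k in a}

def mutate(ind, rate=0.2):
    # Gaussian perturbation per gene, clipped back into bounds.
    child = dict(ind)
    for k, (lo, hi) in BOUNDS.items():
        if random.random() < rate:
            child[k] = min(hi, max(lo, child[k] + random.gauss(0, 0.1 * (hi - lo))))
    return child

def evolve(pop_size=20, generations=30):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # truncation selection
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children                 # elitist replacement
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

In this toy setting the GA spends 20 x 30 = 600 fitness evaluations, where each evaluation would be a full MLP training run, against the eight runs of the L8 design; this difference in the amount of computation is exactly the kind of comparison the evaluation targets.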