The backpropagation algorithm is one of the most widely used tools for training artificial neural networks. However, it may be very slow in some practical applications, and many techniques have been proposed to speed it up and allow its use in an even broader range of applications. Although the backpropagation algorithm has been in use for decades, we present here a set of computational results suggesting that, when the traditional sigmoid functions are replaced by the bihyperbolic function, the backpropagation algorithm performs better. To the best of our knowledge, this finding has not previously been published in the open literature. The efficiency and discrimination capacity of the proposed methodology are demonstrated through a set of computational experiments and compared with results on traditional benchmark problems from the literature.

The activation function embedded in the neurons of the network is one of the factors believed to be responsible for slowing down training. Because the learning process is iterative, the activation function and its derivative are evaluated a very large number of times, so any slowdown in this calculation slows down the whole process. Shafie et al. (2012) showed that another cause may be the saturation of the activation functions used in the hidden and output layers: when a unit saturates, the gradient takes very small values even when the output error is still high.

There are many versions of the backpropagation algorithm that aim to improve its performance, such as steepest descent backpropagation, momentum backpropagation, variable learning rate backpropagation, resilient backpropagation, conjugate gradient backpropagation, quasi-Newton methods, and the Levenberg-Marquardt algorithm. These variants have brought some advances but, as Wilamowski (2013) notes, even the very powerful and fast Levenberg-Marquardt algorithm requires the inversion of a matrix whose size is proportional to the number of patterns; because of this constraint, it can be applied only to small problems. Speeding up backpropagation therefore remains an open question in the field. The proposal presented in this paper uses a new activation function, the bihyperbolic function, which has the characteristics required of an activation function and is faster to compute than other sigmoid functions (Xavier, 2005); a brief numerical sketch contrasting it with the logistic function is given at the end of this section.

This paper is organized as follows. In Section 2, we briefly describe the structure of an ANN and the backpropagation algorithm. In Section 3, we present tests executed to compare the performance of the proposed activation function, the bihyperbolic function, with the most widely used one, the logistic function. Convergence and generalization, as well as processing time, were evaluated, and the results were compared with classifiers presented in the literature. Conclusions are drawn in Section 4, where we discuss the characteristics of this work that led to these encouraging results.
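To make the comparison concrete, the minimal Python sketch below contrasts the logistic function with a symmetric form of the bihyperbolic function. The parameterization f(v) = sqrt(λ²(v + 1/(4λ))² + τ²) − sqrt(λ²(v − 1/(4λ))² + τ²) + 1/2 is an assumption based on the form usually attributed to Xavier (2005), and the parameter values λ = 1 and τ = 0.1 are illustrative only; this is a sketch, not necessarily the exact formulation evaluated in the experiments of Section 3.

```python
import numpy as np

def logistic(v):
    """Classical logistic sigmoid: 1 / (1 + exp(-v))."""
    return 1.0 / (1.0 + np.exp(-v))

def logistic_prime(v):
    """Derivative of the logistic; it tends to 0 as |v| grows (saturation)."""
    s = logistic(v)
    return s * (1.0 - s)

def bihyperbolic(v, lam=1.0, tau=0.1):
    """Symmetric bihyperbolic function (assumed form, after Xavier, 2005).

    f(v) = sqrt(lam^2 (v + 1/(4 lam))^2 + tau^2)
         - sqrt(lam^2 (v - 1/(4 lam))^2 + tau^2) + 1/2

    Like the logistic, it maps the real line onto (0, 1) with f(0) = 1/2,
    but it requires only square roots, not exponentials.
    """
    a = 1.0 / (4.0 * lam)
    return (np.sqrt(lam**2 * (v + a)**2 + tau**2)
            - np.sqrt(lam**2 * (v - a)**2 + tau**2) + 0.5)

def bihyperbolic_prime(v, lam=1.0, tau=0.1):
    """Analytic derivative of the assumed bihyperbolic form."""
    a = 1.0 / (4.0 * lam)
    return (lam**2 * (v + a) / np.sqrt(lam**2 * (v + a)**2 + tau**2)
            - lam**2 * (v - a) / np.sqrt(lam**2 * (v - a)**2 + tau**2))

if __name__ == "__main__":
    v = np.array([-6.0, -2.0, 0.0, 2.0, 6.0])
    print("logistic      ", logistic(v))
    print("bihyperbolic  ", bihyperbolic(v))
    # At |v| = 6 the logistic derivative is already about 0.0025, which
    # illustrates the saturation effect discussed above.
    print("logistic'     ", logistic_prime(v))
    print("bihyperbolic' ", bihyperbolic_prime(v))
```

Because the bihyperbolic function is built from square roots rather than exponentials, each evaluation is typically cheaper than a call to exp on common hardware, which is the source of the speed advantage claimed above; in the assumed form, λ and τ also give direct control over the slope of the function around the origin.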
ANNs

ANNs work by building connections between the mathematical processing units called neurons. Knowledge is coded in the network by the ...