“…As shown in [22], the computational complexity R = R(N) is the number of floating-point addition/multiplication operations, where N is the number of elements in the input data vector. For one input data element (N = 1), the computational complexity of the recurrent NN training algorithm [22] is: (i) R_I = 1741 operations for computing the recurrent NN output value according to (1)-(2); (ii) R_II = 124 operations for computing the sum-squared error (3)-(4) and the adaptive learning rates for the neurons of the output (5) and hidden (6) layers; (iii) R_III = 1793 operations for modifying the synapses and thresholds of all layers according to (7)-(12) at the backward information-processing stage.…”
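The per-element counts quoted above can be combined into a total cost estimate. The sketch below is illustrative only: the names `R_I`, `R_II`, `R_III` and the assumption that the cost scales linearly with the number of input elements are ours, not part of [22].

```python
# Hedged sketch of the training-cost accounting described above.
# The constants are the per-element operation counts quoted from [22];
# linear scaling in the input size is an assumption for illustration.

R_I = 1741    # forward pass: recurrent NN output value, Eqs. (1)-(2)
R_II = 124    # sum-squared error (3)-(4) and adaptive learning rates (5)-(6)
R_III = 1793  # synapse/threshold updates for all layers, Eqs. (7)-(12)

def training_complexity(n_elements: int) -> int:
    """Total floating-point add/multiply operations for one training
    pass over an input vector of n_elements, assuming the per-element
    cost is simply repeated for each element."""
    return n_elements * (R_I + R_II + R_III)

print(training_complexity(1))  # 3658 operations for a single element
```

With N = 1 this gives 1741 + 124 + 1793 = 3658 operations per training pass, dominated roughly equally by the forward pass and the weight-update stage.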