We present an exact analysis of learning a rule by on-line gradient descent in a two-layered neural network with adjustable hidden-to-output weights (backpropagation of error). Results are compared with the training of networks having the same architecture but fixed weights in the second layer.
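The training scenario described above can be sketched in a few lines: a student two-layer network with adjustable hidden-to-output weights is trained by on-line gradient descent on examples labelled by a fixed teacher network of the same architecture. The network sizes, tanh activations, learning rate, and 1/N step scalings below are illustrative assumptions for a minimal simulation, not the paper's exact setup.

```python
import numpy as np

def online_backprop(N=20, K=2, steps=5000, eta=0.1, seed=0):
    """On-line gradient descent for a soft-committee-like two-layer student
    with adjustable second-layer weights, trained on random teacher examples."""
    rng = np.random.default_rng(seed)
    # fixed teacher: K hidden units, output weights v_teacher
    B = rng.normal(size=(K, N)) / np.sqrt(N)
    v_teacher = rng.normal(size=K)
    # student: same architecture, both layers adaptive
    J = rng.normal(size=(K, N)) / np.sqrt(N)
    w = rng.normal(size=K)
    errs = []
    for _ in range(steps):
        xi = rng.normal(size=N)                      # random input example
        tau = v_teacher @ np.tanh(B @ xi)            # teacher output
        h = J @ xi
        g = np.tanh(h)
        sigma = w @ g                                # student output
        delta = sigma - tau
        # gradient steps on (1/2) delta^2; 1/N scaling is a conventional choice
        w -= (eta / N) * delta * g
        J -= (eta / N) * np.outer(delta * w * (1.0 - g**2), xi)
        errs.append(0.5 * delta**2)
    return errs
```

Averaging the instantaneous error over a window of steps gives a rough proxy for the learning curve computed analytically in the paper.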
We study on-line learning of a linearly separable rule with a simple perceptron. Training utilizes a sequence of uncorrelated, randomly drawn N-dimensional input examples. In the thermodynamic limit the generalization error after training with P such examples can be calculated exactly. For the standard perceptron algorithm it decreases like (N/P)^{1/3} for large P/N, in contrast to the faster (N/P)^{1/2} behaviour of so-called Hebbian learning. Furthermore, we show that a specific parameter-free on-line scheme, the AdaTron algorithm, gives an asymptotic (N/P) decay of the generalization error. This coincides (up to a constant factor) with the bound for any training process based on random examples, including off-line learning. Simulations confirm our results.
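The three on-line rules compared above can be simulated directly: a teacher vector B defines the linearly separable rule, and the student J is updated once per random example. The initialization, step scalings, and the particular AdaTron form used here are illustrative assumptions; the generalization error is the standard angle-based overlap formula for the perceptron.

```python
import numpy as np

def generalization_error(J, B):
    """eps = arccos(rho)/pi, with rho the normalized student-teacher overlap."""
    rho = (J @ B) / (np.linalg.norm(J) * np.linalg.norm(B))
    return np.arccos(np.clip(rho, -1.0, 1.0)) / np.pi

def train(rule="hebb", N=100, P=5000, seed=1):
    rng = np.random.default_rng(seed)
    B = rng.normal(size=N); B /= np.linalg.norm(B)   # teacher vector
    J = rng.normal(size=N) / np.sqrt(N)              # random initial student
    for _ in range(P):
        xi = rng.normal(size=N)                      # random example
        tau = np.sign(B @ xi)                        # teacher label
        h = J @ xi
        if rule == "hebb":                           # update on every example
            J += tau * xi / np.sqrt(N)
        elif rule == "perceptron":                   # update on errors only
            if np.sign(h) != tau:
                J += tau * xi / np.sqrt(N)
        elif rule == "adatron":                      # parameter-free: remove the
            if np.sign(h) != tau:                    # wrong-signed projection
                J -= (h / N) * xi
    return generalization_error(J, B)
```

At P/N = 50 the simulated errors already reflect the ordering of the analytic learning curves, with Hebbian learning ahead of the perceptron algorithm at this scale and AdaTron decaying fastest asymptotically.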
Abstract. We introduce and discuss the application of statistical physics concepts in the context of on-line machine learning processes. The consideration of typical properties of very large systems makes it possible to perform averages over the randomness contained in the sequence of training data. It yields an exact mathematical description of the training dynamics in model scenarios. We present the basic concepts and results of the approach in terms of several examples, including the learning of linearly separable rules, the training of multilayer neural networks, and Learning Vector Quantization.
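Of the examples listed, Learning Vector Quantization has the simplest on-line update: the prototype closest to the current example is attracted if its class matches and repelled otherwise (the LVQ1 rule). A minimal sketch on a toy stream of two Gaussian clusters follows; the cluster geometry, learning rate, and one-prototype-per-class setup are illustrative assumptions.

```python
import numpy as np

def lvq1(eta=0.05, steps=3000, seed=2):
    """On-line LVQ1 with one prototype per class on a two-cluster toy stream."""
    rng = np.random.default_rng(seed)
    # class-conditional means of the toy data (illustrative choice)
    means = {+1: np.array([1.0, 0.0]), -1: np.array([-1.0, 0.0])}
    # one prototype per class, random initialization
    w = {+1: rng.normal(size=2), -1: rng.normal(size=2)}
    for _ in range(steps):
        y = rng.choice([+1, -1])                     # random class label
        x = means[y] + rng.normal(size=2)            # noisy example
        # winner-takes-all: closest prototype determines the update
        win = min(w, key=lambda c: np.linalg.norm(x - w[c]))
        sign = 1.0 if win == y else -1.0             # attract correct, repel wrong
        w[win] += sign * eta * (x - w[win])
    return w
```

After training, the prototypes separate along the axis that distinguishes the two clusters, which is the qualitative behaviour the statistical-physics analysis describes on average.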