We demonstrate that the fraction of pattern sets that can be stored in single- and hidden-layer perceptrons exhibits finite size scaling. This feature allows one to estimate the critical storage capacity αc from simulations of relatively small systems. We illustrate this approach by determining αc, together with the finite size scaling exponent ν, for storing Gaussian patterns in committee and parity machines with binary couplings and up to K = 5 hidden units.

PACS numbers: 87.10.+e, 64.60.Cn, 05.50.+q, 02.70.Lq

Finite size scaling (FSS) has proven to be a powerful method for analyzing phase transitions, which occur rigorously only in the thermodynamic limit, using simulations of systems of finite size [1]. In particular, it has become the prime method for determining numerical values of critical coupling parameters and exponents [2]. Phase transitions are known to occur not only in condensed matter [3] and percolation systems [2], but also in random graphs [4], neural networks [5], and in algorithmic problems like search [6] and the satisfiability of random Boolean expressions [7]. Heuristic derivations of FSS rely on the divergence of a correlation length at a critical point in the infinite system [2,8]. However, Kirkpatrick and Selman [9] have recently demonstrated that FSS can also be used efficiently in problems without any intrinsic length scales, like the connectivity of random graphs and the satisfiability of random Boolean expressions. Abstract neural networks [5] are another class of systems without an intrinsic length scale, and we will show in this contribution that FSS occurs at the transition from storable to unstorable pattern set sizes, and that it provides a powerful computational method for determining critical storage capacities.

We will concentrate on particular feed-forward networks of the perceptron class, namely multi-layer perceptrons with N input neurons, K hidden units, and a regular tree-like connectivity (N mod K = 0), see Fig. 1.
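To make this architecture concrete, the following minimal sketch (Python with NumPy; the function machine_output and all variable names are our own, not taken from the paper) computes the output of such a tree-structured machine with binary couplings on a Gaussian input pattern, for both the committee rule (majority vote over the hidden units) and the parity rule (product of the hidden units):

import numpy as np

def machine_output(J, xi, machine="committee"):
    """Output of a tree-structured two-layer perceptron.
    J  : (K, N//K) array of binary couplings (+1 or -1); each hidden
         unit sees its own disjoint block of N//K inputs (tree structure).
    xi : (N,) Gaussian input pattern.
    """
    K = J.shape[0]
    fields = (J * xi.reshape(K, -1)).sum(axis=1)  # local field of each hidden unit
    sigma = np.sign(fields)                       # binary hidden-unit outputs
    if machine == "committee":
        return np.sign(sigma.sum())               # majority vote (odd K avoids ties)
    return np.prod(sigma)                         # parity machine: product of outputs

# Example with K = 3 hidden units and N = 12 inputs (N mod K = 0):
rng = np.random.default_rng(0)
J = rng.choice([-1, 1], size=(3, 4))
xi = rng.standard_normal(12)
print(machine_output(J, xi, "committee"), machine_output(J, xi, "parity"))

A set of p = αN input-output pairs is then called storable if some assignment of the binary couplings reproduces all p target outputs; the fraction of random pattern sets that are storable is the quantity whose finite size scaling we exploit.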
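The FSS analysis itself rests on the standard ansatz that the storable fraction P_N(α), measured at load α = p/N, collapses onto a single curve when plotted against the scaled variable x = (α − αc) N^(1/ν). As an illustrative sketch (all names, the toy scaling function, and the parameter values below are our own assumptions, not results from the paper), αc and ν can be estimated by minimizing the spread between the rescaled curves:

import numpy as np

def collapse_cost(alpha_c, nu, alphas, P_by_N):
    """Mean squared spread between the curves P_N(alpha) after rescaling
    the load axis to x = (alpha - alpha_c) * N**(1/nu); a perfect
    finite size scaling collapse gives a cost of zero."""
    grid = np.linspace(-1.0, 1.0, 50)  # scaled axis common to all N
    curves = [np.interp(grid, (alphas - alpha_c) * N ** (1.0 / nu), P)
              for N, P in P_by_N.items()]
    return float(np.mean(np.var(curves, axis=0)))

# Toy data from an invented scaling function g(x) = 1/(1 + exp(2x)),
# with placeholder values alpha_c = 1.0 and nu = 2.0 (not the paper's results):
alphas = np.linspace(0.8, 1.2, 21)
P_by_N = {N: 1.0 / (1.0 + np.exp(2.0 * (alphas - 1.0) * N ** 0.5))
          for N in (32, 64, 128)}

# Crude grid search for the parameters giving the best collapse; in a real
# study P_N(alpha) would come from averaging over many random pattern sets.
best = min(((collapse_cost(ac, nu, alphas, P_by_N), ac, nu)
            for ac in np.linspace(0.9, 1.1, 41)
            for nu in np.linspace(1.0, 3.0, 41)),
           key=lambda t: t[0])
print("alpha_c ~ %.3f, nu ~ %.2f" % (best[1], best[2]))

With measured data one would replace the toy curves by empirical storable fractions and, as is usual in FSS analyses, check the stability of the fitted αc and ν against the range of α and the set of system sizes included.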