This is the reason for the diversity of terms used to denote neural networks: connectionist models, distributed systems, neuromorphic systems, recognition machines, associative memory.

Suppose that the parameter ρ satisfies 0 < ρ < 1/2. The equilibrium state v is stable and possesses an attraction domain of radius ρn if, for any vector x such that |v − x| < ρn, the network started from the input vector x falls into the equilibrium state v. Here |·| denotes the Hamming distance.

Definition 3. Absolute stable capacity p_as(n, ρ). For any ε > 0, provided that m < p_as(n, ρ), the probability P{all m vectors are equilibrium states with attraction domains of radius ρn} > 1 − ε − o(1).

Definition 4. Relative stable capacity p_rs(n, ρ). For any ε > 0, provided that m < p_rs(n, ρ), the mean M{the number of vectors from the set that correspond to equilibrium states with attraction domains of radius ρn} > m(1 − ε − o(1)).

For nonrecurrent neural networks, the notion of equilibrium states loses its meaning, and the capacity is defined as the maximum number of input–output pairs that can be implemented by the model considered. In this case, the same types of capacity can be defined as for recurrent networks.

Suppose that H_n is a class of neural networks with n inputs from the region X_n and a single output from the region Y_n = {−1, 1}. We also assume that VC-dim(H_n) = d_n and that m samples of input and output signals are chosen independently according to the distribution D_n on X_n × Y_n.

Definition 5. The sequence p_l(n) is referred to as the lower capacity of the class H_n if, for any arbitrarily small ε > 0 and n → ∞, for any distribution D_n, the probability P{H_n implements all the specified relations between inputs and outputs} → 1 provided that m < (1 − ε) p_l(n).

Definition 6.
The sequence p_u(n) is referred to as the upper capacity of the class H_n if, for any arbitrarily small ε > 0 and n → ∞, for any distribution D_n, the probability P{H_n implements all the specified relations between inputs and outputs} → 0 provided that m > (1 + ε) p_u(n).

Definition 7. The sequence p(n) is called a capacity if it simultaneously satisfies the definitions of the lower and upper capacities.

Stable capacities for nonrecurrent networks can be defined by analogy with recurrent networks. The notion of capacity for nonrecurrent networks is closely associated with the Vapnik–Chervonenkis dimension.
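The connection to the Vapnik–Chervonenkis dimension can be made concrete by computing VC-dim by brute force for a toy class not drawn from the text: one-dimensional threshold functions h_t(x) = sign(x − t). The sketch below enumerates all dichotomies such thresholds realize on a finite point set and checks whether some subset is shattered; the function names are illustrative choices.

```python
from itertools import combinations

def threshold_labelings(points):
    """All dichotomies realizable on `points` by h_t(x) = +1 if x >= t else -1."""
    pts = sorted(points)
    # Candidate thresholds: below all points, between each adjacent pair, above all.
    cands = [pts[0] - 1] + [(a + b) / 2 for a, b in zip(pts, pts[1:])] + [pts[-1] + 1]
    return {tuple(1 if x >= t else -1 for x in points) for t in cands}

def shatters(points):
    """A set is shattered when every one of the 2^m labelings is realizable."""
    return len(threshold_labelings(points)) == 2 ** len(points)

def vc_dim(sample, max_d=4):
    """Largest d such that some d-element subset of `sample` is shattered."""
    d = 0
    for k in range(1, max_d + 1):
        if any(shatters(list(s)) for s in combinations(sample, k)):
            d = k
    return d

print(vc_dim([0.0, 1.0, 2.0, 3.0]))  # prints 1: thresholds shatter any single point but no pair
```

A single point admits both labelings (put t on either side of it), while for two points a < b the labeling (+1, −1) is never realizable, so VC-dim = 1, matching the threshold-behavior intuition behind Definitions 5 and 6.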
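The stability and attraction-domain notions in Definitions 3 and 4 can also be illustrated numerically with a toy recurrent network. The sketch below uses a Hopfield-type network with the Hebbian storage rule and a synchronous sign update; these are standard illustrative choices, not prescribed by the text, and the parameters (n = 200 neurons, m = 10 patterns, corruption radius ρ = 0.1) are assumptions chosen to keep the load m/n well below saturation.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 200, 10                         # n neurons, m stored ±1 patterns
patterns = rng.choice([-1, 1], size=(m, n))

# Hebbian weight matrix; the diagonal is zeroed, as usual for Hopfield nets.
W = (patterns.T @ patterns) / n
np.fill_diagonal(W, 0)

def hamming(u, v):
    """Hamming distance between two ±1 vectors."""
    return int(np.sum(u != v))

def recall(x, steps=50):
    """Iterate the synchronous sign update until a fixed point (or a step cap)."""
    for _ in range(steps):
        x_new = np.sign(W @ x)
        x_new[x_new == 0] = 1          # break ties toward +1
        if np.array_equal(x_new, x):   # reached an equilibrium state
            break
        x = x_new
    return x

# Corrupt a stored pattern inside a Hamming ball of radius rho*n around it.
v = patterns[0].copy()
rho = 0.1
flip = rng.choice(n, size=int(rho * n), replace=False)
x = v.copy()
x[flip] *= -1

print(hamming(v, x))           # distance before recall (rho*n = 20)
print(hamming(v, recall(x)))   # distance after recall
```

At this low load the dynamics pull the corrupted vector back toward v, i.e. v behaves as an equilibrium state whose attraction domain contains the whole ρn-ball, which is exactly the event whose probability the stable capacities bound.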