It is known that additive neural networks with a symmetric interconnection matrix are completely stable, i.e., each trajectory converges toward some equilibrium point. This paper addresses the fundamental question of the robustness of complete stability of additive neural networks with respect to small perturbations of the nominal symmetric interconnections. It is shown that in the general case, complete stability is not robust. More precisely, the paper considers a class of neural networks and gives a necessary and sufficient condition for the existence of Hopf bifurcations (HBs) at the equilibrium point at the origin, arbitrarily close to symmetry. Such HBs give rise to stable limit cycles and hence cause the loss of complete stability. Furthermore, the paper highlights situations where the HBs are particularly critical, in the sense that the amplitude of the limit cycles is very sensitive to errors due to tolerances in the electronic implementation of the neuron interconnections. It is shown that this sensitivity depends crucially on the neuron nonlinearity, and that it is also significantly influenced by the features of the interconnection matrix and by the network dimension. Finally, limitations of the obtained results are discussed and directions for future work are indicated.
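The convergence property stated at the outset can be illustrated numerically. The sketch below assumes the standard additive (Hopfield-type) model dx_i/dt = -x_i + Σ_j w_ij tanh(x_j), since the abstract does not give the network equations; the weight matrix, initial state, and integration parameters are illustrative choices, not taken from the paper. With a symmetric W, the trajectory settles at an equilibrium, consistent with complete stability; the paper's result is that arbitrarily small asymmetric perturbations of W can destroy this behavior through Hopf bifurcations.

```python
import math
import random

def simulate(W, x0, dt=0.01, steps=20000):
    """Forward-Euler integration of the assumed additive model
    dx_i/dt = -x_i + sum_j W[i][j] * tanh(x_j)."""
    x = list(x0)
    n = len(x)
    for _ in range(steps):
        g = [math.tanh(xj) for xj in x]
        dx = [-x[i] + sum(W[i][j] * g[j] for j in range(n))
              for i in range(n)]
        x = [x[i] + dt * dx[i] for i in range(n)]
    # magnitude of the vector field at the final state:
    # near zero means the trajectory has converged to an equilibrium
    speed = max(abs(d) for d in dx)
    return x, speed

# symmetric 3-neuron interconnection matrix (illustrative values)
W = [[0.0, 1.2, -0.7],
     [1.2, 0.0, 0.5],
     [-0.7, 0.5, 0.0]]

random.seed(0)
x0 = [random.uniform(-1.0, 1.0) for _ in range(3)]
x_final, speed = simulate(W, x0)
print(speed)  # near zero: trajectory has settled at an equilibrium
```

Replacing `W` with a slightly asymmetric perturbation (e.g. adding a small skew-symmetric term) is the kind of scenario the paper analyzes; near the critical parameter values the same integration would instead approach a limit cycle, and `speed` would remain bounded away from zero.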