In this paper, we design an optimal learning rule for Hopfield associative memories (HAMs) based on three well-recognized criteria: every desired attractor must be made not only an isolated stable state but also asymptotically stable, and the spurious stable states should be as few as possible. These criteria are crucial for constructing a satisfactory HAM. We first analyze the real cause of the unsatisfactory performance of the Hebb rule and many other existing learning rules designed for HAMs, and then show that the three criteria in fact amount to widely expanding the basin of attraction around each desired attractor. One effective way to widely expand the basins of attraction of all desired attractors is to appropriately dig a steep kernel basin of attraction around each of them. To this end, we introduce a concept called Hamming-stability. Surprisingly, we find that Hamming-stability of all desired attractors can be reduced to a moderately expansive linear-separability condition at each neuron, and thus the well-known Rosenblatt perceptron learning rule is exactly the right one for learning Hamming-stability. Extensive and systematic experiments were conducted, convincingly showing that the proposed perceptron Hamming-stability learning rule takes good care of all three optimality criteria.
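The per-neuron reduction described above can be illustrated in code: if stability of every desired pattern comes down to a linear-separability condition at each neuron, then each neuron's incoming weights can be trained with a perceptron-style rule that enforces a positive stability margin on the local field. The sketch below is a minimal illustration of this idea, not the paper's algorithm; the function names and the `margin`, `lr`, and `epochs` parameters are illustrative assumptions.

```python
import numpy as np

def perceptron_hopfield_train(patterns, margin=1.0, lr=0.1, epochs=200):
    """Train Hopfield weights so each stored bipolar (+1/-1) pattern is a
    stable fixed point, via a per-neuron perceptron rule with a margin.
    Sketch only: hyperparameter values are illustrative, not from the paper."""
    P = np.asarray(patterns, dtype=float)   # shape (num_patterns, n)
    n = P.shape[1]
    W = np.zeros((n, n))
    for _ in range(epochs):
        updated = False
        for s in P:
            h = W @ s                        # local fields at all neurons
            for i in range(n):
                # require s_i * h_i > margin: the linear-separability
                # condition at neuron i, with a safety margin
                if s[i] * h[i] <= margin:
                    W[i] += lr * s[i] * s    # perceptron update on row i
                    W[i, i] = 0.0            # keep zero self-coupling
                    updated = True
        if not updated:                      # all margin constraints met
            break
    return W

def recall(W, x, steps=20):
    """Synchronous sign dynamics from a probe state until a fixed point."""
    x = np.asarray(x, dtype=float)
    for _ in range(steps):
        x_new = np.sign(W @ x)
        x_new[x_new == 0] = 1.0              # break ties toward +1
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x
```

Because the update only fires while some pattern violates its margin at some neuron, training stops exactly when every desired pattern is a fixed point of the dynamics with margin to spare, which is what widens its basin of attraction relative to the plain Hebb rule.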