Existing deep learning image recognition models rely heavily on large amounts of data labeled manually by professionals, which is difficult to acquire, costly in terms of human labeling, and prone to subjective bias. In addition, datasets in practical scenarios are often imbalanced, skewed heavily toward majority classes, even though the rare minority classes are frequently the more meaningful ones. Most existing models are nevertheless biased toward the majority classes during fitting, yielding insufficient recognition accuracy on minority classes and significantly hindering the practical application of deep learning to image recognition. This paper introduces a new semi-supervised generative adversarial network, BASA-GAN, which enables fair training of each class in an imbalanced dataset and improves both the attention paid to and the recognition rate of minority classes. First, we introduce a balanced generator module (SAG) that uses an autoencoder to learn features from all classes of the original labeled dataset and produces a balanced dataset through reconstruction; the reconstructed dataset is then fed into the generator so that every class is trained fairly. Within SAG we design a completely new class-balanced loss that encourages the autoencoder to produce a more balanced latent-space representation across classes. Next, by incorporating a self-attention mechanism into both the SAG module and the discriminator, we enhance the quality of the synthesized images and the discriminative power of the discriminator. Finally, we introduce spectral normalization into the semi-supervised BASA-GAN model to address instability during GAN training.
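Spectral normalization, as named above, constrains each weight matrix of the discriminator to have spectral norm (largest singular value) approximately 1, which enforces a Lipschitz constraint and stabilizes GAN training. As a minimal illustrative sketch, not the paper's implementation, the following NumPy code estimates the largest singular value by power iteration and rescales the matrix by it; the function name and iteration count are illustrative choices:

```python
import numpy as np

def spectral_normalize(W, n_iters=30):
    """Rescale W by an estimate of its largest singular value.

    The estimate comes from power iteration: repeatedly map a random
    vector through W^T and W, normalizing each time, so that u and v
    converge to the top left/right singular vectors of W.
    """
    rng = np.random.default_rng(0)
    u = rng.standard_normal(W.shape[0])
    for _ in range(n_iters):
        v = W.T @ u
        v /= np.linalg.norm(v)
        u = W @ v
        u /= np.linalg.norm(u)
    sigma = u @ W @ v          # estimated largest singular value
    return W / sigma

# Example: normalize a random "weight matrix" of a discriminator layer.
W = np.random.default_rng(1).standard_normal((8, 4))
W_sn = spectral_normalize(W)   # spectral norm of W_sn is approximately 1
```

In practice, deep learning frameworks apply this per layer and carry the power-iteration vectors across training steps, so a single iteration per step suffices.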
Experimental results demonstrate that the SAG module achieves highly competitive performance on semi-supervised imbalanced image recognition tasks, not only improving recognition accuracy for minority classes but also achieving fair training for every class in the dataset.