Incremental learning requires a model to continuously learn new tasks without forgetting those it has already learned. However, when a deep learning model learns new tasks, it catastrophically forgets the tasks it learned before. Researchers have proposed methods to alleviate catastrophic forgetting, but these methods only consider extracting features related to previously learned tasks from the training samples, suppressing the extraction of features relevant to tasks not yet learned. As a result, when the model learns a new task incrementally, it must quickly learn to extract the features relevant to that task; this requires a significant change in the model's feature-extraction behavior, which increases the learning difficulty. The model is therefore caught in a dilemma: reduce the learning rate to retain existing knowledge, or increase the learning rate to learn new knowledge quickly. We believe that introducing self-supervised learning can alleviate this problem. Self-supervised learning methods obtain supervision signals by constructing various pseudo-labels. Because the features extracted under self-supervised learning are independent of the specific subtasks of incremental learning, incorporating the self-supervised signal into the incremental learning process can help the model learn. The self-supervised learning process provides a universal constraint, so that the model not only extracts features that are effective for the currently learned task but also extracts features suitable for tasks that have not yet been learned. This allows the network parameters, when a new task is learned incrementally, to iterate more smoothly and quickly to a point in the parameter space that satisfies the new and old tasks simultaneously. Fusing self-supervised and supervised learning signals significantly improves the adaptability of incremental learning without requiring additional labeled data. We verified, across different datasets and incremental learning models, that introducing self-supervised learning can significantly alleviate catastrophic forgetting in incremental learning algorithms based on sample replay.

INDEX TERMS incremental learning, self-supervised learning, deep learning
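To make the fused objective concrete, the following is a minimal sketch, not the authors' exact implementation, of one replay-based training step that adds a self-supervised loss to the supervised classification loss. The rotation-prediction pretext task, the network architecture, and the weighting factor `lam` are all illustrative assumptions standing in for whichever pseudo-label scheme and backbone a given incremental learner uses.

```python
# Sketch: fusing a supervised loss with a self-supervised pretext loss in a
# replay-based incremental learning step. All module and variable names are
# illustrative assumptions, not the paper's actual code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self, feat_dim=512, num_classes=100):
        super().__init__()
        self.backbone = nn.Sequential(            # stand-in feature extractor
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, feat_dim), nn.ReLU(),
        )
        self.cls_head = nn.Linear(feat_dim, num_classes)  # task classifier
        self.rot_head = nn.Linear(feat_dim, 4)            # pretext head: 0/90/180/270 degrees

def training_step(model, x, y, x_replay, y_replay, lam=1.0):
    # Mix current-task samples with replayed exemplars (sample replay).
    x_all = torch.cat([x, x_replay])
    y_all = torch.cat([y, y_replay])

    # Supervised signal: classification loss on the incremental-learning task.
    feats = model.backbone(x_all)
    sup_loss = F.cross_entropy(model.cls_head(feats), y_all)

    # Self-supervised signal: predict the rotation applied to each image.
    # The rotation index itself is the pseudo-label, so no extra annotation
    # is needed.
    rot_labels = torch.randint(0, 4, (x_all.size(0),), device=x_all.device)
    x_rot = torch.stack([torch.rot90(img, int(k), dims=(1, 2))
                         for img, k in zip(x_all, rot_labels)])
    ssl_loss = F.cross_entropy(model.rot_head(model.backbone(x_rot)), rot_labels)

    # Fused objective: task-specific signal plus task-agnostic constraint.
    return sup_loss + lam * ssl_loss
```

Because the pretext loss does not depend on which incremental subtask is being learned, it acts as the task-agnostic constraint described above: the backbone is pushed to keep extracting generally useful features rather than only those that discriminate the current task's classes.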