“…We observed that during learning, some neurons fail to specialize in a specific class, which degrades network performance and leads to false classifications. On the other hand, applying Progressive Pruning to the network reduces the average spiking frequency, which benefits the energy consumption of the network but may cause a frequency loss in multilayer networks, a known issue in Convolutional Spiking Neural Networks (CSNNs) as described in [24]. Based on these observations, we propose Dynamic Synaptic Weight Reinforcement (DSWR), which targets the synapses that are preserved and considered critical: it improves network performance by pushing neurons that have not specialized in a specific pattern or class to do so, while keeping the average spiking frequency of the network near the baseline.…”
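The excerpt does not give the DSWR update rule, but the idea it describes (reinforcing the preserved, critical synapses after pruning so that each neuron's input drive, and hence its spiking frequency, stays near the pre-pruning baseline) can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: the function name, the bounded multiplicative boost, and the use of a per-neuron baseline weight sum are all assumptions.

```python
import numpy as np

def dswr_reinforce(weights, keep_mask, baseline_sum, boost=0.1, w_max=1.0):
    """Illustrative sketch of the DSWR idea (hypothetical, not the paper's rule):
    after pruning, reinforce the preserved ("critical") synapses so each
    neuron's total input drive moves back toward its pre-pruning baseline.

    weights      : (n_pre, n_post) non-negative synaptic weights
    keep_mask    : boolean mask of preserved synapses, same shape as weights
    baseline_sum : (n_post,) per-neuron weight sum before pruning
    boost        : cap on the fractional reinforcement applied per call
    w_max        : upper bound on any individual synaptic weight
    """
    w = weights * keep_mask                  # pruned synapses stay at zero
    current = w.sum(axis=0)                  # current input drive per neuron
    # Scale preserved weights toward the baseline drive, but never shrink
    # them and never boost by more than `boost` in a single step.
    scale = np.clip(baseline_sum / np.maximum(current, 1e-12), 1.0, 1.0 + boost)
    return np.clip(w * scale, 0.0, w_max)
```

Capping the per-step boost (rather than restoring the baseline in one jump) keeps the reinforcement gradual, which matches the excerpt's goal of holding the average spiking frequency *near* the baseline rather than overshooting it.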