The development of lightweight networks has made neural networks efficient enough to be widely applied to various tasks. Considering deployment on hardware such as edge devices and mobile phones, we prioritize lightweight networks; however, their accuracy has always lagged far behind that of SOTA networks. In this article, we present a simple yet effective activation function, called WReLU, that significantly improves the performance of lightweight networks by adding a residual spatial condition. Moreover, we use a strategy that switches activation functions after determining which convolutional layers to apply them to. We perform experiments on the ImageNet 2012 classification dataset on CPU, GPU, and edge devices. The experiments demonstrate that WReLU significantly improves classification accuracy, while our strategy balances the overhead of additional parameters and multiply-accumulate operations (MACs). Our method improves the accuracy of SqueezeNet and SqueezeNext by more than 5% without adding extensive parameters or computation. For lightweight networks with a larger number of parameters, such as MobileNet and ShuffleNet, there is also a significant improvement. Additionally, the inference speed of most lightweight networks using our WReLU strategy is almost the same as that of the baseline model on different platforms. Our approach not only preserves the practicality of lightweight networks but also improves their performance.
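The abstract describes WReLU as a ReLU-style activation augmented with a residual spatial condition. The exact formulation is not given here, so the following is only a minimal NumPy sketch under the assumption that the spatial condition is computed by a per-channel (depthwise) 3x3 convolution and added to the ReLU output; the function names `depthwise_conv3x3` and `wrelu` are illustrative, not from the paper.

```python
import numpy as np

def depthwise_conv3x3(x, w):
    # x: (C, H, W) feature map, w: (C, 3, 3) per-channel filters.
    # Zero padding, stride 1: each channel is convolved with its own filter.
    C, H, W = x.shape
    xp = np.pad(x, ((0, 0), (1, 1), (1, 1)))
    out = np.zeros_like(x)
    for c in range(C):
        for i in range(H):
            for j in range(W):
                out[c, i, j] = np.sum(xp[c, i:i+3, j:j+3] * w[c])
    return out

def wrelu(x, w):
    # Assumed form: ReLU output plus a residual spatial condition
    # produced by a cheap depthwise convolution.
    return np.maximum(x, 0.0) + depthwise_conv3x3(x, w)

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8, 8))
w = rng.standard_normal((4, 3, 3)) * 0.1
y = wrelu(x, w)
print(y.shape)  # (4, 8, 8)
```

Note that with the filter weights set to zero the sketch reduces exactly to plain ReLU, which is consistent with the "residual" framing: the spatial branch only adds a learned correction on top of the standard activation.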