A variety of neural network architectures have been proposed to address deep learning problems over the past decades. Despite the prominent success achieved by neural networks, there is still little theoretical guidance for designing an efficient neural network model, and verifying a model's performance requires excessive resources. Previous studies have demonstrated that many existing models can be regarded as different numerical discretizations of differential equations. This connection sheds light on designing an effective recurrent neural network (RNN) by resorting to numerical analysis. The simple RNN can be regarded as a discretization under the forward Euler scheme. Considering the limited solution accuracy of the forward Euler method, a Taylor-type discrete scheme with lower truncation error is presented, and a Taylor-type RNN (T-RNN) is designed under its guidance. Extensive experiments are conducted to evaluate its performance on statistical language modelling and emotion analysis tasks. The noticeable gains obtained by T-RNN demonstrate its superiority and the feasibility of designing neural network models using numerical methods.
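The intuition behind the abstract's claim can be illustrated with a toy numerical experiment, independent of the T-RNN itself: for an ODE $\dot h = f(h)$, the forward Euler step $h_{t+1} = h_t + \Delta t\, f(h_t)$ (the scheme mirrored by the simple RNN update) has first-order truncation error, while a second-order Taylor step adds the term $\tfrac{\Delta t^2}{2} f'(h_t) f(h_t)$ and achieves lower error. The sketch below, a hypothetical example not taken from the paper, compares the two schemes on $\dot h = -h$ with known solution $e^{-t}$:

```python
import math

def forward_euler(f, h0, dt, steps):
    # First-order scheme: h_{t+1} = h_t + dt * f(h_t)
    h = h0
    for _ in range(steps):
        h = h + dt * f(h)
    return h

def taylor2(f, df, h0, dt, steps):
    # Second-order Taylor scheme adds dt^2/2 * f'(h) * f(h),
    # i.e. the next term of the Taylor expansion of h(t + dt)
    h = h0
    for _ in range(steps):
        h = h + dt * f(h) + 0.5 * dt**2 * df(h) * f(h)
    return h

# Test problem: dh/dt = -h, h(0) = 1, exact solution h(t) = exp(-t)
f = lambda h: -h
df = lambda h: -1.0
dt, steps = 0.1, 10          # integrate to t = 1
exact = math.exp(-1.0)

err_euler = abs(forward_euler(f, 1.0, dt, steps) - exact)
err_taylor = abs(taylor2(f, df, 1.0, dt, steps) - exact)
print(err_euler, err_taylor)  # the Taylor step is markedly more accurate
```

Replacing $f$ with a learned transition function (and treating the input at each step as part of it) turns the Euler step into the simple RNN update; the paper's T-RNN follows the same logic by building the recurrence on the higher-order scheme instead.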